
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

AMD should just release the card ASAP, without any fanfare or song and dance; otherwise Nvidia will p on their parade if they have any warning at all.
I suspect Nvidia already has a card ready to mess up any launch, as their Turing process is so mature, but I hope AMD gets a few weeks before any such launch, never mind Ampere. Giving six months' notice of a product release when you're the underdog seems naïve at best.
 
I was disappointed AMD didn't announce the new high end card. I'll make do with my 1070 until then. Unless a crazy good deal comes along.

I was also surprised by the lack of any news on ray tracing and the new hardware capabilities, apart from the new SmartShift feature.

But it is easy to explain: they would not want to negatively impact sales of their current offerings.

Do not forget that Nvidia has 7 or 8 cards which are faster than the RX 5700 XT.
 
Whatever happens, Nvidia will want to release before the next-gen consoles. If those come out at less than, say, a 2070 Super, I can see a lot of people jumping ship.
 

Given where prices are going at the moment, NV & AMD would have to sell their flagships in the sub-£700 range. That won't happen due to yields and costs, especially if NV goes for chips over 520mm². They can't go smaller without removing the RT and Tensor cores while keeping at least the same performance for the "2080 Ti" equivalent.
Yes, they could go for RT-capable CUDA cores, which makes sense for next year's MCM GPUs (much as AMD is going fully integrated with RDNA2), but it would make all Turing RTX cards redundant at a single stroke, as nobody is going to keep developing for the dedicated custom RT and Tensor cores of "past GPUs" rather than "future GPUs".
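On the die-size point above, here's a rough sketch of why big dies hurt, using the textbook Poisson yield model. The defect density is a purely illustrative assumption, not a real foundry figure, so treat the percentages as a trend rather than data.

```python
# Toy die-yield comparison using the simple Poisson model: yield = exp(-D0 * A).
# The defect density D0 is an illustrative assumption, not a real foundry number.
import math

D0 = 0.1  # assumed defects per cm^2, for illustration only

def die_yield(area_mm2: float) -> float:
    """Expected fraction of defect-free dies for a given die area."""
    return math.exp(-D0 * (area_mm2 / 100.0))

for area in (250, 400, 520, 750):
    print(f"{area:>3} mm^2 die -> ~{die_yield(area):.0%} defect-free")
```

Bigger dies also mean fewer candidates per wafer, so the cost per good die climbs faster than the yield numbers alone suggest.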

Which means the next round of consoles is going to be a bargain, especially the top-end Xbox (there seem to be two of them coming), as it would guarantee a five-year life cycle of stable performance and visuals that get better and more optimised as time goes by, in DX12 as well.

Also, everything is in the details. MS is targeting 8K 30fps and true 4K 60fps. Sony doesn't have the "true 4K" target, so it could more likely use AMD GPU scaling and RIS, reducing the cost of having to manufacture something bigger than an RX 5700 (non-XT).

<Sarcasm>
And Sony has already started the marketing with "30% less power than the competitor". I won't be surprised if they hire Greta to point the finger at bad American corporations using more electricity and destroying the planet :D
</Sarcasm>

And that might also explain the gap between the two consoles.
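To put rough numbers on those resolution targets and the upscaling point, the snippet below compares raw pixel throughput. The upscaled rows assume specific internal render resolutions purely for illustration; they aren't confirmed console specs.

```python
# Raw pixel-throughput comparison of the render targets discussed above.
# The upscaled rows use assumed internal render resolutions, for illustration only.
targets = {
    "8K @ 30fps (native)":           (7680, 4320, 30),
    "4K @ 60fps (native)":           (3840, 2160, 60),
    "1440p @ 60fps, upscaled to 4K": (2560, 1440, 60),
    "1080p @ 60fps, upscaled to 4K": (1920, 1080, 60),
}

baseline = 3840 * 2160 * 60  # native 4K60 as the reference workload

for name, (w, h, fps) in targets.items():
    rate = w * h * fps
    print(f"{name:31s} {rate / 1e6:6.0f} Mpx/s ({rate / baseline:4.0%} of native 4K60)")
```

On raw pixel count alone, rendering at 1440p and scaling up to 4K is less than half the work of native 4K60, which is the whole argument for a smaller, cheaper GPU plus RIS-style sharpening.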
 
Talking of consoles, I remember when they (Xbox One/PS4) promised 1080p/60fps; we all know the best they managed in most titles was 1080p/30fps or 720p/60fps. With the new consoles I'm sure some titles will run 4K/60fps, but only the undemanding or ported older games; within a year or two you'll see 1080p/60fps being the goal, with upscaling to 4K.

Even if Ampere isn't that great it'll still decimate anything the consoles run; it's just the pricing that might be an issue. Hopefully RTX 2080 Ti performance for £499 is possible, and maybe if AMD can bring some competition back to this market we'll start seeing massive gains again.
 

The RTX 2080 released at £750 for GTX 1080 Ti-level performance. I very much doubt we will see RTX 2080 Ti-level performance for £500 in the next generation of GPUs; this is Nvidia...
 

PC hardware will always be faster than consoles; it's as guaranteed as the sky being blue, except in the UK where it's mostly grey - but blue like the sea ;)
 
Very true, but at this point AMD hasn't had anything to compete with the top end for years. If they manage to pull a Ryzen in the GPU sector you'll see prices free-fall; if you'd said in 2015 that you could get an 8-core CPU for under £100 on a £40 motherboard, you'd have been laughed at! While the R9 290X was a loud illegitimate child, at least it was better than a 970 and able to get on par with the 980.
 

There are some discrepancies between the R9 290X vs GTX 780 results in review 1 and review 2.
Later, the R9 Fury X was on par with the fastest GeForce, the GTX 980 Ti.

edit: or maybe the Uber mode confused me a bit...

[Radeon R9 290X chart, review 1]

https://www.techpowerup.com/review/amd-r9-290x/27.html

[Radeon R9 290X chart, review 2]

https://www.techpowerup.com/review/nvidia-geforce-gtx-780-ti/27.html

[R9 Fury X chart]

https://www.techpowerup.com/review/amd-r9-fury-x/31.html
 
Who cares? That's all history and a far cry from today's landscape. Even on 7nm, the best AMD manage is the Radeon VII and the 5700 XT, which is still slower than some 10-series cards. That's the sad reality, hence the utter gouging by Nvidia.
 
https://youtu.be/0G5oe-Cc994?t=251

What I'm saying is that in some games, and especially later on (because Nvidia doesn't support older cards so well with driver updates), the 290X was able to rival the 980 while being cheaper. It was priced to compete with the 970 (which the 290X beat easily) and it could get on par with the 980 - not in every game, mind - but pair a 980 against a 290X now and the 290X will come out on top more often than not, because Nvidia stopped supporting their older cards with performance updates.

The 290X was never a true high-end card that could completely dominate Nvidia's flagship, but it was enough to give people the option of buying from a competitor without having to worry too much about losing performance. Now, with the 2080 Super/2080 Ti, AMD has nothing to compete against them with; even the 5700 XT (ignoring the driver problems) struggles to get near the 2070 Super, which would've been beaten back in 2014 by the 290 for less.

The Fury was a whole different story: heavily limited by its 4GB of VRAM, and HBM being difficult (and expensive) for AMD to source didn't help at all. On top of that, some of those reference AIO coolers had pump-noise issues which put people off.

Uber mode was on the second BIOS (IIRC), which meant the reference fan spun faster to keep the card cool, as those reference coolers were terrible; Quiet mode was on the first BIOS (again IIRC), which made the fan run slower but also hurt performance because the card ran into that 94°C limit :D
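For anyone who never owned a reference 290X, here's a toy model of that dual-BIOS trade-off. Every constant in it is invented for illustration; the real fan and clock management (PowerTune) is far more sophisticated, but the shape of the behaviour is the same: too little cooling and the card sits on its 94°C limit shedding clocks.

```python
# Toy model of the Quiet vs Uber BIOS trade-off: a slower fan lets the GPU reach
# its 94C limit, at which point the card sheds core clock to hold temperature.
# All constants are invented for illustration; this is not AMD's PowerTune logic.

def simulate(fan_speed: float, steps: int = 300):
    temp, clock, throttled = 40.0, 1000.0, False
    for _ in range(steps):
        heat = clock * 0.08              # heat output scales with core clock
        cooling = fan_speed * 3.0        # cooling scales with fan speed
        temp = max(40.0, temp + (heat - cooling) * 0.05)
        if temp >= 94.0:                 # thermal limit reached...
            throttled = True
            clock = max(700.0, clock - 10.0)  # ...so drop the core clock
            temp = 94.0
    return clock, throttled

for mode, fan in (("Quiet BIOS (slow fan)", 20.0), ("Uber BIOS (fast fan)", 27.0)):
    clock, throttled = simulate(fan)
    print(f"{mode}: final clock ~{clock:.0f} MHz, hit the 94C limit: {throttled}")
```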
 

I'm sorry, what? The 5700 XT is right there nipping at the butt hairs of the 2070 Super, and for a whole lot less money. Techspot had it just 2% slower than the 2070 Super at 1440p on average back in July. So "struggles" is a bit of a strong word IMHO, even if Forza isn't an instant win for AMD any more.

[5700 XT performance chart]

EDIT: Techspot not HWUB.
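To put a number on the value argument in that post, here's a back-of-the-envelope perf-per-dollar comparison using the US launch MSRPs ($499 for the RTX 2070 Super, $399 for the RX 5700 XT) and the ~2% average 1440p deficit quoted above. Street prices vary, so this is a rough sketch, not a buying guide.

```python
# Back-of-the-envelope perf-per-dollar sketch: US launch MSRPs plus the ~2%
# average 1440p deficit quoted above. Street prices vary; treat as illustrative.
cards = {
    "RTX 2070 Super": {"relative_perf": 1.00, "msrp_usd": 499},
    "RX 5700 XT":     {"relative_perf": 0.98, "msrp_usd": 399},
}

for name, c in cards.items():
    value = c["relative_perf"] / c["msrp_usd"] * 100  # perf points per $100 spent
    print(f"{name}: {value:.2f} performance points per $100")

ratio = (0.98 / 399) / (1.00 / 499)
print(f"5700 XT delivers ~{ratio - 1:.0%} more performance per dollar")
```

On those numbers the 5700 XT gives roughly 20-25% more performance per dollar, which is the point about price rather than absolute speed.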
 
Techspot's review of the 970 in 2019 was an interesting read: 1% slower than the R9 290 (non-X) over 33 games, and noted for its more consistent performance and lower power requirements. If the 970 is 1% slower than the non-X, where does that put it in relation to the 290X? 4 or 5% behind? That's not knocking on the door of the 980; 980s were around 30% faster than 970s.
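That chain of figures is easy to sanity-check. Below, the 290X-vs-290 gap is taken as ~5% (the post's own guess) and the 980's lead over the 970 as ~30% (also the post's figure, which is on the generous side); neither is a measured number here.

```python
# Sanity check of the relative-performance chain above.
# 970 ~1% behind the R9 290 (Techspot's 33-game retest, as quoted);
# 290X assumed ~5% ahead of the 290, 980 assumed ~30% ahead of the 970
# (both of those are the post's own estimates, not measurements).
r9_290  = 1.00                # baseline
gtx_970 = 0.99 * r9_290       # ~1% slower than the 290
r9_290x = 1.05 * r9_290       # assumed gap
gtx_980 = 1.30 * gtx_970      # assumed gap

print(f"290X lands at ~{r9_290x / gtx_980:.0%} of the 980")
print(f"i.e. the 980 leads the 290X by ~{gtx_980 / r9_290x - 1:.0%}")
```

On those inputs the 290X sits a good 20%+ behind the 980; shrink the 980-over-970 figure and the gap narrows, which is roughly where this post and the earlier "on par with the 980" posts diverge.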


Don't forget the 290/290X came first and they were $500 cards originally. They weren't priced to match the 970 until the 970 was released, and at the time performance was pretty close, so although AMD did drop the prices of the 290s, it was Nvidia who forced AMD to drop them to that level.
 
The new high end really needs HDMI 2.1 and VRR support.

Remember the Fury X only having HDMI 1.4, and AMD recommending a DisplayPort adapter to connect the card to 4K 60Hz TVs? An adapter which took a good six-plus months to come to market, IIRC.
Then there were the Vega cards not having HDCP 2.2 support even though they had HDMI 2.0.
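To put numbers behind the "really needs HDMI 2.1" point: a quick data-rate check of uncompressed RGB video against the effective link capacities of HDMI 2.0 (~14.4 Gbps after 8b/10b encoding) and HDMI 2.1 (~42.7 Gbps after 16b/18b). Blanking intervals are ignored here, so real requirements are a bit higher than shown.

```python
# Rough check: uncompressed RGB video payload vs effective HDMI link capacity.
# Blanking intervals are ignored, so real-world requirements are somewhat higher.
GBPS = 1e9

links = {
    "HDMI 2.0": 14.4 * GBPS,   # 18 Gbps raw, 8b/10b encoding
    "HDMI 2.1": 42.7 * GBPS,   # 48 Gbps raw, 16b/18b encoding
}

modes = {
    "4K 60Hz, 8-bit RGB":   (3840, 2160, 60, 24),
    "4K 60Hz, 10-bit RGB":  (3840, 2160, 60, 30),
    "4K 120Hz, 10-bit RGB": (3840, 2160, 120, 30),
}

for mode, (w, h, hz, bpp) in modes.items():
    rate = w * h * hz * bpp
    fits = [name for name, cap in links.items() if rate <= cap] or ["neither"]
    print(f"{mode}: {rate / GBPS:5.1f} Gbps -> fits on {', '.join(fits)}")
```

Which is why 4K60 HDR over HDMI 2.0 ends up chroma-subsampled, and why 4K120 with VRR on the new TVs is effectively an HDMI 2.1 feature.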
 
Who cares? That's all history and a far cry from today's landscape. Even on 7nm, the best AMD manage is the Radeon VII and the 5700 XT, which is still slower than some 10-series cards. That's the sad reality, hence the utter gouging by Nvidia.

I wouldn't say that's the best AMD can manage; it's just what they are willing to release while focusing more heavily on their CPU segment. Plus there's a long time left on 7nm for AMD to bring out much better cards.

The 5700 XT is not far at all behind the 1080 Ti either; the gap is pretty much negligible now.
 
It's noteworthy that the only card of the three you can use in mGPU is the 1080 Ti. 1080 Ti SLI pi$$es all over the alternatives where it's supported, and for a far better price. I'm not surprised mGPU is no longer supported; it's the best way to get decent fps at a decent price, hence the duopoly won't allow it. AMD/Nvidia (in no particular order) are having a laugh by dropping support for the best-value option in the GPU space. I'm running RDR2 at 2560x1440, all Ultra except MSAA, at 79fps. That's with two cards from March 2017! Both of mine are steaming ahead at 1962/1949 MHz core clocks consistently in games.
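As a rough way to read that 79 fps figure: what it implies about a single card depends entirely on how well the game scales across two GPUs. The snippet below just inverts the usual scaling formula for a few hypothetical efficiencies; none of these are measured single-1080 Ti numbers.

```python
# Back out the implied single-card frame rate from a two-card result, for a few
# hypothetical SLI scaling efficiencies. 79 fps is the figure reported above;
# the efficiencies themselves are assumptions, not measurements.
two_card_fps = 79.0

for efficiency in (0.5, 0.7, 0.9):  # fraction of the second card's perf actually gained
    single_card_fps = two_card_fps / (1 + efficiency)
    print(f"{efficiency:.0%} scaling -> ~{single_card_fps:.0f} fps implied on one card")
```

That spread is the mGPU gamble in a nutshell: in well-supported titles the second card is close to free performance per pound, in unsupported ones it does nothing.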
 