*** The AMD RDNA 4 Rumour Mill ***

Relative performance (7900 XT = 100%):

7900 XTX: 119%
4080: 115%
7900 XT: 100%

The rumour always was between the 7900 XT and 7900 XTX: if the 9070 is 10% faster than the 7900 XT, then it's at ~96% of a 4080 (110/115) and ~92% of a 7900 XTX (110/119).
 
It needs to gain 38% over a 7800 XT to achieve that (the 7800 XT sits at roughly 80 on the index above, and 80 × 1.38 ≈ 110).

60 CUs vs 64 CUs: +7%
2.4 GHz vs 3.0 GHz: +25%
Memory bandwidth 624 GB/s vs 650 GB/s: +4%

It's possible; AMD's slide also said RDNA 4 is faster per shader.
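
As a quick sanity check, here's that arithmetic as a minimal Python sketch; the CU counts, clocks and the +38% target are the rumoured figures from above, not confirmed specs:

```python
# Back-of-the-envelope check of the rumoured 9070 XT uplift over the 7800 XT.
# All inputs are the rumoured figures quoted above, not confirmed specs.
cu_gain    = 64 / 60    # 60 CUs -> 64 CUs,    ~ +7%
clock_gain = 3.0 / 2.4  # 2.4 GHz -> 3.0 GHz,  ~ +25%
bw_gain    = 650 / 624  # 624 -> 650 GB/s,     ~ +4% (headroom; rarely scales 1:1)

raw_gain = cu_gain * clock_gain  # CUs and clocks compound: ~1.33, i.e. +33%
needed   = 1.38                  # the +38% target over the 7800 XT

print(f"uplift from CUs and clocks alone: +{raw_gain - 1:.0%}")           # ~+33%
print(f"per-shader gain still needed:     +{needed / raw_gain - 1:.0%}")  # ~+4%
```

So the rumoured specs alone get you to roughly +33%; the remaining few percent would have to come from the per-shader improvements that slide alludes to.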

RDNA 3 is at least 25% faster per shader than RDNA 2: the 6900 XT has 80 CUs vs 60 CUs on the 7800 XT (+33%), they are clocked almost identically, and yet the 7800 XT is 97% of the 6900 XT's performance, and 10% faster in RT.
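
Same arithmetic for the per-shader claim; a quick sketch using the figures above (97% relative performance at near-identical clocks):

```python
# 6900 XT (RDNA 2, 80 CUs) vs 7800 XT (RDNA 3, 60 CUs) at near-identical clocks,
# with the 7800 XT landing at ~97% of the 6900 XT's performance (figures above).
cu_ratio   = 80 / 60  # the 6900 XT has 33% more CUs
perf_ratio = 0.97     # 7800 XT performance relative to the 6900 XT

per_shader = perf_ratio * cu_ratio  # ~1.29
print(f"RDNA 3 per-shader uplift over RDNA 2: +{per_shader - 1:.0%}")  # ~+29%
```

That works out to roughly +29% per CU, consistent with the "at least 25%" figure.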
 
Add it to the graph.

[image: performance graph]

Keep that graph going :D
 
I don't see it matching a 4080 S at the rumoured prices. I just don't think AMD have got the message from the 7900 XT and XTX debacle that they can't price like that.

Am I correct that Blackwell is on the 4NP process node, and that this is not a full enhancement over the 4N node used for Lovelace, while RDNA 3 used N5 and RDNA 4 will use N4?

So this means Blackwell does not get the same benefit of a full node jump that the RDNA 4 GPUs get?

Right. RDNA 3 was 5nm for the main die; it did have 6nm memory-controller / cache chiplets, but they are not relevant in this context. RDNA 4 will move to a purely monolithic 4nm part, the same node the current RTX 4000 series is on. I don't know what Nvidia are doing, other than that the 5000 series is still on some form of 4nm; it could be a different / more enhanced 4nm node, or the same as AMD's... ¯\_(ツ)_/¯
 
The 9070 was deliberately renamed to mimic Nvidia's line-up; Frank Azor just said it. Nvidia's naming scheme is recognisable, and AMD wants to be clear about what class of GPU people are looking at, so I'm guessing the 9070 is a ##70 class and the 9070 XT is a ##70 Ti.

The leaked performance is wrong, it is faster than that.

 
Yeah, their naming scheme slide confirmed this was the case. A lot of people understandably, but mistakenly, assumed it was a performance comparison.

I wonder if next gen AMD will do a Roman-numeral hybrid name (X for 10): the X070, or just the X70, for example?

Yes, that slide is nothing to do with positioning performance; it's about positioning naming.

Also, FSR 4 is on-the-fly ML, and RDNA 3 doesn't have the ML performance to do it, though they are looking at optimising it to try to get it working in some form on RDNA 3. The ML performance on RDNA 4 is massively higher than on RDNA 3; at CES they talked about Strix Halo having 2X the ML performance of the 4090, though I'm sure that is very cherry-picked.
 
Again, I agree that branding matters, but that's not the only thing that keeps people on iPhones. There can be a huge sunk cost with app store purchases, customization, accessories, etc.

Vendor lock-in is simply not a thing with graphics cards. There is absolutely no switching cost to pulling out an NV card and replacing it with an AMD one (or vice versa) in <5 minutes.

DLSS is a feature, not an ecosystem lock-in.



The German source below actually shows AMD cards outselling NV ones in Q1 2024 - take with a pinch of salt, as it's just based on one retailer who may have a vendor alignment (I know nothing about them). As @eeii and @NZXT30 said above, I suspect the real volume is in OEM desktops and laptops... we as PC builders are merely the sideshow :(

To answer your specific question for that time period, it looks like roughly 45% share for the 4070+S vs. 55% for the 7800 XT. I'm sure you could dig back historically to build up a full data set if you really wanted to :p

[image: retailer sales chart]



It's nice to see the RX 7800 XT doing well; it's a damned good GPU, it deserves it.
 
I mean...

If we can have a 9070 XT at 4070 Ti levels in RT, and 4080 levels in raster, at around the $500 mark, AMD could be onto a real winner, depending on the final reviews for both companies.

That puts the 9070 XT about 15% ahead of the 7900 XT. 4070 Ti-level RT in Cyberpunk is what we expected, so... as expected. If the price is not too high this is decent.

[images: leaked benchmark screenshots]
 
That Black Myth performance leak puts the 9070 XT around three FPS short of a 4080 Super.

three FPS.

Right... Cyberpunk is the worst case for AMD; anything else would be progressively better. Wukong is still very heavy on the RT, but even now the 9070 XT is more in line with a 4080 S.
 
True, but let's not get ahead of ourselves; 3 FPS behind a 4080 is pretty meaningless when neither game is "playable".

Kinda pointless even being 20% faster when the FPS is so low. Don't get me wrong, I'm looking forward to it, and I'm actually more excited for AMD's offering over Nvidia's at this time.

No one is saying AMD should solve Nvidia's performance problems in games like this; if they did that, AMD's performance would be very much higher. The point is the percentage difference: 3 FPS is 10% behind the 4080 Super in this heavy RT game. Be that as it may, it makes the "Nvidia RT" argument substantially weaker, if not null, unless the 5070 is substantially faster in RT than the 4080 S. Can you see that?
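
For what it's worth, the conversion is trivial; the ~30 FPS baseline below is an assumption backed out of the "3 FPS is 10%" claim above, not a published benchmark number:

```python
# Turning the leaked FPS gap into a relative deficit.
# fps_4080s = 30 is an assumption inferred from "3 FPS is 10% behind";
# it is not a confirmed benchmark result.
fps_4080s  = 30.0
fps_9070xt = fps_4080s - 3.0  # the leaked ~3 FPS gap

deficit = 1 - fps_9070xt / fps_4080s
print(f"9070 XT vs 4080 Super: -{deficit:.0%}")  # ~ -10%
```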
 
Anyone else think that AMD did ye olde jebait and switched the BIOS at the last minute, and that's why they didn't show off GPUs at CES? The performance might have been a surprise to Nvidia, and that could be why the RTX 5080 is delayed and they are swapping the final BIOS on that card as well?

Just a thought :p

TBH no, AMD are so bad at this they jebait themselves....
 
Apparently 'jank', but only sufferable cos AMD is cheaper.

I do agree there are probably some things that still play up from time to time; I think people have issues with presets not being saved in Adrenalin.

Personally I don't use Adrenalin for anything, I just do the settings in game. Do people spend that much time in their GPU driver software?


I've come from 3 generations of Nvidia to AMD.

GTX 970
GTX 1070
RTX 2070 S

RX 7800 XT. Based on that experience, no: AMD drivers have been impeccably behaved, not a single issue. Could not have asked for better.
 
Gaming and display is decent, but when you get to compute, and to a lesser extent Linux, simply put AMD don't put enough resources into it. While most may experience "works for me", there are far too many instances where things break through no fault of the user. I gave up expecting ROCm to ever be decent many years ago. But they traditionally have had much better memory bandwidth and FP64 compute than Nvidia for the price; not even close. Battlemage also has high memory and compute specs for the price, which is why I was so excited for that. Battlemage has teething issues from being new; AMD hardware still has teething issues from the software stack a decade later. AMD is great for CPU, it's great for the GPU hardware, it totally sucks at the GPU software.

A mate of mine is a 100% Linux user (Manjaro). He thinks, and he gets this from experience, that Nvidia are garbage on Linux. He has run all-AMD on Linux for some time now, and he's far from alone in that.
 
AMD's open-source Linux drivers are better than the Nvidia ones now; some of my mates who run Linux now prefer the AMD cards.

Yeah.

Here is the thing: AMD are in no way perfect, and there are things one can pick them up on, sure. Nvidia are also not perfect; their problems might be different, in different areas, but they do have their own problems.

Therefore, IMO, it's very disingenuous to use blanket statements like "AMD drivers bad" as if Nvidia's drivers are perfect; they aren't. But if you don't agree with the latter, you're an AMD fanboy? No, AMD doesn't really have many, if any, fans. There are plenty of people who will pay way over the odds for garbage-tier GPUs on the Nvidia side though; AMD can't get away with that.
 