
Speculation - AMD isn't going to be able to offer really competitive PC (4K) or console performance until they have GPUs with >100 Compute Units

Looks like the next generation of AM5 Ryzen CPUs will use a variant of TSMC's 3nm process (either N3E or N3P). Link:
https://www.pcgameshardware.de/CPU-...yzen-8000-Granite-Ridge-Desktop-CPUs-1419743/

The fact that AMD are doing this makes it much more likely that they will use a 3nm process for RDNA4.

So, I think we will see the compute units scaling a fair bit beyond 100 CUs. At the moment, power consumption is ~350 W with 96 Compute Units on RDNA3, but I think RDNA4 power improvements will allow them to scale their designs significantly higher, and the performance gap between Nvidia's and AMD's flagship cards will narrow further.
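Rough back-of-envelope on the power side. The per-CU power reductions below are my own assumptions, just to show the shape of the maths, and they assume clocks and memory stay the same:

Code:
# Back-of-envelope: how many CUs fit in roughly the same power budget if RDNA4
# cuts per-CU power? The reduction figures are assumptions for illustration,
# not confirmed numbers, and clocks/memory are assumed unchanged.

RDNA3_CUS = 96            # 7900 XTX
RDNA3_BOARD_POWER_W = 350

for power_cut in (0.20, 0.30, 0.40):   # assumed per-CU power reduction
    cus_in_same_budget = RDNA3_CUS / (1 - power_cut)
    print(f"{power_cut:.0%} less power per CU -> "
          f"~{cus_in_same_budget:.0f} CUs in ~{RDNA3_BOARD_POWER_W} W")

Even the conservative end of that range clears 100 CUs, which is why I think the node and power improvements are the key enabler here.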

104 Compute Units ought to be possible for the next console releases (e.g. PS5 Pro) imo.

One thing I'm unsure about: could die size be a limiting factor for an RDNA4 3nm console GPU?

The Xbox Series X GPU is 360 mm² (fabbed on 7nm).
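As a very rough sanity check on the area question: the GPU share of the SoC and the effective N7-to-N3 density gains below are my assumptions (logic shrinks far better than SRAM/analogue, so the real numbers will be worse), but it gives a ballpark:

Code:
# Very rough die-area sketch for a 100+ CU console GPU on 3nm.
# GPU share of the SoC and the density multipliers are assumptions;
# SRAM and analogue shrink far less than logic, so treat these as optimistic.

SERIES_X_DIE_MM2 = 360    # 7nm SoC: 52 active (56 physical) CUs plus CPU, IO, etc.
GPU_SHARE_OF_DIE = 0.5    # assumed: roughly half the SoC is GPU
N7_PHYSICAL_CUS = 56

mm2_per_cu_n7 = (SERIES_X_DIE_MM2 * GPU_SHARE_OF_DIE) / N7_PHYSICAL_CUS

for effective_gain in (2.0, 2.5, 3.0):   # assumed N7 -> N3 effective density gain
    gpu_area_104cu = 104 * mm2_per_cu_n7 / effective_gain
    print(f"{effective_gain}x density -> ~{gpu_area_104cu:.0f} mm2 of GPU area for 104 CUs")

You'd still have to add the CPU, memory PHYs and media blocks on top (and those barely shrink), but on area alone it doesn't look obviously impossible.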
 
According to MLID, only Zen5 server products will be on 3nm, while desktop stays on 5nm. 3nm is too expensive for consumer products, except for Apple, since their customers are happy to pay $2k for a device.
 
RDNA 4 is gonna be pretty crap if they don't use 3nm.

They've already hit a wall with the CU count on RDNA3.

MLID is a source of rumours, not actual information.
 
3nm won't matter that much. If you do the math between it and N4, it's a small difference compared to the hill they have to climb vis-a-vis tensor & RT cores. Without solutions for those they could be using 1nm and it would still be a flop.
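To put some numbers on "do the math": the node figure below is an assumed ballpark for a single node hop (roughly the headline class of claim foundries make, logic-only, best case), and the RT gap is an assumed ~55% deficit vs the top Ada card in heavy RT. Both are illustrative inputs, not measured data:

Code:
# Sketch of what a node hop alone buys vs the architectural gap.
# Both figures are assumptions for illustration only.

node_speed_gain = 0.15   # assumed ~10-18% more speed at the same power
rt_gap_to_close = 0.55   # assumed heavy-RT deficit vs the top Ada card

covered_by_node = node_speed_gain / rt_gap_to_close
print(f"Node hop alone covers roughly {covered_by_node:.0%} of the RT shortfall;")
print("the rest has to come from the architecture (RT and matrix hardware).")

On those inputs the process covers maybe a quarter of the gap, which is the point: the rest is an architecture problem.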
 
I disagree - they've already made large improvements to RT performance with RDNA3, and the performance is not that far behind the RTX 4000 series.

I also think that to most gamers, rasterization is more important than ray tracing. RT is not considered necessary by most of them, but smooth frame rates are.

Additionally, frame generation (via FSR3) can help AMD to achieve decent frame rates, when RT is enabled at 1440p and 4K.

They need a much improved fabrication process in order to reduce the high power consumption of RDNA3, so that they can scale the number of compute units beyond 100.
 

I'd pay 50 quid more for an Nvidia card over AMD for RT - that's all the value I put on it (assuming raster performance is equal).

Raster performance is far more important to me.
 
Well, £550 is a lot for a console.

I wonder how many would be willing to pay for a console with an MSRP of £600?
People pay £1000+ for a 7900 XTX with its 96 CUs, so a whole console with 100+ CUs for £600 is good value, and I think a lot of people would scoop it up (it would probably appeal to a lot of frustrated PC gamers who can't afford the crazy prices AMD and Nvidia are charging for new GPUs).
 
It would be tbh (especially compared to currently available PC graphics cards), but I'm not sure the console market would view it that way.

4K 60 FPS is the holy grail imo. Frame gen and technologies like FSR can allow even higher framerates, once this baseline has been achieved on consoles.

Some members of the PCMR will argue it's 120 FPS for PCs, but we are a while off from 4K 120 FPS at native resolution.

Consoles seem good value considering you can download recent games via services like Game Pass for a small monthly fee, or via free trials. But the console makers make most of their money from selling new games, ofc.

I'd be tempted if I wasn't planning to upgrade my PC already, but RDNA4 is still >1 year away (should be Q4 2024).
 
I disagree - they've already made large improvements to RT performance with RDNA3, and the performance is not that far behind the RTX 4000 series.
This is not really true. I had it nicely summed up in some imgs but imgur **** the bed & I can't link 'em, so for now you'll have to visit TPU yourself.

Look at the % hit (the performance lost when turning on RT). 7900 XTX vs 6900 XT: almost identical throughout, or less than a 5% reduction in the hit. Now look at 4090 vs 3090: more than a 5% reduction almost across the board. And pay attention across all resolutions, particularly in heavy RT titles - though amusingly, in FC6 the 7900 XTX actually regresses noticeably compared to the 6900 XT, even at 4K where the CPU bottleneck isn't so harsh any more. Where performance is needed the most, Ada delivers and keeps minimising the performance penalty, but RDNA 3 fails to do so, even though it has a higher penalty to begin with and so in theory should have made more progress (because it's further away from diminishing-returns territory). And this is without even bringing path tracing into the mix...
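For clarity, this is what I mean by "% hit" - the share of performance lost when RT is switched on. The fps numbers in this sketch are made up purely to show the comparison, not taken from any review:

Code:
# "% hit" = 1 - (RT fps / raster fps): the share of performance lost with RT on.
# The fps values below are invented for illustration only (NOT benchmark data).

def rt_hit(raster_fps: float, rt_fps: float) -> float:
    """Fraction of performance lost when ray tracing is enabled."""
    return 1 - rt_fps / raster_fps

hypothetical = {
    "vendor A, old gen": (100, 55),    # (raster fps, RT fps)
    "vendor A, new gen": (150, 84),
    "vendor B, old gen": (100, 70),
    "vendor B, new gen": (160, 122),
}

for card, (raster, rt) in hypothetical.items():
    print(f"{card}: {rt_hit(raster, rt):.0%} hit with RT on")

In that made-up example, vendor A barely reduces its hit between generations while vendor B cuts it further despite starting from a smaller penalty - that's the pattern I'm describing above.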
I also think that to most gamers, rasterization is more important than ray tracing. RT is not considered necessary by most of them, but smooth frame rates are.
This is also not accurate. One, because Nvidia vastly outsells AMD. Two, we know from console devs that people (in this case 70% of them) purposefully switch to the 30 fps mode for better image quality rather than keep 60 fps, even when the game defaults to the 60 fps mode, in a shooter (https://youtu.be/es0kf8OCJ9g?t=6318).
Additionally, frame generation (via FSR3) can help AMD to achieve decent frame rates, when RT is enabled at 1440p and 4K.
FSR 3 can't help AMD with anything because it only exists in their imagination for now. You can't tell me what a technology that doesn't exist will help with when you don't know how it performs.
They need a much improved fabrication process in order to reduce the high power consumption of RDNA3, so that they can scale the number of compute units beyond 100.
They need a lot of things but performance first of all. If they can't compete with Nvidia in performance then no one cares what their efficiency is.
 
Two, we know from console devs that people (in this case 70% of them) purposefully switch to the 30 fps mode for better image quality rather than keep 60 fps, even when the game defaults to the 60 fps mode, in a shooter (https://youtu.be/es0kf8OCJ9g?t=6318).
That's not the same as choosing between RT on and RT off.

They are specifically talking about adjusting the resolution, at the expense of framerate. There are cases where I might consider doing that too, but probably not in a first person shooter.

Nvidia has been outselling AMD since well before the RTX 2000 series, when RT was introduced.

At 4K with RT enabled, the RX 7900 XTX is 'just' 16% behind the RTX 4080, and keeps up with the RTX 3090 Ti.

The RX 7900 XT offers similar RT performance at 4K to the RTX 3090.

[Chart: relative RT performance at 3840x2160 (relative-performance-rt_3840-2160.png)]


Sure, you can buy an RTX 4090 and get ~55% better performance with RT on, but AMD isn't competing with a card that has an MSRP of £1,579.
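For what it's worth, here's the rough RT performance per pound on those numbers. The 4090 price is the MSRP above; the XTX price is the rough "£1000+" street price mentioned earlier in the thread. Both are just illustrative inputs:

Code:
# Rough RT performance per pound at the top end. Prices and the ~55% RT lead
# are taken from the discussion above; purely illustrative.

cards = {
    "RTX 4090":    {"price_gbp": 1579, "relative_rt_perf": 1.55},
    "RX 7900 XTX": {"price_gbp": 1000, "relative_rt_perf": 1.00},
}

for name, c in cards.items():
    perf_per_pound = c["relative_rt_perf"] / c["price_gbp"]
    print(f"{name}: {perf_per_pound * 1000:.2f} relative RT perf per £1000")

On those inputs the perf-per-pound works out roughly level - the 4090 is simply in a different price bracket.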

As for FSR3 (which they have confirmed is definitely coming), I think they will get similar results to Nvidia's frame generation.

I think scaling up the CU count beyond RDNA3 makes sense, because that also means scaling up the RT hardware (there's a ray accelerator per CU).
 
dropping to 30 fps for better iq? ******* gross, why not just sync them to blinking for even better performance instead of making the games run well and at a decent fps
 
I disagree - they've already made large improvements to RT performance with RDNA3, and the performance is not that far behind the RTX 4000 series.
According to the chart you posted:

6900XT -> 7900XT = 34% generational uplift
3090 -> 4090 = 75% generational uplift

even if you go 6900XT (should be 6950XT) -> 7900XTX it's still only 56%

AMD isn't closing the gap - it's falling further behind.
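To show how I'm reading those uplifts off a relative-performance chart: uplift is just new percentage over old percentage, minus one. The chart values below are approximate numbers back-filled to match the figures above, not exact TPU data:

Code:
# Generational uplift from a relative-performance chart:
# uplift = new_percent / old_percent - 1. Chart values are approximate,
# back-filled to reproduce the uplifts quoted above; not exact TPU figures.

chart_percent = {        # relative RT performance at 4K, 7900 XTX = 100
    "6900 XT":  64,
    "7900 XT":  86,
    "7900 XTX": 100,
    "RTX 3090": 88,
    "RTX 4090": 154,
}

def uplift(old: str, new: str) -> float:
    return chart_percent[new] / chart_percent[old] - 1

print(f"6900 XT  -> 7900 XT : {uplift('6900 XT', '7900 XT'):.0%}")    # ~34%
print(f"6900 XT  -> 7900 XTX: {uplift('6900 XT', '7900 XTX'):.0%}")   # ~56%
print(f"RTX 3090 -> RTX 4090: {uplift('RTX 3090', 'RTX 4090'):.0%}")  # ~75%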
 
You have the option to buy an RTX 4090, if you want one. Is AMD falling behind though, when hardly anyone will buy one?

At the moment, ~0.4% of gamers (according to Steam) have one installed in their PC.

Around 0.5% of gamers have an RTX 3090 installed (presumably fewer for the Ti, which isn't listed), so the numbers for the highest-end cards are pretty small.
 
I'd like AMD to be competitive - I've used their cards before and been very satisfied with them, but Nvidia has more compelling hardware and software right now (pricing notwithstanding). I really disliked Nvidia locking DLSS3 to Lovelace (and don't believe that it wasn't possible to enable it on Ampere), and I would happily make my next GPU an AMD one *if* they were competing in hardware/software - but they're just not. They're not even great value, just 'better value' than their competition.
 