RDNA 3 rumours Q3/4 2022

Status
Not open for further replies.
Then naturally you'd expect the price to be lower if it's not quite as fast.
This is the 'new' AMD, remember.

 
RDNA3 rumour on Bilibili: https://m.bilibili.com/space/4139209?spm_id_from=444.42.0.0

They're saying AMD's full unveiling of RDNA3 will take place on 3 November, two models will be announced, and stock arrives in early December. On performance, they say neither card will compete with the 4090 in raster or RT.

Take this with a pinch of salt, especially as these are generalised statements that don't reflect what we know of the 4090: it's stupidly slow at 1080p, so there is almost no chance RDNA3 is slower than a 4090 at 1080p.

Unless AMD have done worse than the 50% perf/watt gain they have publicly announced, the maths does not work out.

1.5x the 6950 XT's performance is 89% of the 4090's (using Techspot/HUB figures, because their performance delta over the 6950/6900 sits between TPU's on the low end and ComputerBase's on the high end).

Using the Techspot review numbers, scaling TDP and perf/watt on both the 6900 XT and 6950 XT to a 375W N31 has it matching the 4090 in raster.
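The scaling above can be sketched with some quick arithmetic. To be clear about assumptions: the 335W board power for the 6950 XT and the 375W N31 figure come from public specs and the rumours in this thread, not confirmed numbers.

```python
# Back-of-envelope projection: assumed numbers, not measured data.
PERF_PER_WATT_GAIN = 1.5   # AMD's publicly stated RDNA3 target
OLD_TBP = 335              # 6950 XT board power (watts), assumed
NEW_TBP = 375              # rumoured N31 board power (watts)

# Uplift over the 6950 XT: perf/watt gain scaled by the power increase.
uplift = PERF_PER_WATT_GAIN * (NEW_TBP / OLD_TBP)
print(f"Projected uplift over 6950 XT: {uplift:.2f}x")   # ~1.68x

# Given 1.5x a 6950 XT is ~89% of a 4090 (Techspot/HUB figures):
vs_4090 = uplift * (0.89 / PERF_PER_WATT_GAIN)
print(f"Projected raster performance vs 4090: {vs_4090:.0%}")  # ~100%
```

Which is why, on these figures, a 375W N31 lands at rough 4090 parity in raster.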

There are also the specs. The last time AMD increased shader count by 2.5x, they increased performance by 56%, but that was at effectively the same clock speed (a 20 MHz regression) and with no ROP increase. That is broadly similar to what AMD seem to be doing this time around (a small WGP bump vs N21 but a huge shader-count bump), except AMD are also increasing clock speeds by a substantial amount.
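That precedent can be turned into a rough projection. The 2.5x shader jump and +56% uplift are the figures from the previous generation as stated above; the +20% clock increase is purely a placeholder assumption, not a leaked figure.

```python
# Scaling efficiency of the previous 2.5x shader-count jump.
shader_ratio = 2.5
past_uplift = 1.56
efficiency = past_uplift / shader_ratio
print(f"Per-shader scaling efficiency: {efficiency:.0%}")  # 62%

# If N31 repeats that efficiency but also clocks higher (assumed +20%):
clock_ratio = 1.20
projected = shader_ratio * efficiency * clock_ratio
print(f"Projected uplift with the clock bump: {projected:.2f}x")  # ~1.87x
```

Crude as it is, that lands in the same region as the perf/watt maths above.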

EDIT: It reminds me of the rumours prior to RDNA2 launching, when the leakers were saying 2080 Ti / 3070-tier performance at best while the maths said 2x the 5700 XT at 4K. AMD delivered 2x the 5700 XT at 4K, because that is what 300W with a 50% perf/watt gain worked out to.
 
Yeah, they have consistently stated +50% for this gen, as they did for RDNA2 before it. If it exceeds 50%, the performance will still be really good. As long as they don't copy nonsense like the gimped 4060 Ti edition for £900, a 7800 non-XT, if they make one, will be the same value proposition as the 6800 non-XT was.
 
Depends on how these things are calculated, I suppose.

For example, Nvidia claimed a +100% increase in perf/watt for RTX 4000, but based on reviews it looks more like 60-70%, unless you power limit the GPU to 200W; at 200W it does show up as 100% faster.
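A quick sketch of why the measurement point matters. All numbers here are assumptions chosen to mirror the claim above, not review data.

```python
def perf_per_watt_gain(new_perf, new_watts, old_perf, old_watts):
    """Perf/watt improvement of a new card over an old one."""
    return (new_perf / new_watts) / (old_perf / old_watts) - 1

# At stock: assume a 4090 is ~1.65x a 3090 Ti, both drawing ~450W.
print(f"{perf_per_watt_gain(1.65, 450, 1.00, 450):.0%}")  # 65%

# Limited to 200W: assume the 4090 keeps ~89% of stock performance
# while the 3090 Ti drops to ~73.5% of its own stock performance.
print(f"{perf_per_watt_gain(1.65 * 0.89, 200, 0.735, 200):.0%}")  # ~100%
```

Same two cards, two very different headline perf/watt numbers, depending purely on where on the voltage/frequency curve you measure.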
 
AMD should forget about Nvidia's unrealistic pricing and instead focus on delivering better price-to-performance than RDNA2.

A 50% performance boost for each of the new SKUs over its old RDNA2 equivalent, with no more than a 20% price increase, would prove a winner IMO.
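A quick check of the value arithmetic there: +50% performance at +20% price still works out to 25% more performance per pound.

```python
perf_gain = 1.50   # +50% performance over the RDNA2 equivalent
price_gain = 1.20  # no more than +20% price
value_gain = perf_gain / price_gain - 1
print(f"Performance-per-pound improvement: {value_gain:.0%}")  # 25%
```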
 
It's all about what you're getting for your money. If you're tied to the Nvidia ecosystem, people's heads will only turn if AMD smashes it out of the park (which we know basically won't happen).

The majority of punters will just pay the difference to stick with DLSS, RT and the other proprietary features. The same goes for the neutrals who want AMD to be competitive so that their Nvidia purchases get cheaper (which, again, we know won't happen, so it's flawed make-believe).
 
Unless AMD have done worse than the 50% perf/watt gain they have publicly announced, the maths does not work out.

1.5x the 6950 XT's performance is 89% of the 4090's (using Techspot/HUB figures, because their performance delta over the 6950/6900 sits between TPU's on the low end and ComputerBase's on the high end).

Using the Techspot review numbers, scaling TDP and perf/watt on both the 6900 XT and 6950 XT to a 375W N31 has it matching the 4090 in raster.

There are also the specs. The last time AMD increased shader count by 2.5x, they increased performance by 56%, but that was at effectively the same clock speed (a 20 MHz regression) and with no ROP increase. That is broadly similar to what AMD seem to be doing this time around (a small WGP bump vs N21 but a huge shader-count bump), except AMD are also increasing clock speeds by a substantial amount.

EDIT: It reminds me of the rumours prior to RDNA2 launching, when the leakers were saying 2080 Ti / 3070-tier performance at best while the maths said 2x the 5700 XT at 4K. AMD delivered 2x the 5700 XT at 4K, because that is what 300W with a 50% perf/watt gain worked out to.
I can believe in this kind of performance uplift based on previous performance figures and the officially announced performance targets; I think you may be on the money with your deductions.

I can see the 7900 XT being competitive with the 4090 in raster, though I don't know about ray tracing.

With the rumoured costs only slightly higher than RDNA2's, and still much lower than Ada's rumoured costs, maybe they could come in at, say, $1199, which is $100 more than the 6950 XT and the same price as the 4080 16GB.

Then, if they reveal the 7800 XT, they could position it at $899 to match the price of the 4080 12GB, but with performance that may be above the 4080 16GB.
 
Stands to reason, though, given the flagship RTX card uses a die that's about 20% bigger than the next-gen Radeon card's (TBC).

Not necessarily. RDNA cards have tended to have smaller dies anyway and are still competitive with RTX 3000 outside of ray tracing, so just because the RTX 4090 has a 20% larger die doesn't mean it will be faster. In fact, looking at the core layout diagram, the RT and tensor cores take up a significant portion of space on Nvidia's dies; if they weren't there, die sizes between AMD and Nvidia would be similar.

 