
RDNA 3 rumours Q3/4 2022

Sorry for not wanting to trawl through the last dozen pages, but have they revealed performance numbers (even if they're AMD's own numbers)?

I've just started catching up so wanted to know if I'll eventually come across it in some of the media coverage.
Raster between the 4090 and 4080; RT roughly 3090-level.
 
A very disappointing reveal for me. The rumours of 2X ray tracing performance over RDNA2 were what got my hopes up, and that fell short.

I found it disturbing that AMD focused on DP 2.1 and the ability to attain stupid 300-600 FPS in 'competitive games' like Fortnite.

Suddenly the 4K/120 limitation of the 4090 looks like no problem at all, as it's a card that can actually fill that bracket with ray traced goodness.

Huh? You mean like Nvidia did with DLSS 3 and fake frame doubling, which increases latency and creates frames with artifacts, frames that look nothing like the previous or next frame? :rolleyes:
 
It's a divergence. At the moment the two cards can't be compared, because you'd get a completely different picture depending on what you choose:
- pure raster
- raster + DLSS/FSR
- raster + RT
- raster + RT + DLSS/FSR

Even DLSS/FSR settings are not comparable, given how the internet doesn't stop arguing about IQ differences.
My estimate is that Nvidia is a couple of generations ahead of AMD right now, because they have chosen to move away from traditional raster performance. Nothing was stopping them from filling up CUDA cores to the brim or using the die space for more ROPs.

I'd argue it's closer to 2 generations now (independent of RT).

When you consider it... the real big Ada chip is the one that hasn't been released yet: the 4090 Ti or Titan, whatever they call it.

This chip has 18,176 shader cores (the 4090 is the cut-down variant at 16,384 shaders).

The 4080 16GB has 9,728 cores.

The 4080 is the half-size chip... not even the two-thirds-size chip, or the other ways they sometimes cut chips, depending on their basic design and separable segments.

AMD just released a chip that's looking like it will *just* beat the 4080... the half-size Nvidia chip... as their flagship 7900 XTX.

That's two generations... or 1.5, if I'm being more fair.

That's why Nvidia have been able to rebrand their smaller chips as bigger chips since the 6xx series... AMD are so far behind.
 
Gamers Nexus are pointing out that AMD's "8K" isn't 16:9 but 32:9, i.e. 7680x2160, half of proper 8K.

Gamers Nexus did a rushed video where they read off the AMD slides, which said "FSR" and "8K Widescreen", and then moaned about it being deceptive despite literally reading it off the screen: it was in the titles and in regular-sized lettering on the bars, not small print.

Gotta make an attempt to make drama for clicks.
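For what it's worth, the pixel-count arithmetic behind the "half of proper 8K" point is easy to verify, using the resolutions quoted above:

```python
# "8K widescreen" (32:9) vs full 8K UHD (16:9): the widescreen mode
# renders exactly half the pixels of proper 8K.
uw_8k = 7680 * 2160    # 32:9 "8K widescreen", as shown on the slide
full_8k = 7680 * 4320  # 16:9 8K UHD

print(uw_8k)             # 16588800
print(full_8k)           # 33177600
print(full_8k // uw_8k)  # 2, i.e. half the pixels
```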
 
How have they regressed? I might be wrong, but are they charging more than the previous generation while still increasing performance? Correct me if I'm wrong.
The 6900 XT was 58% of the performance of a 3090 in CP2077 at 4K + RT Ultra. The 7900 XTX will be 48% of the performance of a 4090. That's before we factor in DLSS, which is superior both in its basic form and with frame generation; before we consider the heavier modes already in the game (Psycho) and on the way (Overdrive); and before we tweak settings, which will widen the gap further, because Nvidia can scale even better in RT with a little knowledge.

Yes, the 7900 XTX is clearly cheaper, but it's actually worse value, because it has half the performance! More importantly, if you look at absolute numbers instead of relative ones, the 7900 XTX will still be firmly below 30 fps, which is NUTS! And it has more deficiencies than that, but honestly, who cares; this is just a poor showing from AMD overall. No way anyone buying a $1000+ GPU in 2022 is ignoring RT, so their raster numbers vs $ won't save them. At best it puts them at parity with the 4090 for $/fps, but without any of the features.

Honestly, even Vega was a better GPU against Pascal than this is against its competition.
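The value claim in that last paragraph can be sanity-checked with the US launch MSRPs ($999 for the 7900 XTX, $1,599 for the 4090) and the 48% relative-RT figure quoted above; this is a rough sketch, not a benchmark:

```python
# Perf-per-dollar in RT, using the 48% relative-performance estimate above.
# Prices are USD launch MSRPs; the performance index baseline is arbitrary.
xtx_price, rtx4090_price = 999, 1599
rtx4090_perf = 100.0              # index: 4090 = 100
xtx_perf = 0.48 * rtx4090_perf    # 48% of a 4090 in CP2077 4K + RT Ultra

xtx_value = xtx_perf / xtx_price              # ~0.048 perf/$
rtx4090_value = rtx4090_perf / rtx4090_price  # ~0.063 perf/$

print(f"7900 XTX: {xtx_value:.4f} perf/$")
print(f"RTX 4090: {rtx4090_value:.4f} perf/$")
```

On those assumptions the 4090 actually comes out ahead in RT performance per dollar, which is the point the post is making: the lower sticker price doesn't translate to better value once RT is in the picture.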
 

I ain't looking at this tier of card and pricing; I'm looking at the lower end of the stack, where most of the market is.

I have a 3080 FE at £650, and Nvidia wants £1,269 for the 4080 16GB. Yeah, get lost, no chance.

I'll be looking with interest at the 7800 XT to see if it's worth the uplift over the 3080; if not, I guess I'll be sticking with the 3080 for a while.
 

So basically another person who defended Nvidia jacking up prices massively this generation, but when AMD doesn't do it, they're overpriced and it's their fault.

The same thing happened with Kepler and Turing.

Maybe if Nvidia actually priced its RTX 4070 at RTX 3070 prices, and didn't try to sell its RTX 3060 replacement as a £1,000 dGPU, then people might care.

AMD prices its top-level dGPU at the same price as the prior one, and suddenly AMD is doomed and overpriced. Well, blame Nvidia for setting the bar so low that AMD can have a "value" $1,000 dGPU. This is literally defending the doubling of prices against the previous generation.

That means even the RTX 4060 or RTX 4060 Ti that plebs like me will get offered will literally be an RTX 3050 replacement in drag, with rubbish RT performance to match. Plus we get terrible rasterised performance too!

Looking at the FPS I get in Cyberpunk 2077 at QHD native with RT on, even a 2X improvement won't help. I need to reduce the RT effects and use DLSS 2.0 upsampling/cheating to get stable FPS.

Plus, with us Ampere owners now stuck on DLSS 2.0, you need to buy a new RTX 4050... sorry, RTX 4060 Ti, just to get a feature update.

RT has become the new tessellation, except this time Nvidia is making sure the majority of gamers don't get the improvements we should be getting.

This is the same company that gave us the Ti 4200, 6600 GT, 8800 GT and GTX 970. Now they pull tricks like the RTX 4080 12GB and think gamers won't notice.
 
I'm running an older ultrawide with a G-Sync module, so I'm stuck with Nvidia. But now I'm considering dumping my screen in the nearest skip and buying a new one to switch to AMD. I'll probably still save money upgrading from my 1080 Ti.
 
To be fair, the CP2077 "Overdrive" mode was created with only one purpose: to boost Ada sales. It shouldn't be used as a perf metric; it will most likely have custom code that only runs on Ada, using the new RT features. I'm not saying Radeon perf isn't bad, but in a game like CP2077 it's normal to keep regressing, and it would have happened anyway.
 
I haven't seen many people talk about the fact that the RDNA 3 architecture did an Nvidia on the core count: they went for 2x FP32 units per CU.

This means shader counts are not comparable with RDNA 2 without adjustment.

Leakers didn't know this, and that's another reason they thought RDNA 3 was going to be much faster than it actually is: to them, the 7900 XTX looked like a GPU with 12,288 cores against the 6900 XT's 5,120. In reality that 12,288 figure double-counts the dual-issue FP32 units, so you do get double FP32 throughput, but that only helps compute; you don't get the rest of the units required to fill pixels faster.

So when you remove that anomaly and compare apples to apples, the 7900 XTX has 6,144 comparable cores, about 20% more than the 6900 XT, not the ~140% more the leaks implied.

Now we understand why the performance uplift isn't that big: roughly 20% more cores times roughly 15% higher clocks only gets you to about 38%, and the rest of AMD's claimed ~70% uplift has to come from the dual-issue FP32 and other IPC gains.
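As a sanity check on that adjustment, here is the arithmetic with the published spec-sheet shader counts; the ~15% clock uplift is taken from the discussion above and should be treated as approximate:

```python
# Compare RDNA 2 and RDNA 3 flagship shader counts, with and without
# the dual-issue FP32 double-counting removed.
rx6900xt_shaders = 5120
rx7900xtx_listed = 12288                     # marketing count (FP32 counted twice)
rx7900xtx_effective = rx7900xtx_listed // 2  # counted once, like RDNA 2

naive_ratio = rx7900xtx_listed / rx6900xt_shaders        # what the leaks implied
adjusted_ratio = rx7900xtx_effective / rx6900xt_shaders  # apples to apples

clock_ratio = 1.15  # the ~15% clock increase discussed above (assumption)

print(f"naive core ratio:    {naive_ratio:.2f}x")     # 2.40x
print(f"adjusted core ratio: {adjusted_ratio:.2f}x")  # 1.20x
print(f"cores x clocks:      {adjusted_ratio * clock_ratio:.2f}x")  # ~1.38x
```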
 
Resident Evil, 4K, RT enabled, max settings:

7900 XTX: 138 fps
4090 FE: 175 fps
3090 Ti: 102 fps


Seems pretty decent to me. Huge uplift over the 6000 series, big improvement over Ampere.

Sources are the AMD website and the TechPowerUp 4090 FE review.
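Working those figures through (with the caveat that they come from two different sources, so treat the comparison as rough):

```python
# Relative uplifts from the quoted Resident Evil 4K + RT numbers.
# Note: the figures mix an AMD slide with a TechPowerUp review, so this
# is a rough comparison, not a controlled benchmark.
fps = {"7900 XTX": 138, "4090 FE": 175, "3090 Ti": 102}

xtx_over_3090ti = fps["7900 XTX"] / fps["3090 Ti"] - 1
rtx4090_over_xtx = fps["4090 FE"] / fps["7900 XTX"] - 1

print(f"7900 XTX over 3090 Ti: {xtx_over_3090ti:.0%}")  # ~35%
print(f"4090 over 7900 XTX:    {rtx4090_over_xtx:.0%}") # ~27%
```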

That's decent.
Yet they kept the MSRP the same, and still some people are saying the performance increase over the 6000 series is poor, lol.
 