
AMD RDNA3 unveiling event

Doing the math, the performance will be about 25% less than a 4090, with RT at about half the performance (roughly equal to the 30 series in RT).
It's 25-30% less power-hungry (not half the power usage, lol), so that puts efficiency at about the same level.
What math did you do to get 25% less performance than a 4090? According to this 13-game average at 4K, the 4090 scores 145 fps compared to 85 fps for the 6950 XT. If the 7900 XTX is 50% faster, that would be 127.5 fps. 127.5 fps out of 145 fps is about 88%, so 12% slower than a 4090.
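The arithmetic above can be sketched in a few lines; the 1.5x uplift is AMD's claimed figure and the 145/85 fps averages are from the quoted review, so treat the result as a rough estimate, not a benchmark:

```python
# Rough relative-performance estimate from the numbers quoted above.
fps_4090 = 145.0      # 13-game average at 4K (quoted review)
fps_6950xt = 85.0     # same average for the 6950 XT
claimed_uplift = 1.5  # AMD's "up to 1.5x" claim for the 7900 XTX

fps_7900xtx = fps_6950xt * claimed_uplift     # 127.5 fps
deficit = 1 - fps_7900xtx / fps_4090          # fraction behind the 4090
print(f"Estimated 7900 XTX: {fps_7900xtx:.1f} fps, {deficit:.0%} behind the 4090")
```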
 
Loving this. It's a shot in the arm for the market and exactly what AMD needed to pull out of the bag.
I like how all the accounts saying the announcement is poor were created in the last two years...

I think it's... a step of positivity in the right direction. I think. The problem is that it leaves that ultimate halo tier strictly the domain of Nvidia and allows them to do what they will there. Maybe that's not the worst thing, since it is a very small market. I don't know. Just a bit sad that ultimate RT *and* raster performance in a single product is actually pulling further away into its own segment, instead of being a competitive battleground.
 
Well, the RT slides the guy showed had some compelling fps numbers for RT games; Cyberpunk with full RT at 4K, for example, was shown at 62 fps vs the 42 fps the RDNA2 card had. The 4090 only gets over 100 fps due to Frame Generation; what does a 4090 get with just DLSS 2 at the same settings, can anyone confirm? Either way it matters not to me: 3440x1440 QD-OLED 144Hz for a long time yet.


I'm actually quite interested in the XTX now, especially after seeing the mention of the tech partnership for Callisto Protocol. Obviously it would be silly to upgrade for one game alone, so I want to see where else it excels once reviews come out. The power consumption, not needing to upgrade the PSU from my Phanteks 750W AMP, retaining the dual 8-pin power connectors, the slimmer slot profile in the case, the price (hopefully in the UK....).

But if my 3080 Ti runs games like Callisto at max settings with RT enabled at 100 fps, then I have no real need to upgrade for the best experience.


AMD should be congratulated; they have made a good leap in 4K RT performance.

 
I think it's... a step of positivity in the right direction. I think. The problem is that it leaves that ultimate halo tier strictly the domain of Nvidia and allows them to do what they will there. Maybe that's not the worst thing, since it is a very small market. I don't know. Just a bit sad that ultimate RT *and* raster performance in a single product is actually pulling further away into its own segment, instead of being a competitive battleground.

The halo tier this gen is completely out of reach, though. It's not even on the radar for most people.
 
As suspected, a 4090 is not a worthwhile upgrade at 3440x1440 from a decent 30 series
What? It's literally the difference between playing current games at 60 fps and at over 100 fps, more likely 144.

He gets less perf than I do in Cyberpunk with Psycho RT and no DLSS. He shows around 65 fps, but I get around 80.

Maybe he's CPU-limited, or was video capturing on the same machine he was benching on.

Let's face it, games are going to get more demanding with the release of the 4000 series and RDNA3. I wonder how long until games use way more than the max VRAM of old-gen cards, now that we have 16 GB and 24 GB beasts.
 

Is it just me or does this look more like a 7800xt?

Totally agree.

It's probably going to be more than 10% slower but only cost 10% or so less.

Nvidia isn't the only one with stupid names and prices - anyone who can afford the 7900 XT should just buy the XTX instead; it will be better value.

It should be called the 7800 XT and priced at $799, not $899. Until that happens, the $999 7900 XTX is the GPU to get from both a performance and a value perspective.
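The value argument above can be made concrete with a quick perf-per-dollar comparison. The prices are the announced MSRPs; the relative-performance figure for the 7900 XT is a hypothetical placeholder (no independent benchmarks exist yet), so this only illustrates the reasoning:

```python
# Perf-per-dollar sketch. rel_perf for the 7900 XT is an assumption,
# not an announced or measured number.
cards = {
    "7900 XTX": {"price": 999, "rel_perf": 1.00},
    "7900 XT":  {"price": 899, "rel_perf": 0.85},  # assumed ~15% slower
}
for name, c in cards.items():
    # relative performance per $1000 spent
    value = c["rel_perf"] / c["price"] * 1000
    print(f"{name}: {value:.2f} relative perf per $1000")
```

On these assumptions the XTX comes out ahead on value despite the higher sticker price, which is the point the post is making.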
 
What? It's literally the difference between playing current games at 60 fps and at over 100 fps, more likely 144.

He gets less perf than I do in Cyberpunk with Psycho RT and no DLSS. He shows around 65 fps, but I get around 80.

Maybe he's CPU-limited, or was video capturing on the same machine he was benching on.

Let's face it, games are going to get more demanding with the release of the 4000 series and RDNA3. I wonder how long until games use way more than the max VRAM of old-gen cards, now that we have 16 GB and 24 GB beasts.

He's using a 5950X, which, when paired with an RTX 4090, is behind the 5800X3D even at 4K, which in turn is behind Zen 4/13th Gen. So yes, he would be bottlenecked at 4K and even more so at 1440p. It varies by game, but Spider-Man is the worst one he tested, with the 4090 sitting at 50-60% usage while the 3080 he's comparing it to is at 100% usage. The guy isn't a proper reviewer so I can't complain; people should just get their benchmarks from better sources than someone whose entire YouTube career is about reading other people's news he finds online in a mundane voice.
 
The only mental gymnastics here are done by AMD fanboys… every single new product launch.

They didn't even compare it to their previous gen, just RT, which isn't even 2x. That should tell you how much confidence they DON'T have in their product.

:) :) :)

On the other hand, they will bring out their own frame-generation FSR, a 'wannabe DLSS 3', which is good in case you can't switch to a 40-series card right now. Might force Nvidia to get some kind of frame generation working on older-gen cards too.
 
Let's face it, games are going to get more demanding with the release of the 4000 series and RDNA3.
Translation: devs are going to stick a bunch of toggles in the settings that have minimal impact on visual quality but will drop your FPS, so PCMR doesn't feel like they wasted their money. :D
 
What math did you do to get 25% less performance than a 4090? According to this 13-game average at 4K, the 4090 scores 145 fps compared to 85 fps for the 6950 XT. If the 7900 XTX is 50% faster, that would be 127.5 fps. 127.5 fps out of 145 fps is about 88%, so 12% slower than a 4090.
12% slower. Interesting number.

Doesn't the extra 100 W of the 4090 only gain it about 10% performance or something?

What will be really interesting is how performance scales for RDNA3 with overclocking. How much headroom it has is a big unknown as well. Kind of wish AMD did something similar to Nvidia and had AIBs showcasing their cards. If a few of them had three 8-pin connectors, I think the hype would have gone through the roof.
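The efficiency point being argued here can be checked quickly. The 450 W and 355 W figures are the official total-board-power specs for the 4090 and 7900 XTX; the 12% performance gap is the estimate from earlier in the thread, so the ratio is only as good as that assumption:

```python
# Perf-per-watt sketch using official TBP specs and the thread's
# estimated ~12% performance gap (an assumption, not a measurement).
tbp_4090 = 450.0        # W, official spec
tbp_7900xtx = 355.0     # W, official spec
rel_perf_4090 = 1.00
rel_perf_7900xtx = 0.88  # ~12% slower, per the estimate above

ppw_4090 = rel_perf_4090 / tbp_4090
ppw_xtx = rel_perf_7900xtx / tbp_7900xtx
print(f"perf/W ratio (XTX vs 4090): {ppw_xtx / ppw_4090:.2f}")
```

On these numbers the two cards end up within roughly 10-15% of each other on perf/W, which is why the "efficiency is about the same" reading earlier in the thread is plausible.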
 
Something is a bit off with this launch. Are they sandbagging? But you may ask, how?
Well, there were two things:
1. It was rumored that AMD lowered the specs of the reference GPU. I didn't want to believe it, but apparently they did. It's like the 5700 XT all over again (a mid-range die as the top-end card). A die with 128 WGPs and a higher cache, like 192 MB, was rumored. Is that perhaps the 7950 XT? Or will the 7950 just be a higher-clocked version? That's not clear until Nvidia releases the Ti.
2. It's also rumored that RDNA3 scales performance with more power. AIBs from Sapphire to Asus are said to show a decent gap in performance over AMD's reference model due to the higher clocks.
 