
NVIDIA 4000 Series

RTX 4090 vs 7900 XTX scaling test

Performance is measured in 25 W increments.

The 7900 XTX scales better towards the end of the curve and the gap between the cards starts to close, but it's still a massive gap: a 300 W RTX 4090 is still faster than a 700 W 7900 XTX.
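
For anyone curious how a sweep like this could be scripted, here's a minimal sketch for the NVIDIA side. The benchmark command "./my_benchmark" and its FPS output are placeholders, not a real tool, and an AMD card would need rocm-smi or similar instead of nvidia-smi.

```python
# Minimal sketch of a power-limit sweep in 25 W steps (NVIDIA side only).
# Assumptions: admin rights and a headless benchmark you already have --
# "./my_benchmark" and its plain-FPS output are placeholders.
import subprocess

GPU_INDEX = 0
POWER_LIMITS_W = range(300, 476, 25)  # e.g. 300 W up to 475 W in 25 W steps

def set_power_limit(watts: int) -> None:
    # nvidia-smi exposes a software power cap via -pl / --power-limit
    # (needs admin rights and stays within the card's min/max limits).
    subprocess.run(["nvidia-smi", "-i", str(GPU_INDEX), "-pl", str(watts)],
                   check=True)

def run_benchmark() -> float:
    # Placeholder: swap in your own benchmark and parse its average FPS
    # however that tool actually reports it.
    result = subprocess.run(["./my_benchmark", "--headless"],
                            capture_output=True, text=True, check=True)
    return float(result.stdout.strip())

for watts in POWER_LIMITS_W:
    set_power_limit(watts)
    fps = run_benchmark()
    print(f"{watts} W -> {fps:.1f} fps")
```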

 
I just won't buy anything. RDNA3 is not good enough and everything outside of the 4090 is priced and named a tier too high. You could even argue the 4090 should be less cut down than it is, but it does offer great performance at a fair price without any compromises.
I don't get all the defending of the RTX 4090 (when I find the value hideous), but I guess there are two types of buyer: one focused on value, one focused more on having the best performance and features.

You don't have to buy the RTX 4090 just because it's the flagship/biggest e-peen card. That is just giving Nvidia a licence to charge more each generation for the top tier (and the prices of the whole generation get pulled up).

I think the market is broken, because even used flagship cards cost almost the same as new; you have to wait for the next gen for used prices to drop to a more reasonable level. Buying used is, of course, probably the last thing AMD or Nvidia would like people to do.
 
I don't get all the defending of the RTX 4090 (when I find the value hideous), but I guess there are two types of buyer: one focused on value, one focused more on having the best performance and features.

$1600 for the top card is not way OTT IMO. It is in line with the prior Titans (although they did have a driver advantage for some workloads), the SLI-on-a-stick cards, or simply buying 2x 980 Ti or whatever the top card of the generation was once dual-GPU cards fell out of fashion. There has always been a product at this price point, but with SLI no longer being a thing it is now occupied by a huge single GPU.

RTX 4090 vs 7900 XTX scaling test

Performance is measured in 25 W increments.

The 7900 XTX scales better towards the end of the curve and the gap between the cards starts to close, but it's still a massive gap: a 300 W RTX 4090 is still faster than a 700 W 7900 XTX.

Interesting that the 7900 XTX has pretty linear scaling up to a point. I wonder if it backs off at the top end due to a lack of bandwidth, since the RAM stays at 2800 MHz for all power levels.

Makes me think AMD intended the XTX to clock higher, with V-Cache MCDs to make up the bandwidth, but they just could not get it to clock that high in a sane power envelope, so the XTX we have is probably what they planned to sell as the XT for $900.
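
Quick back-of-the-envelope on that bandwidth point, assuming the usual GDDR6 convention of 8x the reported memory clock per pin and the XTX's 384-bit bus:

```python
# Back-of-the-envelope check on the fixed-bandwidth idea. Assumes the usual
# GDDR6 convention that the reported memory clock runs at 8x per pin
# (2800 MHz -> 22.4 Gbps) and the XTX's 384-bit bus.

def bandwidth_gb_s(bus_width_bits: int, memory_clock_mhz: float) -> float:
    gbps_per_pin = memory_clock_mhz * 8 / 1000  # effective data rate per pin
    return bus_width_bits * gbps_per_pin / 8    # bits -> bytes

print(bandwidth_gb_s(384, 2800))  # ~1075 GB/s, the same at every power level

# The core clock keeps climbing as the power limit rises, so the bytes
# available per shader clock shrink -- one plausible reason the scaling
# flattens out at the top end.
```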
 
Imagine if that had 4070 performance and 16GB of VRAM (or even 12) for that money.

I bet they wouldn't be able to make enough of them.
I suppose they could do something like a 128-bit 4070 with 16GB. They could call it something like a GS or... RS 4070, in homage to some of those old oddball cards with weird memory bus configs like the 8800 GS.

Either way it'd probably die at 4K, but it could probably be... not more expensive than the regular 4070.
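
Rough bandwidth numbers for that hypothetical 128-bit 16GB card (pure speculation, not a real SKU), assuming the same 21 Gbps memory as the regular 4070:

```python
# Rough numbers for the hypothetical 128-bit, 16GB card floated above
# (speculation, not a real SKU), assuming the same 21 Gbps memory as the
# regular 4070.

def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8  # bits -> bytes

# A 128-bit bus is four 32-bit channels; 16GB would mean 8 x 2GB modules
# in clamshell mode (two chips per channel), which adds capacity but no
# extra bandwidth.
print(bandwidth_gb_s(128, 21))  # ~336 GB/s for the hypothetical card
print(bandwidth_gb_s(192, 21))  # ~504 GB/s for the regular 12GB 4070

# Roughly a third less bandwidth than the regular 4070 is why it would
# likely struggle at 4K despite the extra VRAM.
```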
 
I just won't buy anything. RDNA3 is not good enough and everything outside of the 4090 is priced and named a tier too high. You could even argue the 4090 should be less cut down than it is, but it does offer great performance at a fair price without any compromises.
Well, this and Turing were the two biggest opportunities AMD had to send the leatherman back to the drawing board. It should have been a complete annihilation, but... they dropped the ball again. I feel more disappointed with AMD deciding basically not to compete than with Nvidia, which decided to push the whole midrange up to the stratosphere.
 
AMD have been putting all their efforts into the CPU market and have ignored the GPU side, but now that they have a commanding position in the CPU market and have given Intel a good beating, they may decide to move more resources and attention to the GPU market, if for no other reason than to get in on the AI gold rush.
 
I don't get all the defending of the RTX 4090 (when I find the value hideous), but I guess there are two types of buyer: one focused on value, one focused more on having the best performance and features.
Because you can keep it until the $1.5k RTX 7060 comes out in six years' time and save yourself a load of hassle. ;)

With 1% more money for 1% more performance, that's how these things work now.
 
I thought the same, but then it made me wonder what the real 4070 Ti performance would've been; in that chart the 4070 Ti (the 4080 12GB) is 28 fps slower than the 4080, so the real 4070 Ti would've been even lower. :eek:
They wouldn't have released a 4070 Ti and would just have had the 4070 we have now, maybe priced $50-100 higher.
 

What a fantastic video by Digital Foundry again, over to you Steve/HUB ;) :cool:
 
Last edited: