
NVIDIA 4000 Series

Considering the 4090 lets you raise the PL to 600W, and it has spiked at 666W, and Nvidia did say that when they ran cards in their lab at 700-800W they melted PSUs and even the GPUs themselves, I'd say there's not a lot of scope for more power-hungry cards.

Possibly faster VRAM and a slight power increase might enable them to squeeze in a 4090Ti, but it appears unable to pull much more power than the 4090 does before causing problems.
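As a rough sanity check on the headroom being described, here's a quick back-of-the-envelope sketch (the 450W stock power target is my assumption of the 4090's default TGP; the 600W, 666W and 700W figures are the ones quoted above):

```python
# Rough power-headroom arithmetic for the figures discussed above.
# stock_pl = 450 W is assumed as the 4090's default power target;
# the other numbers come straight from the post.
stock_pl = 450   # W, assumed default power target
max_pl = 600     # W, maximum power-limit slider
spike = 666      # W, observed transient spike
melt_zone = 700  # W, low end of the range where failures were reportedly seen

print(f"Slider headroom over stock: {(max_pl - stock_pl) / stock_pl:.0%}")  # ~33%
print(f"Spike over the 600W limit:  {(spike - max_pl) / max_pl:.0%}")       # ~11%
print(f"Margin left before 700W:    {(melt_zone - spike) / spike:.0%}")     # ~5%
```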

When did they say they melted PSUs? I must have missed that part. Why would a 700-800W draw on a GPU melt a capable PSU? If it's a 1200W+ PSU it should handle it fine. Just put two 12-pin connectors on the card and say it has a 1500W PSU requirement or some such nonsense. Only the lunatics will buy it anyway and get fleeced just before the next gen comes (sorry, 3090Ti owners!).


The full AD102 has 18,432 cores, so there's still scope to bring out a 'full fat' consumer chip at some point. There's just no way Nvidia has shown its full hand for this generation when they can milk the die-hard fans twice in a single gen…
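For a sense of scale on how much silicon is left for a 'full fat' part (the 16,384-core figure for the shipping 4090 is my assumption of its commonly cited spec; the 18,432 figure is from the post above):

```python
# How much of the full AD102 die the 4090 leaves disabled.
full_ad102_cores = 18432   # full AD102, as quoted above
rtx_4090_cores = 16384     # assumed core count of the shipping 4090

extra = full_ad102_cores - rtx_4090_cores
print(f"Disabled cores: {extra}")                                        # 2048
print(f"Core-count headroom for a 4090Ti: {extra / rtx_4090_cores:.1%}") # ~12.5%
```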
 
A lot of reviews I'm seeing are failing to account for bottlenecking when presenting their average percentage gain over the 3090/3090Ti. HU and TPU are two I've just been looking at in detail, and there are a whole bunch of games that are CPU- or engine-bottlenecked at 4K, yet they are just plugging those numbers into the average.

Don't believe me? Go have a look at the charts. They're including games with a 10-20% gain over the 3090 that are engine- or CPU-bottlenecked at 4K, and counting those in their calculation of the average performance gain.

Point is they didn't expect to see this and haven't had time to adapt their methodology, which is completely broken when you have games hitting their non-GPU bottlenecks at 4K. Go look at the chart for Borderlands 3 at TPU for an example, but there are many.

So it's not 45% and not 60% over the 3090 at all. It's more like 80-100% over the 3090 when you exclude the bottlenecked titles.

Pretty egregious oversight from these reviewers.
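To illustrate the averaging problem with made-up numbers (these per-game uplifts are purely hypothetical, not taken from HU or TPU), here's a minimal sketch of how bottlenecked titles drag the headline figure down:

```python
# Hypothetical uplift of a 4090 over a 3090, in percent.
# Titles flagged True are CPU- or engine-bound at 4K, so their number
# reflects the bottleneck rather than the GPU.
results = {
    "Game A": (85, False),
    "Game B": (95, False),
    "Game C": (78, False),
    "Game D": (15, True),   # engine-limited
    "Game E": (20, True),   # CPU-limited
}

all_gains = [gain for gain, _ in results.values()]
gpu_bound = [gain for gain, capped in results.values() if not capped]

print(f"Average over all titles:       {sum(all_gains) / len(all_gains):.0f}%")  # 59%
print(f"Average over GPU-bound titles: {sum(gpu_bound) / len(gpu_bound):.0f}%")  # 86%
```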
You are asking them to cherry-pick results, which is completely biased.
We need to see those results, because at the end of the day there is no CPU bottleneck when you include all the common resolutions. Either the GPU does well at
1080p
1440p
4K
or it doesn't. As we can clearly see, the 4090 simply shines best at 4K. Claiming that it's a CPU bottleneck is a cop-out. They should work on their drivers to alleviate the bottleneck if it's not a hardware limitation.
You forget that whenever you involve the CPU you are actually invoking the graphics card's driver, as that is where the instructions telling the CPU what to do come from.

Therefore, what you are actually describing is a driver problem.
:cry:
 
Point is they didn't expect to see this and haven't had time to adapt their methodology, which is completely broken when you have games hitting their non-GPU bottlenecks at 4K. Go look at the chart for Borderlands 3 at TPU for an example, but there are many.

Pretty egregious oversight from these reviewers.

If they're running the most powerful consumer CPUs currently available, it seems a fair reflection of the card's real-world capability. Consumers looking at buying these cards are also going to be limited by their CPUs.
 
If they're running the most powerful consumer CPUs currently available, it seems a fair reflection of the card's real-world capability. Consumers looking at buying these cards are also going to be limited by their CPUs.
Not when boiling down to an average it isn't. It's a confounding variable. It's junk data. You're testing the GPU, not the CPU. And there are plenty of games that didn't bottleneck. Including them and then saying 'Overall performance of this GPU is 60% better than previous gen' is just a lie. It's not even remotely accurate. If you're a reviewer and you're testing the top speed of a car you don't do it with an 80 mph speed limiter engaged. It's meaningless.
 
Of course it's overkill.

This card is for 4K/120fps+ gamers and 4K triple-screen gamers.

Everyone else buying one is probably wasting money and should be upgrading a different component of their rig if they're coming from a 3080 or better.

Guess I'll upgrade the display then; the LG C2 does 4K 120 and I happen to have one unused. Not that I have to justify it, but I have been PC-less for 2 years, so I want this one to last me.
 



An 850W Corsair PSU, right? Hmm, well, Steve at GN used an EVGA 1600W T2. Now, I know they aren't playing CB in the same location.
But when you have Steve from GN showing roughly 145 FPS at 1440p with DLSS and Ultra/RT while Steve at HUB is only showing 145 FPS without RT (113 FPS with RT/DLSS), power is a factor in those benchmark results regardless of what area they decide to benchmark in.

That 4-pigtail adapter seems to be used for load balancing, and you do need all 4 of them to reach the full boost frequency. Using only 3 results in baseline boost. Not having enough power (i.e. being transient-starved) appears to do the same thing: lower boost frequencies.

But if you don't want to upgrade your PSU and claim all is well (because it will work... you just won't get the best performance), by all means stick with the 850W HUB recommendation. LOL to those using 450-650W. But hey, anything is possible, right?
:cry:

Although I am not a real fan of GN, I do respect their power/temp analysis and benchmark results 150% over HUB. But hey, that's just me, amiright? :p
Those are at different resolutions.
 



An 850W Corsair PSU, right? Hmm, well, Steve at GN used an EVGA 1600W T2. Now, I know they aren't playing CB in the same location.
But when you have Steve from GN showing roughly 145 FPS at 1440p with DLSS and Ultra/RT while Steve at HUB is only showing 145 FPS without RT (113 FPS with RT/DLSS), power is a factor in those benchmark results regardless of what area they decide to benchmark in.

That 4-pigtail adapter seems to be used for load balancing, and you do need all 4 of them to reach the full boost frequency. Using only 3 results in baseline boost. Not having enough power (i.e. being transient-starved) appears to do the same thing: lower boost frequencies.

But if you don't want to upgrade your PSU and claim all is well (because it will work... you just won't get the best performance), by all means stick with the 850W HUB recommendation. LOL to those using 450-650W. But hey, anything is possible, right?
:cry:

Although I am not a real fan of GN, I do respect their power/temp analysis and benchmark results 150% over HUB. But hey, that's just me, amiright? :p
In those screenshots isn’t one 1440p and the other 1080p in the settings above the charts?
 
Not when boiling down to an average it isn't. You're testing the GPU, not the CPU. And there are plenty of games that didn't bottleneck. Including them and then saying 'Overall performance of this GPU is 60% better than previous gen' is just a lie. It's not even remotely accurate. If you're a reviewer and you're testing the top speed of a car you don't do it with an 80 mph speed limiter engaged. It's meaningless.

When you test a CPU you do it with a high-end motherboard and good memory. You don't argue it's meaningless to test it because in the future there will be faster memory released even though that will make it run faster. The theoretical limits of the 4090 are irrelevant, what matters is how fast they will work for you in the games you want to play. Now, if they were slapping these cards into a system with a 10400 or something then, sure, no-one's going to blow two grand on a card and pair it with something like that, but if they take the GPU and put it in a high end setup then you're testing the real world performance of the card in real games.
 
When you test a CPU you do it with a high-end motherboard and good memory. You don't argue it's meaningless to test it because in the future there will be faster memory released even though that will make it run faster. The theoretical limits of the 4090 are irrelevant, what matters is how fast they will work for you in the games you want to play. Now, if they were slapping these cards into a system with a 10400 or something then, sure, no-one's going to blow two grand on a card and pair it with something like that, but if they take the GPU and put it in a high end setup then you're testing the real world performance of the card in real games.
These are two different arguments. I'm talking about boiling figures down to an average. Take the car analogy: you have ten cars and you want to work out the average top speed of all ten, but five of them have a speed limiter engaged. You cannot then say 'the average top speed of these ten cars is 130mph' or whatever it comes to. There is junk data contributing to that number. I'm not arguing that they shouldn't be testing these titles and showing that the 4090 hits their CPU or engine limits (and it is engine limits in some cases, which is even worse). I'm arguing that the number output as an average is junk.

I'm a data scientist. For a living. That's my job. I produce global reports that heavily leverage averages for organisations like the UK government, the World Economic Forum, many others. And I'm telling you that a methodology that presents an average with hidden confounds would be laughed out of the room in my workplace. It's junk. Worthless. The methodology has to exclude it from said average and it doesn't.
 