NVIDIA 5000 SERIES

As I was saying even before reviews dropped today: if anyone on here was planning on upgrading to a 5090 to game at resolutions lower than 4K, it will be a less worthwhile upgrade coming from a 4090. But if you are gaming at 4K, or on one of the 8K monitors and TVs that are going to become a lot more commonplace now that the new HDMI and DisplayPort specifications allow those resolutions at higher refresh rates, then the 5090 makes a lot of sense for future proofing. You also have to take into account the newer game engines to come: big hitters like GTA 6 and the new Witcher on Unreal Engine will benefit even more from the newer card. Yes, it may not seem like a massive upgrade right now on current titles, but when those new games drop and on older cards you're having to drop settings and turn features off just to get a playable experience, then you'll see the true benefit of a more powerful card like the 5090.
The 6090 will be out ahead of the Witcher 4 and GTA 6 PC launches! Maybe the 7090 in the case of Witcher 4. :D
 
Hardware Unboxed's average frame rate increase over the 4090 across 17 games was 12% at 1440p; at 4K this more than doubled to 27%. So yes, anything below 4K for this class of card is pointless, it may as well be a CPU benchmark.

17-game average, 5090 over a 4090

At 1080p - a 2fps improvement, so identical.
At 1440p - 12% faster
At 2160p - 27% faster
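
Just to show why that 2fps reads as "identical" in percentage terms, here's a quick sketch. The raw fps values below are invented purely for illustration; only the resulting percentages match the review.

```python
# Relative uplift from fps pairs. The fps values are hypothetical examples
# chosen to roughly reproduce the review's percentages, not real data.
results = {
    "1080p": (170, 172),  # (4090 fps, 5090 fps) -> ~1%, the "2fps" case
    "1440p": (140, 157),  # -> ~12%
    "2160p": (82, 104),   # -> ~27%
}

for res, (fps_4090, fps_5090) in results.items():
    uplift = (fps_5090 / fps_4090 - 1) * 100
    print(f"{res}: {uplift:.0f}% faster")
```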

Techtesters benchmarked 45 games, and what I found particularly interesting is that in 5 of the games tested the RTX 5090 was 5% to 20% slower than the RTX 4090 at 1080p.
The 5090 was never slower at 1440p and 4K, but this GPU does not like 1080p for some reason.

Reminds me of race cars that behave weirdly at low revs and low speed: the 5090 wants to go hard, push lots of cores and pull lots of power all the time, but it stutters when you're going slow.
 
The 6090 will be out ahead of the Witcher 4 and GTA 6 PC launches! Maybe the 7090 in the case of Witcher 4. :D
The way game release dates have been lately, this wouldn't surprise me at all lol. But hopefully you get what I mean: the more demanding new games get going forward, the extra features of the 5090, such as the faster VRAM, wider bus and higher core count, mean it's going to cope with those situations a lot better than the previous gens will.
 
Techtesters benchmarked 45 games, and what I found particularly interesting is that in 5 of the games tested the RTX 5090 was 5% to 20% slower than the RTX 4090 at 1080p.
The 5090 was never slower at 1440p and 4K, but this GPU does not like 1080p for some reason. Reminds me of race cars that behave weirdly at low revs and low speed: the 5090 wants to go hard, push lots of cores and pull lots of power, but it stutters when you're going slow.
Probably more of an optimization thing where it can't take advantage of all the CUDA cores and leans more on clock frequency, where the 4090 is a bit higher.
 
The 5090 is really at the apex of the curve this time around.
From my AM4 build (5900X and 3090) it would probably be an insane choice to do a whole new build on what looks to be the first card taking a different direction with all the AI stuff.

I paid £2,150 during COVID for the 3090, and I feel the 50 series will lose a lot of value when the 60 series is out.
It would seem the 4090 crew got a great buy; it reminds me of the 1080 and i5 2500K era.

I think my best move would be to try and stumble on a 4090. However, with them melting and having issues with connectors, is it worth the risk on a second-hand 4090?

I wonder if it's better to stick, or possibly go for a 5080 as a last stopgap before a full build with the 60 series.

Amen...thanks for reading my waffle
 
So the 6090 will be £2500 I take it. Where does it end? :) I remember when GPUs were £200 max and you still had money left for toffee!

A bit like when I could pop down the local post office and get a bag of sweets for a threepenny bit, those days are long gone! :P
 
Finally getting to watch some of these review vids. On LTT they’re suggesting that DLSS4 MFG isn’t adding any significant latency over native when reflex is enabled… or did I hear that wrong?
 
Finally getting to watch some of these review vids. On LTT they’re suggesting that DLSS4 MFG isn’t adding any significant latency over native when reflex is enabled… or did I hear that wrong?

All the serious review sites are ignoring MFG completely in their reviews; I suggest you forget the joke that is LTT and try one of those.
 
Undervolting will be your friend this generation

This review tested undervolting in one game, so more data needed please, but the results are quite decent:

RTX 5090 default power profile: 133fps, 575W used

Manual undervolt 0.970V: 133fps, 490W used

Manual undervolt 0.900V: 125fps, 410W used

Manual undervolt 0.875V: 115fps, 350W used


So at least in this one game, you can cut power consumption by 85W and lose no performance, and if you're happy to lose about 6% performance you can cut power by 165W. But remember, with the 0.9V undervolt the performance lost means the 5090 is only around 25% faster than the 4090.
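
A quick perf-per-watt sanity check, using nothing but the figures quoted above:

```python
# Perf-per-watt check on the undervolt figures quoted above.
profiles = {
    "stock (default)":  (133, 575),  # (fps, watts)
    "0.970V undervolt": (133, 490),
    "0.900V undervolt": (125, 410),
    "0.875V undervolt": (115, 350),
}

base_fps, base_w = profiles["stock (default)"]
for name, (fps, watts) in profiles.items():
    efficiency = fps / watts
    fps_loss = (1 - fps / base_fps) * 100
    print(f"{name}: {efficiency:.3f} fps/W, -{fps_loss:.0f}% fps, -{base_w - watts}W")
```

The 0.900V profile works out to roughly 0.305 fps/W against 0.231 at stock, i.e. about a third more frames per watt.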

 
All the serious review sites are ignoring MFG completely in their reviews; I suggest you forget the joke that is LTT and try one of those.
This is mostly down to games not having updates out to make use of this feature yet. Some can have it enabled through the Nvidia app as a stopgap, but I don't think reviewers want to test something that's not widely supported or released yet, as those results could vary greatly come patch day.
 
This is mostly down to games not having updates out to make use of this feature yet. Some can have it enabled through the Nvidia app as a stopgap, but I don't think reviewers want to test something that's not widely supported or released yet, as those results could vary greatly come patch day.

The majority of people don't have 240Hz+ monitors, so MFG 4X would be a useless feature unless you want to use a 40fps base rate on a 165Hz monitor. I personally don't even use 2X frame gen because it feels too floaty. Some people don't feel the latency at all, though.
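
For anyone wondering where that 40fps figure comes from, the arithmetic is just refresh rate divided by the MFG multiplier; the 165Hz and 240Hz figures are the examples from the post above.

```python
# Base frame rate needed for MFG output to land at the monitor's refresh cap.
# With MFG 4X, each rendered frame is followed by 3 generated ones (4x output).
def base_fps_for_cap(refresh_hz: int, multiplier: int) -> float:
    return refresh_hz / multiplier

for hz in (165, 240):
    for mult in (2, 3, 4):
        # 165Hz at 4X gives 165/4 ~ 41fps, i.e. the ~40fps base rate above
        print(f"{hz}Hz monitor, MFG {mult}X: ~{base_fps_for_cap(hz, mult):.0f}fps base")
```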
 
I just skimmed the video with no sound and noticed three things:

1) The subtitle for the first section of the video is "the phenomenal Founders Edition card". Like, wtf, the thing is the new Fermi: it's hot and power hungry. How can you call this card great when the GTX 480 got bagged for the same thing?

2) The first benchmarks in the video are with MFG 4X, 400% more fake frames, yay!

3) When they get to the real results, the 5090 looks like a 4090 Ti lol

DF are known mouthpieces for Nvidia. They don't even hide it anymore, to be fair.
 
Finally getting to watch some of these review vids. On LTT they’re suggesting that DLSS4 MFG isn’t adding any significant latency over native when reflex is enabled… or did I hear that wrong?

Yes, Optimum and Daniel Owen were both impressed enough, even with the visual artifacts, to say they'd use it for single-player games over a lower frame rate.
 
The FE card is a bit noisy while gaming, so TechPowerUp lowered the fan speed slightly to a more tolerable 35dB, which is a standard fan noise level.

Guess the temps?

85C on the GPU core
96C on the memory
100C+ on the hotspot? Who knows, Nvidia blocks the hotspot sensor on the 5090 from the user haha


Funnily enough, these temps are almost exactly what I predicted a few weeks back just by applying some estimation based on the TDP and cooling solution. I expected around 83C on the core, so it's actually slightly hotter than I expected.

And remember, these temps are inside an air-conditioned room. If you test the card in the middle of summer with no AC, add another 10C to these numbers.
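
For what it's worth, here's a minimal sketch of that kind of estimate, assuming a simple steady-state model (core temp ≈ ambient + power × thermal resistance). The 0.105C/W cooler figure is an assumed placeholder picked to reproduce the ~83C prediction, not a measured value.

```python
# Rough steady-state estimate: T_core = T_ambient + P * R_th.
# R_TH is an assumed thermal resistance for the cooler (hypothetical value).
R_TH = 0.105  # C per watt, assumption

def core_temp_estimate(ambient_c: float, power_w: float) -> float:
    return ambient_c + power_w * R_TH

print(core_temp_estimate(23, 575))  # air-conditioned room: ~83C
print(core_temp_estimate(33, 575))  # hot summer room, no AC: ~93C (+10C)
```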

Is the TPU test an open bench like GN's, or in a case? Proper testing would be inside a case, like the end customer would be using.

As for memory temps, GDDR6 has a max temp of 105C according to the datasheet, but Micron doesn't even publish the max temp for GDDR7. At over 100C I'm pretty sure it will start throttling.
 