NVIDIA 5000 SERIES

Wattage means nothing. I run the 4090 undervolted, which means around 250 watts in games on average vs. its stock 450 watts :p

All the time? I run my Strix 4090 underclocked/300W in some older games for efficiency. For new AAA games, though, you need 100% of the 4090's performance, and even then it can struggle at 4K.

More often than not I'm running my Strix overclocked, where it uses 500-600W, as it obviously gives more performance than stock or undervolted. Sure, I could turn down the graphics, but I didn't buy a 4090 to do that :D
 
It is at 100% performance at all times if I uncap the framerate. I run games at 100-120fps if they are fast paced; otherwise it's 80-90fps, as in Control, Horizon, etc. I want high performance but silent gaming, and that is exactly what the 4090 has afforded me. There is no reasonable benefit to running a third-person game at, say, 160fps, drawing more watts and generating more heat and noise, when 100fps gives the exact same experience with none of those drawbacks. Only a 4090 can deliver that, which is why efficient high-end gaming is what it's all about now.
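(If you want to verify the wattage side of this on your own card, here's a minimal sketch that logs GPU power draw and load once a second while you play. It assumes an NVIDIA driver plus the nvidia-ml-py (pynvml) Python bindings; nothing in it is specific to this thread's setup.)

[code]
# Sketch: poll GPU power draw and load once a second (Ctrl+C to stop).
# Assumes an NVIDIA driver and the nvidia-ml-py bindings:
#   pip install nvidia-ml-py
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
        load = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        print(f"power: {watts:6.1f} W | GPU load: {load:3d} %")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
[/code]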
 
> All the time? I run my Strix 4090 underclocked/300W in some older games for efficiency. For new AAA games, though, you need 100% of the 4090's performance, and even then it can struggle at 4K.
>
> More often than not I'm running my Strix overclocked, where it uses 500-600W, as it obviously gives more performance than stock or undervolted. Sure, I could turn down the graphics, but I didn't buy a 4090 to do that :D
The 4090 has very steep diminishing returns towards the top end of its power curve; it needs the last 100W to get the last 3% of performance or so. You can cap it at 350W and barely notice the difference :p
 
Yes, back when I used to power cap instead of undervolt, I found that even an 85% power limit in MSI Afterburner resulted in a mere 4fps drop on average, which in practice meant losing about 2fps. The card is specced for far more power draw than it needs, and because of that you have a lot of headroom to cap away and lose basically nothing. The advantage of undervolting instead is that you gain a few fps rather than lose them (the card locks to its boost clock and stays there at all times) while still reducing power draw, so plain power limiting becomes obsolete. You can also freely raise the VRAM clock by 1.1GHz, as every 4090 will do that without breaking a sweat.

I expect the 5090 will be even more efficient in this area, drawing undervolted-4090 power while still delivering whatever % gains it has over the 4090 at stock.
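(The 85% cap described above can also be applied programmatically; the Afterburner power-limit slider drives the same driver setting. A rough sketch via NVML, assuming the nvidia-ml-py (pynvml) bindings and an elevated (admin/root) shell, which the driver requires for changing power limits:)

[code]
# Sketch: set an 85% power cap through NVML, i.e. what the Afterburner
# power-limit slider does. Run elevated (admin/root).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

stock_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)  # e.g. 450000 mW on a stock 4090
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

target_mw = max(min_mw, int(stock_mw * 0.85))  # 85% cap, clamped to the card's floor
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
print(f"power limit set to {target_mw / 1000:.0f} W (stock {stock_mw / 1000:.0f} W)")

pynvml.nvmlShutdown()
[/code]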
 
> It is at 100% performance at all times if I uncap the framerate. I run games at 100-120fps if they are fast paced; otherwise it's 80-90fps, as in Control, Horizon, etc. I want high performance but silent gaming, and that is exactly what the 4090 has afforded me. There is no reasonable benefit to running a third-person game at, say, 160fps, drawing more watts and generating more heat and noise, when 100fps gives the exact same experience with none of those drawbacks. Only a 4090 can deliver that, which is why efficient high-end gaming is what it's all about now.

Eh? The 4090 is not powerful enough to get 100+FPS in all games at 4K.

If you want more than stock 4090 performance, you have to overclock and pump up the voltage, which of course increases wattage. Even with my 4090 Strix at 500-600W, I still can't break 100FPS in some games, hence the 5090 can't arrive soon enough.

Limiting the 4090 to 250W is great for efficiency, but it's going to be slower than a stock 4090 at 4K in the latest games.
 
> The 4090 has very steep diminishing returns towards the top end of its power curve; it needs the last 100W to get the last 3% of performance or so. You can cap it at 350W and barely notice the difference :p

Silicon quality affects this too - some chips simply leak more power than others. But yes, overall you get diminishing returns when overclocking and exceeding 500W.

Though every frame counts when you're aiming to play at 4K above 60FPS or close to 100FPS. If I were concerned about electricity usage, I wouldn't have bought the most expensive gaming GPU on earth :cry:
 
> Eh? The 4090 is not powerful enough to get 100+FPS in all games at 4K.
>
> If you want more than stock 4090 performance, you have to overclock and pump up the voltage, which of course increases wattage. Even with my 4090 Strix at 500-600W, I still can't break 100FPS in some games, hence the 5090 can't arrive soon enough.
>
> Limiting the 4090 to 250W is great for efficiency, but it's going to be slower than a stock 4090 at 4K in the latest games.

I believe he runs a 3440x1440 monitor and then uses DLSS or whatever it is. It's not 'proper' 4K.
 
Huh? I've got both 4K QD-OLED and ultrawide QD-OLED. I game on the 32" 4K.

> Silicon quality affects this too - some chips simply leak more power than others. But yes, overall you get diminishing returns when overclocking and exceeding 500W.
>
> Though every frame counts when you're aiming to play at 4K above 60FPS or close to 100FPS. If I were concerned about electricity usage, I wouldn't have bought the most expensive gaming GPU on earth :cry:
It can get 100fps in basically every game - some with frame gen, some with DLSS, some native. I play at 4K, yes. You do not need to pump up the voltage; I know because that's literally how I've had the card running from new.

Re-read what I posted; it has nothing to do with electricity usage.

Also watch any of my gameplay videos showing RTSS or any of my screenshots in various game threads.

I'm not one of the "it has to be native 4K or else" numpty crowd :cry:
 
> Though every frame counts when you're aiming to play at 4K above 60FPS or close to 100FPS. If I were concerned about electricity usage, I wouldn't have bought the most expensive gaming GPU on earth :cry:
Are we counting DLSS as 4K here? Because the only game that's ever taken me under 60FPS was Indiana Jones with path tracing. Most stuff hits my 117FPS target.
 
Would be nice for Dave to give some examples - how much extra FPS for how much extra power?
[Chart: der8auer's 4090 power-limit scaling results]

From der8auer's 4090 video back in the day. Starts at the 14:40 mark if you want to watch.

3fps going from 80% to 100% and an extra 7fps going to 130% :cry:
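(Back-of-envelope maths on those numbers, assuming the 4090's stock 450W limit, so 80% is about 360W and 130% about 585W:)

[code]
# Rough efficiency sums from der8auer's figures above; the 450 W stock
# limit is an assumption, giving 80% ~= 360 W and 130% ~= 585 W.
steps = [
    ("80% -> 100%", 360, 450, 3),   # +3 fps
    ("100% -> 130%", 450, 585, 7),  # +7 fps
]
for label, lo_w, hi_w, fps in steps:
    extra_w = hi_w - lo_w
    print(f"{label}: +{fps} fps for +{extra_w} W ({extra_w / fps:.0f} W per extra frame)")
# Overall: +10 fps for +225 W between the 80% and 130% limits.
[/code]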
 