Blackwell GPUs

It is at 100% performance at all times if I uncap the framerate. I run all games at 100-120fps if they are fast paced, otherwise it's 80-90fps for the likes of Control, Horizon etc. I want high performance but silent gaming, and that is exactly what the 4090 has afforded me. There is no reasonable benefit to running a third-person game at, say, 160fps, drawing more watts and generating more heat and noise, when 100fps gives the exact same experience with none of those downsides. Only a 4090 can deliver that, hence why efficient high-end gaming is what it's all about now.
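If anyone wants to put actual numbers on the capped vs uncapped claim, a quick sensor log does the job. Below is a rough Python sketch using the nvidia-ml-py (pynvml) bindings - the one-minute window and device index 0 are just assumptions, and it only reads sensors, it doesn't change anything on the card. Run it once uncapped and once with the 100fps cap and compare the averages.

[CODE]
# Minimal power/load logger (assumes nvidia-ml-py is installed: pip install nvidia-ml-py)
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system (assumption)

samples = []
for _ in range(60):  # roughly one minute of gameplay
    power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0   # reported in mW
    load = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu     # % of time the GPU was busy
    samples.append(power_w)
    print(f"{power_w:6.1f} W  |  {load:3d}% GPU load")
    time.sleep(1)

print(f"Average board power: {sum(samples) / len(samples):.1f} W")
pynvml.nvmlShutdown()
[/CODE]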
 
All the time? I run my Strix 4090 underclocked/300W in some older games for efficiency. Though for new AAA games, you need 100% of the 4090 performance and even then it can struggle at 4k.

More often than not I'm running my Strix overclocked where it uses 500-600W, as obviously I get more performance compared to stock or undervolted. Sure I could turn down the graphics, but I didn't buy a 4090 to do this :D
The 4090 has very steep diminishing returns towards the top end of its power curve, it needs the last 100W to get the last 3% of performance or something. You can cap it at 350W and barely notice the difference :p
 
Yes, when I used to power cap instead of undervolt, I found that even a power limit as low as 85% in MSIAB resulted in a mere 4fps drop on average, which means the nominal fps drop was about 2fps. The card is specced for far more power draw than it needs, but because of that you have a lot of headroom to cap away and lose basically nothing. The plus of undervolting instead is that you gain a few fps rather than lose them (the card locks to the boost clock and stays there at all times) whilst still reducing power draw, so plain power limiting becomes obsolete. Oh, and you can freely up the VRAM by 1.1GHz, as every 4090 will do that without breaking a sweat.

I expect the 5090 will be even more efficient in this area and draw 4090 UV power whilst still getting whatever % gains it has over the 4090 at stock.
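For what it's worth, that sort of power cap doesn't have to be done through MSIAB - NVML exposes the board power limit too. A rough sketch in Python with nvidia-ml-py below; the 0.85 factor is just the example from above, it needs admin rights to actually apply, and undervolting (the V/F curve offset) isn't exposed this way, only the power limit.

[CODE]
# Rough equivalent of dragging the power-limit slider to 85% (assumes nvidia-ml-py).
# Needs admin/root to actually apply; undervolting is not exposed via NVML.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)         # stock limit in mW
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)

target_mw = int(default_mw * 0.85)                 # the 85% cap mentioned above
target_mw = max(min_mw, min(target_mw, max_mw))    # clamp to what the vBIOS allows

pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)
print(f"Power limit set to {target_mw / 1000:.0f} W (default {default_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()
[/CODE]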
 
It is at 100% performance at all times if I uncap the framerate. I run all games at 100-120fps if they are fast paced, otherwise it's 80-90fps for the likes of Control, Horizon etc. I want high performance but silent gaming, and that is exactly what the 4090 has afforded me. There is no reasonable benefit to running a third-person game at, say, 160fps, drawing more watts and generating more heat and noise, when 100fps gives the exact same experience with none of those downsides. Only a 4090 can deliver that, hence why efficient high-end gaming is what it's all about now.

Eh? The 4090 is not powerful enough to get 100+FPS in all games at 4k.

If you want more than stock 4090 performance, you have to overclock and pump up the voltage, which of course increases wattage. Even with my 4090 Strix at 500-600W, I still can't break 100FPS in some games, hence why the 5090 can't arrive soon enough.

Limiting 4090 to 250W is great for efficiency, but it's going to be slower than a stock 4090 in 4k in the latest games.
 
The 4090 has very steep diminishing returns towards the top end of its power curve, it needs the last 100W to get the last 3% of performance or something. You can cap it at 350W and barely notice the difference :p

Silicon quality affects this also - some cards just have higher wastage than others. But yes, overall you get diminishing returns when overclocking and exceeding 500W.

Though every frame counts when you're aiming to play 4k at > 60FPS or close to 100FPS. If I was concerned about electricity usage, I wouldn't have bought the most expensive gaming GPU on earth :cry:
 
Eh? The 4090 is not powerful enough to get 100+FPS in all games at 4k.

If you want more than stock 4090 performance, you have to overclock and pump up the voltage, which of course increases wattage. Even with my 4090 Strix at 500-600W, I still can't break 100FPS in some games, hence why the 5090 can't arrive soon enough.

Limiting 4090 to 250W is great for efficiency, but it's going to be slower than a stock 4090 in 4k in the latest games.

I believe he runs a 3440x1440 monitor and then uses DLSS or whatever it is. It's not 'proper' 4K.
 
Huh? I've got both 4K QD-OLED and ultrawide QD-OLED. I game on the 32" 4K.

Silicon quality affects this also - some cards just have higher wastage than others. But yes, overall you get diminishing returns when overclocking and exceeding 500W.

Though every frame counts when you're aiming to play 4k at > 60FPS or close to 100FPS. If I was concerned about electricity usage, I wouldn't have bought the most expensive gaming GPU on earth :cry:
It can get 100fps in basically every game, some with frame gen, some with DLSS, some native. I play at 4K, yes. You do not need to pump up the voltage. I know because it's literally how I've had the card running from new.

Re-read what I posted, it has nothing to do with electricity usage.

Also watch any of my gameplay videos showing RTSS or any of my screenshots in various game threads.

I'm not one of the "it has to be native 4K or else" numpty crowd :cry:
 
Though every frame counts when you're aiming to play 4k at > 60FPS or close to 100FPS. If I was concerned about electricity usage, I wouldn't have bought the most expensive gaming GPU on earth :cry:
Are we counting DLSS as 4K here? Because the only game that's ever taken me under 60FPS was Indiana Jones with path tracing. Most stuff hits my 117FPS target.
 
Would be nice for Dave to give some examples - how much extra FPS for how much extra power?
[Chart: average fps at different power limit settings]


From der8auer's 4090 video back in the day. Starts at the 14:40 mark if you want to watch.

3fps going from 80% to 100% and an extra 7fps going to 130% :cry:
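Putting rough numbers on that: assuming a 450W stock TGP (so 80% is about 360W and 130% about 585W) and an illustrative ~97fps at the 80% limit, the quoted deltas work out as below. Only the +3fps and +7fps steps come from the video; the wattages and baseline fps are assumptions.

[CODE]
# Back-of-envelope on the quoted der8auer deltas. Assumptions: 450 W reference TGP
# and ~97 fps at the 80% limit; only the +3 and +7 fps steps come from the post above.
steps = [
    (0.80, 97),          # assumed baseline at the 80% power limit
    (1.00, 97 + 3),      # +3 fps for the last 20% of power
    (1.30, 97 + 3 + 7),  # +7 more fps at 130%
]

reference_w = 450  # assumed stock 4090 TGP
for (p0, f0), (p1, f1) in zip(steps, steps[1:]):
    extra_w = (p1 - p0) * reference_w
    gain_pct = (f1 - f0) / f0 * 100
    print(f"{p0:.0%} -> {p1:.0%}: +{f1 - f0} fps (+{gain_pct:.0f}%) for +{extra_w:.0f} W")
[/CODE]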
 
Basically exactly as I said :p

I apologise, I wasn't aware of that. I thought you were just using that upscale thingy whose name escapes me - the thing you set the factors for so you can then game at a res higher than native.

How come you use that though? I thought it was all about ultrawide master race :p

I actually put my C2 in 3840x1600 the other day and yeah still don't get it.
 
I apologise, I wasn't aware of that. I thought you were just using that upscale thingy whose name escapes me - the thing you set the factors for so you can then game at a res higher than native.

How come you use that though? I thought it was all about ultrawide master race :p

I actually put my C2 in 3840x1600 the other day and yeah still don't get it.

Dude, even I upgraded to the same monitor he has after selling my ultrawide Alienware. Been quite some time since he did.

As for the ultrawide master race, it was hilarious when he went for the 4K 16:9 after all his comments for years.

The same will happen with his 4090 when he gets a 5090. Hilarious stuff, I have my popcorn ready :D
 
Dude, even I upgraded to the same monitor he has after selling my ultrawide Alienware. Been quite some time since he did.

As for the ultrawide master race, it was hilarious when he went for the 4K 16:9 after all his comments for years.

The same will happen with his 4090 when he gets a 5090. Hilarious stuff, I have my popcorn ready :D

I see it's part of the long game. Get a 4k monitor and then need a 5090.

And yes I remember all the drum banging about how ultrawide was the best and the flip flop has happened and now the new thing is the best :p

Like clockwork :p
 
At the time, remember, my comments were based on what was available then - no 240Hz OLED 4K panel existed, so when one did, I upgraded and it all made sense.

I apologise, I wasn't aware of that. I thought you were just using that upscale thingy whose name escapes me - the thing you set the factors for so you can then game at a res higher than native.

How come you use that though? I thought it was all about ultrawide master race :p

I actually put my C2 in 3840x1600 the other day and yeah still don't get it.
DLDSR, yeah. I used it in a few games back when I was using the ultrawide, but obviously there's no need on a native 4K panel, and all my gaming is done on that now because it's a gen 3 panel with some pros over the gen 1 QD-OLED found in all 34" ultrawides today: 240Hz refresh, better panel characteristics, slightly tighter colour calibration from the factory and, biggest of all for me, no annoying fan noise like the DW has.

Also, I've realised a 4K output just looks nicer with the extra vertical height, so I'll only go back to ultrawide when the 40" panels are out, as they will retain the height of a 32" 16:9 panel whilst adding the extra width of 21:9.

Ultrawide is still better for immersion and multitasking, but it has to be the right size for the res, which is now 5120x2160 for 21:9 at 39 or 40" - anything lower is not sufficient.
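The height claim is easy to sanity-check - panel height follows from the diagonal and the aspect ratio. Quick Python sketch below (nominal 21:9 and marketing diagonals assumed, so real panels will be a few millimetres off either way):

[CODE]
# Physical panel height from diagonal and aspect ratio: height = diag * h / sqrt(w^2 + h^2)
import math

def panel_height_inches(diagonal, aspect_w, aspect_h):
    return diagonal * aspect_h / math.hypot(aspect_w, aspect_h)

for label, diag, w, h in [
    ('32" 16:9 (3840x2160)', 32, 16, 9),
    ('34" 21:9 (3440x1440)', 34, 21, 9),
    ('40" 21:9 (5120x2160)', 40, 21, 9),
]:
    print(f"{label}: {panel_height_inches(diag, w, h):.1f} in tall")
[/CODE]

That comes out at roughly 15.7in, 13.4in and 15.8in respectively, i.e. the 40" 21:9 keeps the height of a 32" 16:9 while today's 34" ultrawides give up over two inches of it.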
 
I only play HDR in games that have a good HDR implementation built in - to date I can only think of two, maybe three, that do. Everything else is either too over-exposed on highlights, lacking black detail, or something just doesn't feel right. I don't bother with RTX HDR or Windows Auto HDR either, as no game I've tried them on has produced nice enough results to warrant using them. Some people find them fine, but I want absolute picture quality/accuracy, so if that means playing SDR in games where the HDR is lacking, then yeah, that's what I prefer.
 