Higher Hz screen causing GPU temp increase?

Yeah, my luck is also terrible.
I have no real choice but to laugh at it. I used to get down about it, but now, ahh well, I'm still alive.
There's a lot of people who can't say the same [well, they can't say anything...], so yeah, what can't kill us can only make us laugh
and cry at the same time lol
I can relate to that, dude, so fair play; that's the best way to look at it. The pros still outweigh the cons. Chin up and all that, mate :)
 
Back to the OP.
I'd start with monitoring clocks. It might even be that 2D at 240Hz is okay while 300Hz is not, and then you can decide whether that is worth it.
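
If you want to log that from a script rather than eyeballing an overlay, something like the sketch below works. It's only a minimal example assuming an NVIDIA card and the nvidia-ml-py package (imported as pynvml); on AMD you'd read the equivalent sensors from the driver or HWiNFO instead.

```python
# Minimal clock/power poller (assumes an NVIDIA GPU and the nvidia-ml-py
# package: pip install nvidia-ml-py). Sit on the desktop at each refresh
# rate for a minute and compare the idle VRAM clock and board power.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)  # MHz
        vram = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)       # MHz
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0                   # mW -> W
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        print(f"core {core:4d} MHz | vram {vram:4d} MHz | {watts:5.1f} W | {temp} C")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

If the VRAM clock sits at its full 3D value on the desktop at 300Hz but drops back at 240Hz, that's your answer.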

While running at 300Hz makes your GPU consume more - presumably mostly due to higher VRAM clocks - sometimes you can override this.

That is, create a new 2D profile with a lower VRAM clock. Finding a lower VRAM clock for 2D/desktop use is certainly a matter of trial and error, as going too low may lead to artifacts, but often the driver team sets it higher than it needs to be. So 2K/4K @ 60Hz might run VRAM at 150MHz, and there might be a second profile - for example 2K @ 120Hz - which pushes it to 300MHz; but often after the second or third "desktop 2D" profile the driver team will just run VRAM at the full default clock.

It's been a long time since I tried this - I think it was in the days of the Radeon HD7950, which had crazy "idle" clocks - so I'm unsure whether the drivers can do it or whether you need Afterburner or similar. I think the current AMD drivers might be able to; unsure about the Nvidia ones. It might be too much effort to save 10W or so, but making things quieter is always nice IMO.
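
For what it's worth, I believe recent Nvidia drivers expose a clock lock through nvidia-smi, which can stand in for a proper per-state profile while you experiment. Purely a sketch: it assumes admin rights, a driver/card that supports --lock-memory-clocks, and the 810MHz value is a placeholder you'd swap for something from your card's supported-clock list.

```python
# Rough sketch of the trial-and-error described above, shelling out to
# nvidia-smi (needs admin rights and a driver that supports the memory
# clock lock; 810 MHz is only a placeholder value).
import subprocess

def run(*args):
    print(">", " ".join(args))
    subprocess.run(args, check=True)

# 1) List the memory clocks the card actually supports.
run("nvidia-smi", "-q", "-d", "SUPPORTED_CLOCKS")

# 2) Pin the memory clock to a low candidate (min and max set the same)
#    and watch the desktop for artifacts/flicker for a while.
run("nvidia-smi", "--lock-memory-clocks=810,810")
input("Memory clock locked - check the desktop for artifacts, then press Enter...")

# 3) Always undo the lock afterwards so 3D loads get full speed again.
run("nvidia-smi", "--reset-memory-clocks")
```

Note it's a global lock rather than a per-profile override, so it's only really useful for finding out whether a lower VRAM clock is stable on the desktop and how much it actually saves.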

Not many places test this kind of thing. I thought ComputerBase did, but I could not find it in their 5070 Ti review, so maybe I misremembered or they no longer do. Even then, I cannot see anywhere doing much more than the basics - TPU only do single- and multi-monitor.

Take that back, I did find it in the CB review:
Their dual-monitor test does show 2 x 4K @ 60Hz and 2 x 4K at 60Hz + 144Hz. The 5070 Ti does pretty well there, "only" increasing by 10W for the latter.
 
Aha!! So you are using a 2nd screen?
Does it still do it with JUST the higher refresh monitor?
I think you've got the problem I was trying to describe above, which tetras then kindly explained better for me.
It's actually not a real problem for me - I got the 2nd GPU to "fix" it more as an excuse to do something slightly less usual with my PC, just as an interest-type project - but I still think you have the issue I was trying to explain,
though instead I derailed the thread a bit.
 
Not too sure. It's only running a degree or two higher, so not something I'm majorly concerned about if it is down to the two monitors running at different resolutions and refresh rates.
 
There isn't really any mystery here.

Higher refresh rate
Higher resolution
2 or more monitors
Higher bitrate
HDR
Etc....

All increase the number of pixels being pushed and in turn increase the demand on the GPU. This has been the case forever; the same behaviour has occurred on all my cards in multi-monitor configurations for the last 20 years.
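
As a quick back-of-the-envelope for the OP's change (the old widescreen's resolution is an assumption here, 3440x1440 being typical):

```python
# Raw pixel throughput for the two setups in the thread. 3440x1440 for the
# old 144Hz widescreen is an assumption; the new monitor is 2560x1440 @ 300Hz.
def pixels_per_second(width, height, hz):
    return width * height * hz

old = pixels_per_second(3440, 1440, 144)   # assumed 144Hz ultrawide
new = pixels_per_second(2560, 1440, 300)   # 27" 1440p 300Hz

print(f"old: {old / 1e9:.2f} Gpx/s, new: {new / 1e9:.2f} Gpx/s ({new / old:.2f}x)")
# -> old: 0.71 Gpx/s, new: 1.11 Gpx/s (1.55x)
```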

If your card is getting stuck at max clocks, then that's a separate issue/bug.
 
Higher resolution and/or refresh rates can result in the GPU staying at a higher power state, yes - 300Hz probably means it can't enter the lowest power state and still drive the monitor.

Just to elaborate on this for completeness. There are broadly two types of behaviour.

1) Frame limited - this can be caused by an app having vsync on or a max FPS set in a game. This limit typically moves upwards with a faster refresh monitor, and Windows tends to keep everything vsynced. Higher refresh means a higher target FPS, so more GPU load and slightly higher temps.

2) Frame unlimited - the GPU always generates new frames as fast as it can, so it runs flat out regardless of the refresh rate. You won't see temp differences.

GPU cooling is designed to handle the card at 100% load, so don't worry about it and enjoy your new monitor :)
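
To make the frame-limited case above concrete, a cap is really just the render loop refusing to start the next frame early, so the GPU sits idle for the rest of each frame interval. A toy sketch (not how any particular game or driver implements it):

```python
# Toy frame limiter: the loop only produces as many frames as the target
# allows, so the GPU idles for the remainder of each frame interval. A
# 300Hz monitor simply raises the target (and the load) for anything
# that is vsynced to it.
import time

def run_capped(render_frame, target_fps, seconds=5):
    frame_time = 1.0 / target_fps
    frames = 0
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()                     # stand-in for the real GPU work
        frames += 1
        spare = frame_time - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)              # the GPU rests here
    print(f"target {target_fps} fps -> {frames / seconds:.0f} fps achieved")

if __name__ == "__main__":
    def cheap_2d_frame():
        time.sleep(0.001)                  # a trivially cheap "frame"
    run_capped(cheap_2d_frame, 144)
    run_capped(cheap_2d_frame, 300)        # same work per frame, ~2x the frames
```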
 
There isn't really any mystery here.

Higher refresh rate
Higher resolution
2 or more monitors
Higher bitrate
HDR
Etc....

All increase the number of pixels being pushed and in turn increase the demand on the GPU. This has been the case forever; the same behaviour has occurred on all my cards in multi-monitor configurations for the last 20 years.

If your card is getting stuck at max clocks, then that's a separate issue/bug.

No no, it's clocking down fine.

Just to elaborate on this for completeness. There are broadly two types of behaviour.

1) Frame limited - this can be caused by an app having vsync on or a max FPS set in a game. This limit typically moves upwards with a faster refresh monitor, and Windows tends to keep everything vsynced. Higher refresh means a higher target FPS, so more GPU load and slightly higher temps.

2) Frame unlimited - the GPU always generates new frames as fast as it can, so it runs flat out regardless of the refresh rate. You won't see temp differences.

GPU cooling is designed to handle the card at 100% load, so don't worry about it and enjoy your new monitor :)

Thank you. The monitor looks really good so far; I've only played BF6 on it, but it looks gorgeous.
 
Hey guys,

Just finished setting up a new smaller monitor as I was having an issue with my widescreen, so went for a 27" 1440p 300Hz monitor.

My GPU temperature while doing basic stuff is a few degrees higher (nothing major) than with the 144Hz widescreen I had previously. Can the jump up in monitor Hz cause the GPU to work a little harder, or is it placebo?

Higher refresh rates absolutely increase power draw.

You can test this yourself. Run a game uncapped, check the power draw, and then cap it to 60fps.

For me, the difference between running a game at 116fps (120Hz TV with Reflex enabled) and capping it to 60fps can be as much as 100W at times.

Using DLSS, FSR, frame gen etc. also reduces power consumption.
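
If anyone wants numbers rather than an overlay reading for that comparison, here's a rough sketch (again assuming an NVIDIA card and the nvidia-ml-py package): run it once uncapped and once with the 60fps cap, then compare the two averages.

```python
# Average board power over a sampling window (assumes an NVIDIA GPU and
# the nvidia-ml-py package). Run once with the game uncapped and once
# capped to 60fps, then compare the printed averages.
import time
import pynvml

def average_power_watts(seconds=60, interval=0.5, gpu_index=0):
    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
    samples = []
    try:
        end = time.perf_counter() + seconds
        while time.perf_counter() < end:
            samples.append(pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0)  # mW -> W
            time.sleep(interval)
    finally:
        pynvml.nvmlShutdown()
    return sum(samples) / len(samples)

if __name__ == "__main__":
    print(f"average board power: {average_power_watts():.1f} W")
```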
 
Higher refresh rates absolutely increase power draw.

You can test this yourself. Run a game uncapped, check the power draw, and then cap it to 60fps.

For me, the difference between running a game at 116fps (120Hz TV with Reflex enabled) and capping it to 60fps can be as much as 100W at times.

Using DLSS, FSR, frame gen etc. also reduces power consumption.

Yeah, I run BF6 with DLSS on quality mode at 1440p high settings and get around 180-200 fps; with my GPU undervolt it peaks around 61C with a power draw of around 200W. That's on a 5070 Ti, love this card!
 