GPU Power Draw at Increased Refresh Rates

You get the same increase in power usage if you plug in more than two monitors.

On AMD cards, you used to get the same thing with only two connected. Idle power consumption of the GPU would go from ~15 to 50+ watts. They did that for years and years, even as recently as the R9 290s. I haven't bothered to check whether the 380s, based on the same design, are still as utterly crap at idle, though I did hear the Furys had finally stopped doing it. Shame it took them so long to bother doing anything about it.
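Not part of the original post, but for anyone who wants to check this kind of idle behaviour on an AMD card themselves, here is a minimal sketch for Linux with the amdgpu driver. The sysfs files used are standard amdgpu interfaces; the card index (card0) and whether an older board such as Hawaii (usually on the radeon driver) exposes them are assumptions.

```python
from pathlib import Path

# Minimal idle check for an AMD GPU on Linux/amdgpu (assumed card index: card0).
dev = Path("/sys/class/drm/card0/device")

# Current core/memory DPM levels: the active state is marked with '*'.
print((dev / "pp_dpm_sclk").read_text())
print((dev / "pp_dpm_mclk").read_text())

# Average board power from the hwmon interface (reported in microwatts), if exposed.
for hwmon in (dev / "hwmon").glob("hwmon*"):
    power = hwmon / "power1_average"
    if power.exists():
        print(f"power: {int(power.read_text()) / 1e6:.1f} W")
```

A memory clock pinned at its top state while the desktop is idle is the usual culprit for the higher multi-monitor idle draw.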
 
That's nothing compared to what the nVidia cards were doing with high refresh rates. I heard of numerous cards hitting around 60 degrees at idle; that's because they were running 3D clocks, not idle clocks.

But at least these things have been fixed now.
 
Nvidia is working to fix G-Sync bug causing high power draw

Just an FYI for those who have a G-Sync monitor.

While running at 144Hz or higher, the site (PC Perspective) noticed that idle system power draw had moved from 76 watts to 134 watts, and at 165Hz things rose further to 137.8 watts. While running at 60Hz, the system was only idling at 73.7 watts, so the increase in power draw was definitely noticeable. It also caused the GPU to idle at 885MHz, running at 30 percent of its maximum TDP (a quick way to log this on your own card is sketched after the link below).

Nvidia has acknowledged the issue and is now working to fix the bug: “We checked into the observation you highlighted with the newest 165Hz G-SYNC monitors. Guess what? You were right! That new monitor (or you) exposed a bug in the way our GPU was managing clocks for GSYNC and very high refresh rates. As a result of your findings, we are fixing the bug which will lower the operating point of our GPUs back to the same power level for other displays.”

http://www.kitguru.net/components/g...ng-to-fix-g-sync-bug-causing-high-power-draw/
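Not from the article, but if you want to verify the idle clock and power behaviour on your own card, here is a minimal sketch using nvidia-smi (assuming it is installed and on your PATH). The query fields are standard nvidia-smi properties; everything else is illustrative.

```python
import subprocess
import time

# Log GPU clocks, board power and P-state while the desktop sits idle, so you can
# compare 60Hz against 144Hz/165Hz yourself.
QUERY = "clocks.gr,clocks.mem,power.draw,pstate"

def sample():
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()  # e.g. "139, 405, 12.3, P8" at a proper idle

if __name__ == "__main__":
    # Leave the desktop idle, switch the monitor's refresh rate, and compare samples.
    for _ in range(10):
        print(sample())
        time.sleep(2)
```

If the card sits at 3D-level clocks with nothing running, you are seeing the same behaviour the article describes.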
 
Nice find. I always run my Swift at 120Hz as I honestly can't tell the difference beyond that, so it doesn't really affect me as clocks stay low, in the 100MHz - 300MHz range when not in games, but good to know Nvidia are looking into it :)
 
Same here, but this bug has been around since 144Hz became available. Surprised they have only just noticed it, but maybe it was just a 144Hz 1440p issue?
 
Nvidia may not be my team, but it's good to see a bug being reported and then fixed (or a fix incoming).

+1

I noticed high idle clocks on my BenQ 1080p 144Hz monitor, which is not G-Sync, so it could be just 144Hz in general.
 
I run at 120Hz and just set 'Highest available' in the control panel per game.

Would be nice to have an official fix for it, however.
 
I'll be amazed if anyone is able to register a difference between 144Hz and 165Hz. That's only a ~15% increase. Although I think the pcper guys claimed they could on the desktop.

Still... I'd like to experience it, though :)
 
In simplistic terms, most people's ability to benefit from higher refresh rates (i.e. for in-game reaction) starts to taper off somewhere just after 80Hz, with most seeing little actual benefit from higher rates somewhere around 120Hz.

There will be some situations, though, where for instance the update rate of something fits nicely as a multiple of the refresh rate, which can be perceived as a better experience than the "judder" you can sometimes get when a frame is presented a little too early or a little too late compared to the average and doesn't fall neatly on a frame boundary. That could mean someone sees 165Hz as a benefit over 144Hz, etc.
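A minimal illustration of that multiple-of-the-refresh-rate point (not from the post; the numbers and the no-adaptive-sync assumption are mine): 60fps content divides evenly into 120Hz, so every frame is held for exactly two refresh cycles, whereas at 144Hz the holds alternate between two and three cycles, which is the judder being described.

```python
import math

def hold_times_ms(fps, refresh_hz, frames=8):
    """How long each of `frames` consecutive frames stays on screen, assuming frame n
    is ready at n/fps seconds and is flipped on the next refresh tick (no G-Sync)."""
    cycle_ms = 1000.0 / refresh_hz
    # Refresh tick on which each frame first appears.
    ticks = [math.ceil(n * refresh_hz / fps) for n in range(frames + 1)]
    return [round((b - a) * cycle_ms, 2) for a, b in zip(ticks, ticks[1:])]

print(hold_times_ms(60, 120))  # [16.67, 16.67, ...]  every frame held 2 cycles: even pacing
print(hold_times_ms(60, 144))  # mix of 20.83 / 13.89  3- and 2-cycle holds: judder
```

Adaptive sync (G-Sync/FreeSync) sidesteps this by timing the refresh to the frame, which is why the mismatch mostly matters at fixed refresh rates.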

Personally I see little if any difference between 120Hz and 165Hz, let alone 144-165 or 120-144, whereas 60-120 is massive.
 