Is your graphics card inefficient or faulty? GPU idle power draw comparison

Hey there! So, recently I purchased an Asus Prime 5070 Ti which had abnormally high idle power draw that I could not fix regardless of what I tried. I was about to accept it and live with it, until the GPU decided to try and burn my house down. I've since got a refund on the card, and I'm still waiting on a PSU warranty replacement, but I thought I'd start a thread so we can compare idle power draws and note any cards that behave abnormally.

To take part, all you have to do is post how much power your GPU draws while it isn't doing any serious task, like gaming. Bonus points if you also include your monitor(s) resolution and refresh rate. You can get this reading from your GPU software (the Nvidia App or AMD Adrenalin), from GPU-Z's "Board Power Draw" sensor, or from an app like HWiNFO. Make sure to include the GPU brand and model.
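
If you'd rather grab a clean number than eyeball a sensor graph, a short script works too. Here's a minimal sketch for Nvidia cards; it assumes nvidia-smi is on your PATH (AMD users would need a different source, e.g. Adrenalin or HWiNFO logging):

    # Sample the board power via nvidia-smi and report min/avg/max.
    # Assumes an Nvidia GPU with nvidia-smi available on the PATH.
    import subprocess
    import time

    SAMPLES = 30
    INTERVAL_S = 2  # seconds between samples

    readings = []
    for _ in range(SAMPLES):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        readings.append(float(out.splitlines()[0]))  # first GPU only
        time.sleep(INTERVAL_S)

    print(f"min {min(readings):.1f}W / avg {sum(readings)/len(readings):.1f}W / max {max(readings):.1f}W")

Leave the machine alone while it samples, or you'll be measuring your browser rather than true idle.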

Please keep in mind that a higher idle is only a concern if it is consistently and significantly higher than other examples of the same model. Some brands' cards simply draw more than others; while a little inefficient, this is completely normal and not a cause for worry.

For reference, the faulty Asus Prime 5070 Ti was drawing between 35 and 40W at idle on a single 1440p 180Hz monitor. Even dropping down to 1080p 60Hz only shaved ~3W off that. In comparison, several friends with the exact same GPU and no issues reported idle draws of 13-20W on a single 1440p monitor, and about 20-24W with two 1440p monitors.

I've also asked around a bit and found several patterns (a rough sanity-check sketch follows the list):
- Gigabyte and Palit cards tend to have a higher idle this generation: usually around 20-25W for a 5070 and ~30W for a 5070 Ti.
- MSI cards have either great or very poor idle draw. Some models are easily under 20W (5070/5070 Ti/5080 tier), while others burst above 40W at times.
- PNY cards generally have pretty good idle power at the 5070 (<20W) and 5070 Ti (<25W) level, but at the 5080 level they go to about 35-40W, considerably higher than average.
- 5070s are usually quite efficient at idle.
- 5080s (<18W is achievable) can even have a lower idle than 5070 Tis (17-28W) for some reason.
- 5090s are power hogs, usually drawing 30-50W at idle.
- The AMD Radeon 9000 series cards are usually efficient, with sub-15W idles. The Radeon 7000 series is less efficient, and suffers from an issue where multiple monitors with mismatched resolutions/refresh rates can cause terrible idle power (over 50W, over 80W, sometimes even reaching 100W).
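
To make the "consistently significantly higher than the same model" test concrete, here's a rough sanity check built from the ballpark figures above. The ranges are just anecdotes collected for this thread, not official specs, so treat them as a starting point:

    # Rough single-monitor idle ranges (W) from anecdotes in this thread.
    # NOT official figures; adjust as more data points come in.
    TYPICAL_IDLE_W = {
        "5070": (7, 25),
        "5070 Ti": (13, 30),
        "5080": (15, 40),
        "5090": (30, 50),
        "Radeon 9000 series": (5, 15),
    }

    def idle_verdict(model: str, watts: float) -> str:
        low, high = TYPICAL_IDLE_W[model]
        if watts <= high:
            return f"{watts}W is within the range reported for a {model}."
        return (f"{watts}W is above the ~{high}W ceiling reported for a {model}; "
                "worth comparing against other owners of the exact same card.")

    print(idle_verdict("5070 Ti", 38))  # e.g. my faulty Asus Prime card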

A few GPU examples (one card per line):
PNY 5070 - 7W
PNY 5070 - 12W
Asus Prime 5070 Ti - 20W single monitor, 23W multi-monitor (all 1440p)

Some data points from various forums, Discords, etc.:
  • 5090
    • PNY - 50W (multi-monitor; Linux)
  • 5080
    • PNY - 39W (multi-monitor)
    • PNY - 35W (multi-monitor)
    • MSI Ventus 3X OC Plus - 30W
    • MSI Shadow 3X - 31W (multi-monitor; first monitor 4K 120Hz, second monitor 1440p 165Hz)
  • 5070 Ti
    • MSI Shadow 3X - 15W (single-monitor); 35W (multi-monitor; 3x 4K monitors)
    • MSI Gaming Trio - 45W
    • MSI Ventus - 18-19W

I'd be happy to hear some more examples!
 
Have had HWMonitor open for a bit as I was monitoring CPU temps... 7900 XT, 1440p 165Hz and 4K 60Hz. Minimum 29W, hovering between 30 and 36W at the moment.
 
To the OP, I suspect we would get more useful data if you also asked for both GPU and memory clocks.
Both "idle" and video-playback power usage have a lot to do with clocks, and the various GPU BIOSes and their respective drivers use some kind of lookup table. Power and clocks for, as an example, 1080p @ 120Hz can be quite different from 1080p @ 144Hz if, for the latter, the driver and/or GPU BIOS decide VRAM clocks have to go from 300MHz to 1000MHz. Sometimes VRAM clocks can be overridden, but that can lead to glitches.
 
19 to 23 watts, RX 7800 XT.

This is actual idle; having a browser or even some other apps open is not idle, as those use GPU acceleration, in which case you will see a high 'idle' power draw: for me about 40 watts, with the memory running at between 1300 and 1500MHz rather than the 45MHz shown here.

I think you can change that by turning off GPU acceleration in those apps.

[attached sensor screenshot]
 
4090 FE
1440p 144Hz
9W idle at desktop
That is a great idle for that tier of card! Congratulations!

Have had HWMonitor open for a bit as I was monitoring CPU temps... 7900 XT, 1440p 165Hz and 4K 60Hz. Minimum 29W, hovering between 30 and 36W at the moment.
Seems normal to me for that tier of card on two monitors with mismatched resolutions and refresh rates (a known Radeon 7000 weak point).

To the OP, I suspect we would get more useful data if you also asked for both GPU and memory clocks.
Both "idle" and video-playback power usage have a lot to do with clocks, and the various GPU BIOSes and their respective drivers use some kind of lookup table. Power and clocks for, as an example, 1080p @ 120Hz can be quite different from 1080p @ 144Hz if, for the latter, the driver and/or GPU BIOS decide VRAM clocks have to go from 300MHz to 1000MHz. Sometimes VRAM clocks can be overridden, but that can lead to glitches.
Yeah, more data would always be good, though I'd also guess that asking for too many clarifications could discourage people from taking part.

19 to 23 watts, RX 7800 XT.

This is actual idle; having a browser or even some other apps open is not idle, as those use GPU acceleration, in which case you will see a high 'idle' power draw: for me about 40 watts, with the memory running at between 1300 and 1500MHz rather than the 45MHz shown here.

I think you can change that by turning off GPU acceleration in those apps.

[attached sensor screenshot]
Yes, I agree, browsers do use hardware acceleration. With that said, many GPUs handle that type of load by adding only a few watts on top, whereas others jump up quite significantly. Others still may spike for a few seconds when a new page loads before returning to idle. You can definitely turn off GPU acceleration for most apps that use it. Your reading looks generally normal to me.
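
If anyone wants to tell a brief page-load spike apart from a genuinely raised idle, log readings for a minute or two and compare the median with the peak; the median shrugs off short bursts. A minimal sketch with made-up sample data:

    # Median vs peak over a sampling window: short spikes (page loads etc.)
    # move the peak but barely touch the median.
    import statistics

    def summarise(samples_w: list[float]) -> str:
        return (f"median {statistics.median(samples_w):.1f}W (sustained idle), "
                f"peak {max(samples_w):.1f}W (likely transient)")

    # Mostly ~14W idle with two short spikes while pages were loading.
    samples = [14, 13, 15, 14, 41, 38, 14, 13, 14, 15]
    print(summarise(samples))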
 
Sapphire 9070 XT Pulse
5120x1440 @ 240Hz
7W idle at desktop.

I'll test my 4090 FE system later.
 
Sapphire 9070 XT Pulse
5120x1440 @ 240Hz
7W idle at desktop.

I'll test my 4090 FE system later.
We've got to give a shoutout to AMD: they went from very hot, power-hungry cards to this amazing efficiency at low loads. 7W on that tier of card, with such an ultrawide, high-refresh monitor, is great!
 
Sapphire 9070 XT Pulse
5120x1440 @ 240Hz
7W idle at desktop.

I'll test my 4090 FE system later.
Idle as in with zero programs open at all?

I've got the same GPU.

Sapphire 9070 XT Pulse
4K @ 240Hz
32W with Firefox & Spotify running, plus FanControl & AMD Adrenalin in the background.
 
Idle as in with zero programs open at all?

I've got the same GPU.

Sapphire 9070 XT Pulse
4K @ 240Hz
32W with Firefox & Spotify running, plus FanControl & AMD Adrenalin in the background.
Yeah, everything closed, just sitting doing nothing on the desktop. Only Adrenalin open to watch the wattage.
 
[attached screenshot]

The second screenshot is with five tabs open and one or two other windows, yet the power consumption and temperature are lower than before. Go figure.

[attached screenshot]
 
We've got to give a shoutout to AMD: they went from very hot, power-hungry cards to this amazing efficiency at low loads. 7W on that tier of card, with such an ultrawide, high-refresh monitor, is great!
But they still run hot when gaming, like the older cards did, to the point that everyone in the owners' thread is undervolting them.
 
But they still run hot when gaming, like the older cards did, to the point that everyone in the owners' thread is undervolting them.
True. The 9070 and 9070XT do use more power than the 5070 and 5070 Ti under load, respectively. With that said, I undervolt all the GPUs and CPUs I get, regardless of brand.
 
True. The 9070 and 9070XT do use more power than the 5070 and 5070 Ti under load, respectively. With that said, I undervolt all the GPUs and CPUs I get, regardless of brand.
Yeah, it's just not ideal that we have to do things like that :(
It's mad that AMD still tend to run 15-20C hotter than their Nvidia rivals - having owned both brands all my life, I just choose whichever suits my needs that generation based on the feature set.
I just couldn't believe that a 4070 runs at 64-66C, if that, in all AAA games, yet an RX 6800 XT or 7900 XT/XTX was hitting 91-92C hotspot. That is a lot of heat by your feet or next to you on a desk!
Even the 4090 FE runs pretty cool despite being a monster. The 5090 gets quite warm, but with the amount of juice going through it, that's hardly surprising :P
 
Yeah, it's just not ideal that we have to do things like that :(
It's mad that AMD still tend to run 15-20C hotter than their Nvidia rivals - having owned both brands all my life, I just choose whichever suits my needs that generation based on the feature set.
I just couldn't believe that a 4070 runs at 64-66C, if that, in all AAA games, yet an RX 6800 XT or 7900 XT/XTX was hitting 91-92C hotspot. That is a lot of heat by your feet or next to you on a desk!
Even the 4090 FE runs pretty cool despite being a monster. The 5090 gets quite warm, but with the amount of juice going through it, that's hardly surprising :P
Nvidia removed the hotspot sensor from their latest GPUs, so I'm not sure we've got much to base the comparison on when one side no longer makes that data available.

Not sure I agree that AMD run 15-20C hotter than Nvidia; that feels like very much a finger-in-the-air figure!
 
Nvidia removed the hotspot sensor from their latest GPUs, so I'm not sure we've got much to base the comparison on when one side no longer makes that data available.

Not sure I agree that AMD run 15-20C hotter than Nvidia; that feels like very much a finger-in-the-air figure!
The hotspot on AMD cards usually runs 7-10C above the reported edge temp, so the Nvidia cards would still be 10C or more under :)

Well, having had both AMD and Nvidia... my RX 6800 XT, RX 6950 XT and 7900 XT/XTX ran at 90-92C peak hotspot, while the 4070 I had ran at 64C; give that another 10-12C and that's still only 76C at worst, hotspot-wise. A 4090 FE runs very cool and rather quietly.
I'm not the only person who's had these temps, so I'm not sure what there is to disagree on?

I enjoy overclocking, underclocking, undervolting etc., but we shouldn't have to do it... nor tolerate a firepit next to one's legs/hands/face, let alone a loud one!
 
4080 Super (Gigabyte Gaming OC)
1440p 144Hz
Average 7W idle (6.86W to be exact) at desktop; 7-8W with Spotify and Firefox running.
 
Yeah, everything closed, just sitting doing nothing on the desktop. Only Adrenalin open to watch the wattage.
Yep, managed to get the same now with nothing open as well. Sat at 7-9W on the desktop on my 1440p monitor; it seems to stick around 11-13W on my 4K one.
 
Surely max hotspot temp is not a useful metric unless you like to put your hands near the exhaust?

A room heated by a 1,000W radiator running at 80°C versus a room heated by 3,000W of underfloor heating running at 30°C: which room is going to be warmer? The one receiving 3,000W, regardless of surface temperature.

Total wattage and wattage per frame are meaningful metrics. Max hotspot temp - assuming it's well within spec for the silicon - is not a meaningful metric IMO.
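
For what it's worth, the per-frame figure is trivial to work out: watts are joules per second, so dividing board power by frame rate gives joules per frame (the numbers below are made up for illustration):

    # Energy per frame in joules: W = J/s, so divide by frames per second.
    def joules_per_frame(board_power_w: float, fps: float) -> float:
        return board_power_w / fps

    # Two hypothetical cards delivering the same 120 fps:
    print(f"{joules_per_frame(250, 120):.2f} J/frame")  # ~2.08
    print(f"{joules_per_frame(330, 120):.2f} J/frame")  # ~2.75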

Actually, while the 9070 XT is clocked near its max from the factory, the plain 9070 was the most efficient card at release. Similarly, in its day the plain 6800 was the most efficient card - although for some reason efficiency wasn't important when Nvidia cheaped out with Samsung 8nm and AMD held the perf/watt crown!
 