Is your graphics card inefficient or faulty? GPU idle power draw comparison

Surely max hotspot temp is not a useful metric, unless you like to put your hands near the exhaust?

A room heated by a 1,000W radiator running at 80°C versus a room heated by 3,000W of underfloor heating running at 30°C. Which room is going to be warmer?

Total wattage and wattage per frame are meaningful metrics. Max hotspot temp - assuming well within spec of the silicon - is not a meaningful metric IMO.
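A rough sketch of the point above: the heat a PC dumps into the room equals its power draw, regardless of how hot any single component runs. The card specs below are made up for illustration, not real review numbers.

```python
def watts_per_frame(power_w: float, fps: float) -> float:
    """Energy per frame in joules (W = J/s, so W / (frames/s) = J/frame)."""
    return power_w / fps

# Hypothetical cards: one runs a hot small die, one a cooler larger die.
card_a = {"power_w": 220.0, "fps": 100.0, "hotspot_c": 95.0}
card_b = {"power_w": 300.0, "fps": 110.0, "hotspot_c": 75.0}

for name, card in [("A", card_a), ("B", card_b)]:
    jpf = watts_per_frame(card["power_w"], card["fps"])
    print(f"Card {name}: {card['power_w']:.0f}W into the room, "
          f"{jpf:.2f} J/frame, hotspot {card['hotspot_c']:.0f}°C")

# Card B heats the room more despite the lower hotspot temperature -
# the same reason 3,000W of 30°C underfloor heating beats an 80°C radiator.
```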

Actually while the 9070 XT is clocked near to the max from the factory, the plain 9070 was the most efficient card at release. Similarly, in its day the plain 6800 was the most efficient card - although for some reason efficiency was not important when Nvidia cheapened out with Samsung 8nm and AMD had the perf/watt crown!
Put it this way: when I've had the PC next to my leg or on the desk, the AMD build was the hotter one versus the Nvidia 4XXX cards I've owned, and the area of the room where I was using the PC was a lot hotter :(
So real-world use is what matters, and unfortunately the AMD was hotter in real-world use. Undervolting it never seemed to make any difference either: it was very stable, but would just boost higher, so I achieved nothing in terms of quieter fans, lower temps or power savings. That's great for some people, but not what I was trying to achieve.

In the AMD 9070/XT owners' thread, people are undervolting them to bring down the temps/power draw. So I'm not alone, as I wasn't with the RX 6800 XT or the 7900 XT/XTX series.
I can put up with it, but there's no denying that, especially in summer, I'd rather have a cooler-running PC and room. Ideally it would also run quieter, so I wouldn't need headphones to enjoy using it - especially on a 3-fan card :(

I'll continue to own both brands; it'll just depend which one is better for me at the time of release. I'm not loyal to either. I was always more of an AMD guy, but I do like the modern featureset of the last few generations of Nvidia, the speed at which those features come out and continue to be updated, and the wide range of games supported. That can change with any new release/generation, whoever makes it :) it is what it is.

Personally though, I'd love to see a 9950 XTX released, and buy that next :D
 
Well, to lower the heat what you need to do is lower the power consumption. On AMD cards this is done through power limiting: you lower the power target and hence reduce power consumption. This means less heat, lower temps, quieter fans and less heating of your room, in exchange for lower clocks and hence some performance loss.

Undervolting is often used in conjunction with this, usually to offset the performance loss caused by the power limit decrease. Without a power limit decrease, an undervolt will just result in a higher clock, which increases performance beyond stock levels but won't decrease heat output, as the power consumption hasn't changed - the card has simply clocked higher with the new headroom the undervolt provided. The two combined will let you keep stock levels of performance while using less power, and so producing less heat.

If you'd like to take it even further, you can power limit even more and accept some performance loss in exchange for a much cooler card. Most AMD cards will let you slice off about a third of their overall power consumption. A 300W card can become a 200W card this way, saving you the heat output of about 10 modern LED light bulbs, or one old-fashioned incandescent bulb.
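The arithmetic behind that last paragraph, as a quick sketch. The stock wattage and the "about a third" cut are the illustrative figures from the post; the bulb wattages are typical rough values.

```python
def power_saved(stock_w: float, limit_pct: float) -> float:
    """Watts saved when lowering the power target by limit_pct percent."""
    return stock_w * limit_pct / 100.0

stock = 300.0          # stock board power, W
cut = 33.0             # roughly a third off, a typical max on AMD cards
saved = power_saved(stock, cut)
print(f"{stock:.0f}W card -> {stock - saved:.0f}W, saving {saved:.0f}W")

led_bulb = 10.0        # a modern LED bulb is roughly 10W
incandescent = 100.0   # an old-fashioned 100W incandescent
print(f"That's ~{saved / led_bulb:.0f} LED bulbs, "
      f"or about {saved / incandescent:.0f} incandescent bulb of heat")
```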
 
Unfortunately, that isn't the most ideal way to do it. By doing that you can introduce stutters, and you're bottlenecking the card's performance by a fair bit in real-world usage. I've personally never had a good experience doing it that way; it's the 'quick and dirty' approach.
Don't get me wrong, you can do it that way, but it often causes issues.

With undervolting, I prefer to get the card to run either as close to stock clocks as possible with less power, or ever so slightly underclocked with less power, determining at each point what voltage it can take. That has always worked best in Adrenalin.
I do, however, much prefer to use MSI Afterburner where possible with GPUs.
 
True. The 9070 and 9070XT do use more power than the 5070 and 5070 Ti under load, respectively. With that said, I undervolt all the GPUs and CPUs I get, regardless of brand.

The RX9070 doesn't use more power than an RTX5070:

Under normal gaming load there is a 4W difference, but under RT, FurMark, etc. the RTX5070 uses more power. TPU actually measures card power draw.

But they still run hot when gaming, like the older cards, to the point that everyone in the owners' thread is undervolting them.

My RX9070 Pulse runs fine in a 12L NCase M2. I would be more worried about how hot an RTX5070 runs because it is drawing as much power or even more than an RX9070 over a much smaller die size.

The RX9070 is one of the most efficient gaming cards of this generation.
 
Hmm, this is interesting. Don't know if I watched a review with different numbers or simply misremembered. That 9070 is really sipping - both at idle and under load.
 
Nice, that's decent.
Yeah that's a fair point, if both generations are getting as hot as the RX 6800XT/6900XT/6950XT/7900XT/7900XTX, then I'll skip upgrading this generation.

Having had various models of both brands in the previous two generations, the Nvidia builds were physically cooler. Sat next to the AMD PC, my foot/leg got rather hot, and having it on my desk was even worse; the area/room was noticeably hotter too, uncomfortably so in the summer heatwaves we had.
Pair that with the fans running at full whack and having to wear headphones to drown them out...
That made enjoying gaming at a desk pretty much a no-go.
 
6900XT
3840x2160 144Hz
Single Display
[attached screenshot: GPU-Idle-Wattage.jpg]


Lowest I see is 9W. But I get these very odd peaks and troughs in the voltage. Nothing is open, and when viewing GPU usage in Task Manager it's just Adrenalin and Desktop Window Manager using a very small amount. Yet it will spike up to ~835mV and drop back to 0mV as shown. When it peaks, it goes to 30W; when it drops to zero, it goes to 9W.
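One way to get a meaningful idle figure out of a spiky trace like that is to log the sensor for a while and summarise it, rather than eyeballing the lowest value. A small sketch, assuming a made-up trace of mostly 9W readings with occasional 30W spikes (in practice you'd feed in a logged CSV of board power samples):

```python
def summarize(samples_w: list[float]) -> dict:
    """Min/max/mean of a list of power readings in watts."""
    return {
        "min": min(samples_w),
        "max": max(samples_w),
        "mean": sum(samples_w) / len(samples_w),
    }

# Hypothetical trace: fifty 9W samples with five 30W spikes, as described above.
trace = [9.0] * 50 + [30.0] * 5
stats = summarize(trace)
print(f"min {stats['min']:.0f}W, max {stats['max']:.0f}W, mean {stats['mean']:.1f}W")
```

The mean is the number that actually matters for heat and electricity use; brief spikes barely move it.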
 
TPU actually measures card power draw. If you look at most reviews which measure card power draw, the RX9070 on average matches or exceeds the RTX5070 in overall efficiency.


My RX9070 Pulse runs fine - it's a 220W dGPU at most and is around RTX3070 levels of power draw.

The RTX5070 draws similar to slightly more power than an RX9070, but the distribution is different:
1.) The RX9070 uses a cut-down 357mm² die and the RTX5070 a 263mm² die.
2.) The RX9070 uses less efficient GDDR6 over a 256-bit bus, and the RTX5070 more efficient GDDR7 over a smaller 192-bit bus, i.e., fewer RAM chips.
3.) More of the power dissipated on the RTX5070 goes through the GPU core, but over a much smaller area, so the core will run warmer.
4.) More of the power dissipated by the RX9070 will be taken up by the 256-bit GDDR6 memory subsystem. However, GDDR6 seems to be able to tolerate up to 105°C, while GDDR7 only goes up to 95°C.

If you look at the data TPU has compiled for its RTX5070 and RX9070 cards:
1.) The RTX5070 cards run their GPU cores hotter than the RX9070 cards. The RX9070 cards they tested generate about 22 to 25 dBA of noise - the RTX5070 cards range from 22 to 38 dBA!
2.) The RX9070 cards run their VRAM warmer, although that could be because the RX9070 coolers run very quiet. Apparently raising the fan speeds slightly helps in this regard.
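The core-runs-warmer point above is really about power density. The die areas are the ones from the post; the core-power figures below are rough assumptions purely for illustration, since neither post gives an exact core/memory power split.

```python
def power_density(core_w: float, die_mm2: float) -> float:
    """Core power per unit die area, in W/mm²."""
    return core_w / die_mm2

# Die areas from the post; assumed core-power figures for illustration only.
cards = {
    "RX 9070 (357 mm², cut-down die)": (180.0, 357.0),
    "RTX 5070 (263 mm² die)":          (200.0, 263.0),
}

for name, (core_w, die_mm2) in cards.items():
    print(f"{name}: {power_density(core_w, die_mm2):.2f} W/mm²")

# Similar total power over a ~26% smaller die gives the RTX 5070 a
# noticeably higher W/mm², hence the warmer core under the same cooler.
```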
 
I can't, and won't, fault anything that you've said mate :)

I like that you've bothered to make a well-structured and informative post - it not only helps teach people who want to learn, but comes across as neutral, without being cocky/smug or rude. I really respect this way of educating people, so thanks for that :)

In this scenario, I'd agree the RX9070 wins overall then :) But when it comes to the bigger brother, the RX 9070 XT, I think regardless of the price difference I'd go for the 5070 Ti.
The featureset, the off-the-shelf supported games that can use those features, the constantly updated support for them, and Nvidia currently pushing more frames with that tech (fake frames or not, it's true for now) - I think it wins.

I would, however, as I said earlier, love to see an RX 9950XT/XTX take the crown ;) I will always bat for AMD, especially CPU-wise, and I'm always pleased to see them do well with GPUs. I just want them to win big time, rather than just being the 'bang per buck' option - I want AMD to make a '4090 of its time' type revolution :) That would be awesome, would it not? :D
 