
GPU Power Consumption Question

OK, another daft question.
Let's say I buy a graphics card that is all new and shiny and the latest 'must have'.
Some reviews say it's power hungry and, for simplicity, that it consumes around 300 watts (I'm guessing per hour?).
Does this mean it constantly draws that amount of power, for example if I am streaming or working as well as gaming?

or

Does that mean it would only draw that amount of power when it's pushed to its limits?
 
Power usage will vary depending on what you are doing. I've got a 1200W Asus Thor PSU which actually has a display showing the draw, and just web browsing it's drawing 104W; obviously this goes up if I'm gaming or doing something else intensive.
 
Does that mean it would only draw that amount of power when it's pushed to its limits?

This. Modern GPUs will use approx 5-15W at idle, depending on how many screens, and what resolution and refresh rate, you're pushing. The rest of your machine will vary too, depending on what it is and what you're doing.

Also, watts isn't a unit of measure in the way you think it is: a watt is a unit of power and has no link to time. You're thinking of watt-hours (Wh), which relate nicely to the kWh you are billed by.
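Since the watts vs watt-hours distinction trips people up, here's a quick back-of-the-envelope sketch in Python. All the wattages, hours and the unit price are made-up illustrative figures, not measurements from any real card or tariff:

```python
# Energy (kWh) = power (W) x time (h) / 1000; cost = energy x unit price.
# All figures below are illustrative examples, not measurements.

def kwh(power_watts: float, hours: float) -> float:
    """Convert a steady power draw over a duration into kilowatt-hours."""
    return power_watts * hours / 1000

UNIT_PRICE = 0.30  # assumed electricity price per kWh (GBP), purely an example

idle_cost = kwh(15, 8) * UNIT_PRICE     # 8 hours idling on the desktop at ~15 W
gaming_cost = kwh(300, 2) * UNIT_PRICE  # 2 hours gaming at ~300 W

print(f"Idle:   {kwh(15, 8):.2f} kWh -> £{idle_cost:.2f}")
print(f"Gaming: {kwh(300, 2):.2f} kWh -> £{gaming_cost:.2f}")
```

The point being that two hours of gaming at full load costs several times more than a whole working day of the card sitting near idle.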
 
So the GPU won't constantly be clocking up £££ on my electricity bill if I'm not intensely gaming?
There are a few things that slightly increase idle power, like multiple monitors, playing videos and some software with 3D elements like graphs and maps, but as said above they're usually around 5-15 watts on the desktop, because they underclock the GPU and memory just like a CPU does. Have a look at the TechPowerUp reviews of any recent graphics card to get an idea of what your card will use. Generally, the more high-end it is, the higher the idle power, but it's not a fixed rule. VBIOS and drivers can slightly influence this too (up to a few watts either way), and later revisions of graphics cards tend to be more efficient than older ones.
 
This. Modern GPUs will use approx 5-15W at idle, depending on how many screens, and what resolution and refresh rate, you're pushing. The rest of your machine will vary too, depending on what it is and what you're doing.

Also, watts isn't a unit of measure in the way you think it is: a watt is a unit of power and has no link to time. You're thinking of watt-hours (Wh), which relate nicely to the kWh you are billed by.

Yep, that's what I was thinking.
 
In my experience the advertised power usage of a GPU is for when it is playing a modern, demanding game at maximum frame rates. My Vega 56 uses about 5W while on this forum, and I use Radeon Chill to reduce power usage in games. With the games I usually play it uses less than 80 watts rather than the advertised 200+ watts, hopefully extending its life.
 
Well, I just discovered that my ancient Radeon R9 290X uses between 275 and 315 watts at full load.
Kinda puts things into perspective. This card is 8 years old; I guess efficiency has improved over the years, and now with more powerful cards power consumption is naturally increasing again.

I did chat with a first-line Nvidia techie who believed the card consumes 290 watts all the time, even at idle or low loads.
 
Well, I just discovered that my ancient Radeon R9 290X uses between 275 and 315 watts at full load.
Kinda puts things into perspective. This card is 8 years old; I guess efficiency has improved over the years, and now with more powerful cards power consumption is naturally increasing again.

I did chat with a first-line Nvidia techie who believed the card consumes 290 watts all the time, even at idle or low loads.
While efficiency has improved, top-end graphics cards still use 300W+, because the GPU die can fit more on it and hence needs more power.
 
Well, I just discovered that my ancient Radeon R9 290X uses between 275 and 315 watts at full load.
Kinda puts things into perspective. This card is 8 years old; I guess efficiency has improved over the years, and now with more powerful cards power consumption is naturally increasing again.

I did chat with a first-line Nvidia techie who believed the card consumes 290 watts all the time, even at idle or low loads.
But it heats the room well!
 
So I put this question to Nvidia's tech support:
How much power does the 3070 Ti draw when it's not being used for graphics-intensive work, such as running apps like Office or surfing the internet?
1st answer: It needs 290 watts to run. (That's it.)

I asked again.
They gave me links to two non-existent pages and said to look at third-party data.
They also advised that you can't run their card alongside AMD's... huh!?! What the?

Either they are clueless and don't know their own products, or they don't want to tell me.

So I broke it down into very simple words and asked them once more what consumption I could expect while doing basic stuff.

We will see what gibberish they reply with this time.
 
So I put this question to Nvidia's tech support:
How much power does the 3070 Ti draw when it's not being used for graphics-intensive work, such as running apps like Office or surfing the internet?
1st answer: It needs 290 watts to run. (That's it.)

I asked again.
They gave me links to two non-existent pages and said to look at third-party data.
They also advised that you can't run their card alongside AMD's... huh!?! What the?

Either they are clueless and don't know their own products, or they don't want to tell me.

So I broke it down into very simple words and asked them once more what consumption I could expect while doing basic stuff.

We will see what gibberish they reply with this time.


Why do you even care?

And to answer your question, it will be under 50W.
 
So I put this question to Nvidia's tech support:
How much power does the 3070 Ti draw when it's not being used for graphics-intensive work, such as running apps like Office or surfing the internet?
1st answer: It needs 290 watts to run. (That's it.)

I asked again.
They gave me links to two non-existent pages and said to look at third-party data.
They also advised that you can't run their card alongside AMD's... huh!?! What the?

Either they are clueless and don't know their own products, or they don't want to tell me.

So I broke it down into very simple words and asked them once more what consumption I could expect while doing basic stuff.

We will see what gibberish they reply with this time.
You do need to look at third-party data, like the TPU reviews I suggested earlier, because it varies depending on the silicon, the model, the manufacturer, the monitor(s) and even the drivers. It's like asking Intel what power a motherboard uses: any ballpark figure they give you is vague enough that you can't really make decisions from it. There can be a relatively big difference, even at the low end. If you assume it'll be somewhere around 10-20 watts for a high-end card, that's about right. You'll usually find that server (or HTPC) forums give you the best comparisons, because they optimise for idle power, and a few models and manufacturers consistently lead.
 
OK, another daft question.
Let's say I buy a graphics card that is all new and shiny and the latest 'must have'.
Some reviews say it's power hungry and, for simplicity, that it consumes around 300 watts (I'm guessing per hour?).
Does this mean it constantly draws that amount of power, for example if I am streaming or working as well as gaming?

or

Does that mean it would only draw that amount of power when it's pushed to its limits?

Power usage for GPUs is always stated as peak draw: the maximum power the GPU can draw, assuming no modifications.

Watts is a unit of power and isn't related to time; it's just the draw at a specific instant.

Graphics cards draw radically different amounts of power depending on what they're doing. Most of the time they're idle, doing nothing other than rendering your 2D desktop, and draw a trivial amount of power. If you watch things like compressed videos, most software uses the GPU to decode them, so you get very mild usage and mild power draw. But when you do something that requires 3D acceleration, such as playing a game, they draw a lot more power, close to or at peak.

Typically you get two measures of power draw for a GPU. One is the raw power it draws from the PSU, which GPU reviewers will sometimes give. This isn't always very helpful on its own, because to use it to decide what PSU you need, you also need to know the power draw of the rest of the system, and that gets complicated. More often, and more usefully, the GPU manufacturer will give a minimum recommended PSU wattage for a specific GPU. This is the PSU wattage you need to run the entire PC with their video card installed, assuming a fairly typical PC with typical components.

So, for example, if you visit MSI's website and look at the RTX 3080 specs, they recommend a 750W PSU (https://us.msi.com/Graphics-Card/GeForce-RTX-3080-GAMING-X-TRIO-10G/Specification), and this is a good enough measure for 99% of people. You'd need a really specialised system with a lot of components inside it to exceed the recommended PSU values. If you're still quite new to PC building, it's best to use these recommended values.
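If you do want to sanity-check a recommendation yourself, the idea is just summing rough peak figures for each component and comparing against the PSU rating. Here's a minimal sketch; every component wattage below is a rough placeholder figure, not taken from any datasheet:

```python
# Rough peak-draw estimate vs PSU rating -- all wattages are placeholder guesses.
components = {
    "GPU (peak)": 320,
    "CPU (peak)": 150,
    "motherboard/RAM": 50,
    "drives/fans": 30,
}

total = sum(components.values())  # estimated peak system draw in watts
psu_rating = 750                  # the recommended PSU wattage being checked

headroom = psu_rating - total
print(f"Estimated peak draw: {total} W, PSU: {psu_rating} W, headroom: {headroom} W")
```

With these example numbers the 750W recommendation leaves comfortable headroom, which is exactly why the manufacturer's figure is fine for a typical build.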
 