What effect does lowering voltages have on wattages?

I'm looking at it from the perspective of your average enthusiast and trying to find a middle ground between those who only have their PC on when they are likely to use it and those who leave it on 24x7.

Strictly my opinion, but in my experience idle usage shows very little difference in power consumption from decreased voltage. Taking into account that a lot of PSUs aren't hideously efficient, you're probably drawing very close to the same current from the wall, to the point that it makes no difference. Average desktop usage over several hours doesn't really make enough use of the CPU to move the baseline significantly above idle.

It's rare, even with compiling, video encoding, gaming, etc., for the CPU to be 100% in use, so I'd say even at load you're looking at more like 70% usage over a few hours.

So assuming 8 hours of desktop use, and being generous and saying you've reduced power draw from the wall by 0.5W, plus 2-3 hours where you've reduced it by 10W, at 12p per kWh that's less than 0.5 pence saved a day. If we were overly generous and assumed you were saving a massive 10W at idle and 20-30W off the load figure, you're still only talking in the region of 2p a day, or 60p a month.
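Just to make that arithmetic explicit, here's a rough Python sketch of the estimate; the hours, watt reductions and 12p/kWh tariff are the illustrative assumptions above, not measurements:

```python
# Rough sketch of the back-of-the-envelope saving above.
tariff_per_kwh = 0.12                       # £0.12 = 12p per kWh (assumed)

idle_hours, idle_saving_w = 8, 0.5          # 8h of desktop use, ~0.5W saved
load_hours, load_saving_w = 3, 10           # ~3h of heavier use, ~10W saved

kwh_saved = (idle_hours * idle_saving_w + load_hours * load_saving_w) / 1000
daily_pence = kwh_saved * tariff_per_kwh * 100

print(f"Energy saved per day: {kwh_saved * 1000:.0f} Wh ({kwh_saved:.3f} kWh)")
print(f"Saving per day:   {daily_pence:.2f}p")        # ~0.41p
print(f"Saving per month: {daily_pence * 30:.1f}p")   # ~12p
```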

Based on this, and fudging it a bit, you'd need 14 million average enthusiast computer users to reduce their power usage in this manner to decommission one nuclear power station, or 3 million if we went for really optimistic high-end savings.
 
Assume power consumption is the same as TDP, so 95W, then assume 1.15V as stock voltage or something.

That means it would pull around 83A at full pelt. Then if you set the voltage to 0.9V it would pull around 75 watts. Some would call this significant, others wouldn't care at all.

Your current won't remain constant; you're better off using P = V^2 / R

i.e. your power will change in line with the square of your voltage - halve the voltage and you'll quarter the power.
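To put numbers on the difference between the two estimates, here's a minimal sketch using the assumed 95W TDP at 1.15V stock from above, comparing the constant-current figure with the square-law scaling:

```python
# Contrast of the two estimates, using the assumed 95W TDP at 1.15V stock.
p_stock, v_stock, v_new = 95.0, 1.15, 0.90

# Constant-current estimate (the earlier calculation): I = P/V stays fixed.
i_stock = p_stock / v_stock                       # ~82.6 A
p_const_current = i_stock * v_new                 # ~74.3 W

# Square-law estimate: dynamic power scales with V^2 at a fixed clock speed.
p_square_law = p_stock * (v_new / v_stock) ** 2   # ~58.2 W

print(f"Stock current:             {i_stock:.1f} A")
print(f"Constant-current estimate: {p_const_current:.1f} W at {v_new}V")
print(f"V-squared estimate:        {p_square_law:.1f} W at {v_new}V")
```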
 
So if we're going to make this a progressive discussion: if people are interested in reducing their power use, carbon footprint, etc., they're better off looking at the hardware they're purchasing, i.e. buying a CPU that has lower power consumption at idle from the off, or buying a CPU that's more suited to their actual needs - if you only ever play low-spec games and do a bit of internet browsing, do you really need an i7?
 
Hi Duff-Man,
Are you implying that anyone present can't do basic arithmetic? ;)

No, quite the opposite - I'm just trying to point out that, once you know the basic physics (energy = power*time), then there is no heavy math involved whatsoever :)

Just trying to persuade people that they can do this too. People are often put off by anything involving a formula, and assume that it's beyond them, when it really isn't :)

Sorry if I came off as condescending - that wasn't my intention.
 
So if we're going to make this a progressive discussion: if people are interested in reducing their power use, carbon footprint, etc., they're better off looking at the hardware they're purchasing, i.e. buying a CPU that has lower power consumption at idle from the off, or buying a CPU that's more suited to their actual needs - if you only ever play low-spec games and do a bit of internet browsing, do you really need an i7?

It's an optimisation problem. You want to minimise your power consumption, subject to maintaining a level of performance you find acceptable. To do this, you want to be looking at "performance per watt" comparisons for various pieces of hardware.

To take an example from GPUs: Nvidia may have the best-performing card overall (GTX480), and they may even have the best "performance per £" card (the GTX460), but they sure as hell don't have the best "performance per watt" card, and no amount of under-volting a GTX480 is going to give you the same performance-per-watt as you would get from a 5870 (say).

In short: The hardware is the most important factor. Manually altering the voltage characteristics is secondary.
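If you want to run that comparison yourself, a minimal sketch is below. The card names, scores, prices and power draws are made-up placeholders, not benchmark results - substitute real figures from reviews of the cards you're actually considering:

```python
# Illustrative performance-per-pound and performance-per-watt comparison.
# All figures below are hypothetical placeholders, not real benchmark data.
cards = {
    # name: (relative performance score, price in £, typical power draw in W)
    "card_A": (100, 420, 300),
    "card_B": (70, 180, 160),
    "card_C": (95, 320, 190),
}

for name, (perf, price, watts) in cards.items():
    print(f"{name}: perf/£ = {perf / price:.2f}, perf/W = {perf / watts:.2f}")
```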
 
I don't think it's quite that simple though...

When you throw the PSU, motherboard power supply/regulation circuitry, CPU leakage, etc. into the equation, even though you might have reduced the CPU voltage, the other stages may not be significantly reducing the power they draw to supply it to the CPU.
 
I don't think it's quite that simple though...

When you throw the PSU, motherboard power supply/regulation circuitry, CPU leakage, etc. into the equation, even though you might have reduced the CPU voltage, the other stages may not be significantly reducing the power they draw to supply it to the CPU.

To what, exactly, are you referring? :confused:

Clearly, reducing the voltage to one particular component will only affect the power draw of that component, and not of any others that have independent voltage characteristics. Although there are secondary interaction effects (for example, a CPU that operates at a lower clock speed makes fewer calls to the memory, and so slightly reduces the current draw of the memory, etc.).

Nothing is ever "that simple" when it comes to complex interacting systems, but by considering the most significant components, and analysing how they react to changes in the most important factors (in this case voltage / clockspeed / workload etc), you can make fairly accurate predictions about overall power requirement changes, within a small window of change.

Of course, when you start to move outside the expected operating range of a given architecture, things stop behaving in the same way as you might otherwise expect. This will always be the case with complex electronics.
 
...So assuming 8 hours of desktop use, and being generous and saying you've reduced power draw from the wall by 0.5W, plus 2-3 hours where you've reduced it by 10W

...

Based on this, and fudging it a bit, you'd need 14 million average enthusiast computer users to reduce their power usage in this manner to decommission one nuclear power station.

Your estimate of the number of computers needed is a little off, though the broad point you are making is still very valid. Just for the record:

8hrs at 0.5W reduction + 3hrs at 10W reduction => an average of 1.42W reduction throughout the day (remember we're dealing with daily figures here).

So, for a 1000MW nuclear power plant, you would need 704 million PCs to be reduced in the manner you describe before you could decommission a nuclear plant.
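For anyone who wants to check that sum, here's a quick sketch using the same assumed figures:

```python
# Average the daily saving over 24 hours, then compare with plant output.
daily_wh_saved = 8 * 0.5 + 3 * 10          # 34 Wh saved per PC per day
avg_watts_saved = daily_wh_saved / 24      # ~1.42 W continuous

plant_w = 1000e6                           # 1000MW nuclear plant
pcs_needed = plant_w / avg_watts_saved

print(f"Average continuous saving: {avg_watts_saved:.2f} W per PC")
print(f"PCs needed to match a 1000MW plant: {pcs_needed / 1e6:.0f} million")
# ~706 million exactly; ~704 million if you round the saving to 1.42W first.
# A 500MW plant would need half that, roughly 352-353 million.
```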
 
Well, what I'm trying to say is that some parts may still be drawing the same, or very similar, input current due to the way they operate, even when supplying a lower output voltage. So you might drop the vcore on the CPU from 1.2V to 0.95V, but the supply stage on the board is still drawing the same, or very close to the same, input even though it's not supplying the same output, due to the way it works.
 
Well, what I'm trying to say is that some parts may still be drawing the same, or very similar, input current due to the way they operate, even when supplying a lower output voltage. So you might drop the vcore on the CPU from 1.2V to 0.95V, but the supply stage on the board is still drawing the same, or very close to the same, input even though it's not supplying the same output, due to the way it works.

Of course.

Like I said, you can only consider things component-by-component, and then understand the overall system as the sum of constituent components.
 
Your estimate of the number of computers needed is a little off, though the broad point you are making is still very valid. Just for the record:

8hrs at 0.5W reduction + 3hrs at 10W reduction => an average of 1.42W reduction throughout the day (remember we're dealing with daily figures here).

So, for a 1000MW nuclear power plant, you would need 704 million PCs to be reduced in the manner you describe before you could decommission a nuclear plant.

I didn't put my figures across very well - I meant 8 x 0.5W and 3 x 10W, so 34Wh through the day - and I based it on the low end, a 500MW power plant, heh. Maths isn't really my strong point :P
 
I didn't put my figures across very well - I meant 8 x 0.5W and 3 x 10W, so 34Wh through the day - and I based it on the low end, a 500MW power plant, heh.

Yeah... 34Wh of energy per day is an average power of (34/24) ≈ 1.42W.

So, for a 500MW power plant you would require 352 million PCs.

You were just mixing up hours and days as your units, that's all.
 
Which all leads back to the point, IMO, that in terms of saving money or the planet, the effects of under-volting are negligible - unless you're running your PC with high CPU usage 24x7. If it concerns you, you're better off looking at performance/watt next time you're purchasing.
 
Here are some quick calculations for someone who is involved in distributed computing or other heavy-load tasks that keep a PC running pretty much flat-out . . .

  • 24 hours a day
  • 6 days a week
  • 48 weeks a year
Plenty of "slack" in the above figures . . .

  • 144 Hours/Week
  • 576 Hours/Month
  • 6912 Hours/Year

  • £0.14 per kWh (100% renewable energy rate)

  • 300 watts
    • £24.19 Month
      • £290.28 Annual
  • 250 watts
    • £20.16 Month
      • £241.92 Annual
  • 200 watts
    • £16.12 Month
      • £193.54 Annual

So, based on the above heavy usage patterns (fed by 100% renewable energy), choosing the right power-efficient hardware and "tweaking" the system with a good, balanced "sweet spot" clock to reduce the draw from 300 watts to 200 watts, you've cut the annual consumption by 691.2 kWh and saved enough money to get the champagne in for Xmas! . . . YMMV, but I think anyone who tweaked their set-up could get a bottle at least! :cool:
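For anyone who wants to rerun the figures with their own wattage and tariff, here's a minimal sketch of the calculation above:

```python
# Running cost at the usage pattern above: 24h/day, 6 days/week, 48 weeks/year.
HOURS_PER_MONTH = 24 * 6 * 4        # 576 hours
HOURS_PER_YEAR = 24 * 6 * 48        # 6912 hours
TARIFF = 0.14                       # £ per kWh (100% renewable rate)

for watts in (300, 250, 200):
    monthly = watts / 1000 * HOURS_PER_MONTH * TARIFF
    annual = watts / 1000 * HOURS_PER_YEAR * TARIFF
    print(f"{watts}W: £{monthly:.2f}/month, £{annual:.2f}/year")

saving_kwh = (300 - 200) / 1000 * HOURS_PER_YEAR      # 691.2 kWh
print(f"300W -> 200W saves {saving_kwh:.1f} kWh, £{saving_kwh * TARIFF:.2f} a year")
```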

 
Which all leads back to the point, IMO, that in terms of saving money or the planet, the effects of under-volting are negligible - unless you're running your PC with high CPU usage 24x7. If it concerns you, you're better off looking at performance/watt next time you're purchasing.

Agreed :)

As I said, your main point is still valid - even more so, in fact, given the bigger number of computers.



Big Wayne - good way of looking at it in principle, certainly. But to reduce overall system power by 33% you're looking at (say) a 50% reduction in CPU power. It's hard to see where this could come from without sacrificing any performance...

But yes. With several servers on the go 24/7 (or thereabouts), I can see where you would pull a couple of hundred £ over the year. And what better way to spend this saved money than on booze for the office Christmas do?
 
We could always add some PF correction and bring the lagging load up to, say, 0.96 or greater, which would greatly save on the cost of running.
 
I was referring more to the incoming supply before it reaches the PSU :D (for those trying to save money on their electricity bills)

On a side note, Jokester, have you ever been able to get earth leakage figures from a PSU manufacturer?
 