Is it true that a hot CPU uses more Juice than a cool CPU?

Greetings my Technological Brethren, :cool:

I've been meaning to ask this question for a while. I have read a few bits here and there on the web saying that a CPU draws more power when it is running hot.

Is this true?

Can anyone explain this to a non-engineer? :o

I know that no hardware is identical, but for argument's sake let's say you have two identical machines (same specs, same MHz etc), except one is well cooled and the other is running a little toasty... Why would the CPU in the hot machine need more juice?

I am just curious to find out whether running a hot CPU will mean higher electricity bills on machines that run 24/7.

Thanks in advance! :)
 
It will put out exactly the same amount of heat no matter how fast the heat is being moved away from the CPU. I see no reason why less heat being removed from the CPU would cause it to use more power.
 
Yes, within limits.

I had an AMD Mobile Barton CPU that needed 1.75v to do 2.5GHz with a 90mm heatpiped cooler. Someone I knew had nearly the same setup but with water cooling, and he ran the same speed at 1.65v.

I later got a 120mm heatpiped cooler and was able to drop the voltage to 1.65v.

All Prime95 stable.

The CPU is more efficient when cooler, again within limits. Why do you think people can overclock more after lapping CPUs and heatsinks, or run the same speed as before but at lower voltage?
 
It won't 'draw' more power in that sense, but it might require more vcore to remain stable, as the heat causes inefficiency in the transistors. However, more vcore means more heat, so it's a slippery slope :D
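
To illustrate that slippery slope, here's a minimal toy model of the feedback loop in Python. Every coefficient in it is invented purely for illustration, not measured from any real chip:

# Toy model: vcore -> power -> temperature -> required vcore, iterated.
# All coefficients below are made-up illustrative numbers.
def settle(vcore=1.30, ambient=25.0):
    temp = ambient
    for _ in range(50):
        power = 65.0 * (vcore / 1.30) ** 2         # W, scales with vcore^2
        temp = ambient + power * 0.5               # assume 0.5 C/W of cooling
        needed = 1.30 * (1 + 0.002 * (temp - 55))  # hotter -> needs more vcore
        if abs(needed - vcore) < 0.001:            # converged to a stable point
            return vcore, temp
        vcore = needed
    return vcore, temp

print(settle(ambient=25))  # good cooling: settles at a modest vcore
print(settle(ambient=45))  # hot case: settles noticeably higher

With better cooling the loop settles at a lower vcore; with worse cooling it settles higher, and with extreme enough numbers it wouldn't settle at all.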
 
Assuming the same voltage and the same speed, temperature shouldn't make any difference. It's a bit misleading to say that a cool chip consumes less electricity because you're feeding it less voltage, as that changes two variables at once, which makes it an unfair test (or theory, as the case may be).
 
Yes it does, and if you know anything about electronics you would know that (again, within limits).

What part of my example above does nobody understand? All I did was change the cooler, and I was able to run the same clock with less voltage because the CPU was cooler.

This in turn gave me more headroom to then overclock higher.

BTW, I'm speaking for myself, not about anyone else's posts here.
 
Thanks for the replies :)

I am of course aware that, regarding overclocking, a cooler chip is desirable, but I am thinking about this subject from another angle, namely lower running costs and being eco-friendly.

I have always thought of premium third-party cooling as a way of achieving big overclocks, but never as a way of running a chip at stock(ish) speeds on very little voltage.

i.e. people who don't want to overclock would still benefit from paying for premium cooling and 'undervolting' their now super-cool chips.

While overclocks of 4GHz to 5GHz and amazing benchmark results attract a lot of attention in our geeky world, it seems running costs and general eco-concerns get shoved way down the list of things people find interesting.

I have noticed more and more advertising is now being aimed at so-called GREEN features, and slowly I am starting to pay more attention to it.

I did really want to get a Q6600 G0 and overclock it to at least 3600MHz, but having seen this chart I changed my mind! :eek:

[chart: core2powerxq1.jpg – Core 2 power consumption]


My new subject of interest for 2008 is getting as much performance out of computers as possible while reducing my electricity costs. It seems my experience of overclocking and cooling could help me achieve my aims here. I don't expect that many people will be interested, but hopefully one or two folks will be?
 
Problem: the cost of third-party and more efficient cooling may outweigh whatever you save on electricity, possibly even costing more per annum. I'd undervolt with the stock cooler; most CPUs will run at stock speed at a drastically reduced vcore.
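
As a rough back-of-the-envelope check on that trade-off, here's a minimal sketch; every figure in it (cooler price, watts saved, tariff) is an invented example, so plug in your own numbers:

# Break-even estimate: aftermarket cooler cost vs electricity saved.
# All figures are invented examples - substitute your own.
cooler_cost = 30.0       # GBP paid for the aftermarket cooler
watts_saved = 15.0       # W less at the wall after undervolting
price_per_kwh = 0.10     # GBP per kWh (example tariff)

kwh_per_year = watts_saved / 1000 * 24 * 365     # ~131 kWh saved per year
savings_per_year = kwh_per_year * price_per_kwh  # ~13 GBP per year
print("Savings: %.2f GBP/year" % savings_per_year)
print("Break-even after %.1f years" % (cooler_cost / savings_per_year))

At those example figures the cooler takes a couple of years to pay for itself, which is the point being made above.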
 
Colder CPUs will actually use more power if you keep the voltage the same, since the reduced resistance will increase the current flow and P = I²R.

Of course you will be able to run the CPU on less voltage for a given clock speed, which should bring the power back down to less than it was before.
 
Problem: the cost of third-party and more efficient cooling may outweigh whatever you save on electricity, possibly even costing more per annum
Indeed, not sure about that but I will let you know; it would still be using less power though.

I'd undervolt with the stock cooler; most CPUs will run at stock speed at a drastically reduced vcore.
That's what I have been doing, with surprisingly good results. However, even when the stock cooler is set to reduced rpm (ASUS Q-Fan etc) it still produces some noise. I am thinking of running a premium cooler passively (a silent low-voltage system), then at the end of the year I can put all the money I saved towards... a pint of milk or something useful! :D
 

Take a CPU running at 1V with 100A going through it, generating 100W of heat (meaning our chip has an effective resistance of 0.01R) at one temperature. Now reduce the temperature, which will reduce the various circuit resistances in the CPU from 0.01R to, say, 0.0095R (a 5% drop, maybe a bit extreme for the sort of temperatures we're talking about).

I = V/R = 1/0.0095 ≈ 105.3A

P = IV, where V = IR, giving us:

P = I²R = 105.3 x 105.3 x 0.0095

so

P ≈ 105W

But you can run colder CPUs with less voltage, which will bring the overall power output back down to what it was before, and maybe less.
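
For anyone who wants to check that arithmetic, here's the same calculation re-run in Python (nothing new, just the numbers above):

# Worked example re-run: constant 1V, resistance 5% lower when cold.
v = 1.0                  # volts, held constant
r_warm = 0.01            # ohms, effective resistance when warm
r_cold = 0.0095          # ohms, 5% lower when cold

for label, r in (("warm", r_warm), ("cold", r_cold)):
    i = v / r            # Ohm's law: current rises as resistance falls
    p = i * i * r        # P = I^2 * R (equivalently V^2 / R)
    print("%s: I = %.1fA, P = %.1fW" % (label, i, p))
# warm: I = 100.0A, P = 100.0W
# cold: I = 105.3A, P = 105.3W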
 
Take a CPU running at 1V with 100A going through it, generating 100W of heat (meaning our chip has an effective resistance of 0.01R)
Thanks for the effort Jokester, it's obvious from your post that there is a huge difference in our levels of engineering knowledge! :o

I'm happy to take what you say as gospel as I have no way of understanding any of this! :D

I don't know the difference between a volt, an amp or indeed a watt, or what their relationship to each other is; also TDP, resistance etc.

I wonder, could you put that brain of yours to work for me and do some calculations on the following...

Take an E6300 chip running in two scenarios

#1 running at stock 1.86GHz, 1.28vCore actual, load temp of 45°C

#2 running at 1.86GHz, 1.06vCore actual, load temp of 32°C


How much less power (watts?) would example #2 use over a 24-hour period?

thanks for the loan of the grey matter! :)
 
The only practical way of doing it is to use a socket power meter and to test the chip with Prime or something, noting the wattage in each case, I'm afraid.
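
For a rough theoretical estimate in the meantime, a common rule of thumb is that dynamic power at a fixed clock scales with vcore squared. The sketch below leans on that rule, ignores leakage, and guesses a ~40W load draw at 1.28v (a guess, not a measurement), so treat the output as illustrative only:

# Rough E6300 estimate using the P ~ V^2 rule of thumb at fixed clock.
# The 40W baseline is an assumed figure, not a measured one.
p1 = 40.0                             # W at 1.28v (scenario #1, assumed)
v1, v2 = 1.28, 1.06                   # actual vcore in each scenario
p2 = p1 * (v2 / v1) ** 2              # ~27.4W in scenario #2
saved_w = p1 - p2                     # ~12.6W less at the same clock
saved_kwh_day = saved_w * 24 / 1000   # ~0.30 kWh saved per 24 hours
print("Scenario #2: %.1fW, saving %.1fW = %.2f kWh/day"
      % (p2, saved_w, saved_kwh_day))

On a typical tariff that's only a few pence a day, which is exactly why a socket power meter is the way to get real numbers.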
 
Lower temps in theory reduce power, unless the power usage doesn't depend on temperature, in which case it will remain the same (so you have to configure it, i.e. lower the voltage yourself).

Lower temps = less resistance, but 5-10 degrees makes little difference. If you wanted to go insane I'd say go to around -220°C or so.
 