Greetings, my Technological Brethren,
I've been meaning to ask this question for a while. I've read a few bits here and there on the web saying that a CPU draws more power when it is running hot.
Is this true?
Can anyone explain this to a non-engineer?
I know that no two pieces of hardware are identical, but for argument's sake let's say you have two identical machines (same specs, same MHz, etc.), except one is well cooled and the other is running a little toasty... Why would the CPU in the hot machine need more juice?
I'm just curious to find out whether running a hot CPU means higher electricity bills for machines that run 24/7.
Thanks in advance!
