Surely power consumed = heat?

A few reviews seem to make a fuss about heat, and some people, as soon as you mention Fermi, go 'ooh, hot, loads of power, noisy'.

Now unless I am missing something, power = heat. Let's ignore cooling initially.
Just like a processor, more MHz requires more power, which means more heat.

Unless I am missing something, there is pretty much a 100% correlation between power consumption and heat generated? Every electrical device complies with the same laws of physics...
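
As a rough sanity check (the 250 W figure is purely illustrative): the only energy leaving a graphics card that isn't heat is the display signal and a little fan airflow, both negligible, so to a very good approximation

P_heat ≈ P_electrical

i.e. a card drawing roughly 250 W dumps essentially the whole 250 W into its cooler and the case air.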

Now let's look at cooling. Assuming there is a 100% correlation between power consumption and heat generated, then for one card to run a lot hotter than another (similar manufacturing assumed, say between Fermi and the AMD 6xxx series), the difference must come down to the cooling relative to the power being consumed. Pass more air and you get lower temperatures because you are moving more of the heat away, or monitor the heat in a different spot and you get a different, lower reading.

So the actual heat created, expressed as heat energy per watt of electricity consumed, would be the same for every card? Unless I am missing something fundamental? So the temperature of the chips themselves doesn't really mean a lot, other than affecting overclocking potential etc.
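
To put a rough illustration on that (all numbers made up, with R_th standing for the effective chip-to-air thermal resistance of the cooler), the temperature the chip settles at is roughly

T_chip ≈ T_ambient + P × R_th

so the same 250 W of heat gives about 25 + 250 × 0.2 = 75 °C with a strong cooler (0.2 °C/W) and about 25 + 250 × 0.3 = 100 °C with a weaker one (0.3 °C/W). The heat dumped into the room is identical in both cases; only the temperature the chip sits at changes.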

So if that's all true, why do people even mention heat? Surely it's a completely irrelevant figure. By measuring power you're measuring the energy being consumed, which is being turned into heat, but if a card was running at 2000 °C because it was in a sealed box, that's no more of an issue than a card running at 30 °C because it's liquid-nitrogen cooled. That's assuming the heat is transferred away from anything that may be affected by it, e.g. a processor. Years ago, when you started to get hot cards that did not vent externally, I could see the issue, as you HAD to have good case cooling; now, with most cards venting externally, do you really care about the heat generated?
 
You want to turn the energy into work, not into heat.

If you set off a bomb, the stored energy turns into sound, light and blast. The light and sound don't really help the bomb, and you'd rather have a silent, dark bomb that converts all its energy into blast, but you can't really do that.

In the same way, the power you put into a CPU/GPU (the bomb's stored energy) gets converted into outputs (the light, sound and blast), but again some of that energy goes where you'd rather it didn't, as wasted heat.
 
It isn't a 100% relationship between heat and energy supplied; you also need to take into account the efficiency of the device, i.e. how much useful work it does with the energy you supply it. Unfortunately, heat and efficiency tend to be inversely related, so the hotter something gets, the worse its efficiency, leading to it putting out more heat, which reduces the efficiency further, and so on. Also, many of the components do not particularly like running at high temperatures.
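
A rough sketch of that feedback loop (every number here is invented for illustration): say a chip burns 200 W of switching power plus leakage that starts at 20 W at 40 °C and roughly doubles for every 30 °C rise, on a cooler with 0.2 °C/W of thermal resistance above a 25 °C room. First pass: 25 + 0.2 × 220 ≈ 69 °C, which pushes leakage to about 39 W; second pass: 25 + 0.2 × 239 ≈ 73 °C and about 43 W of leakage; after a few more rounds it settles in the mid-70s rather than running away, because the cooler can still shed the extra heat. With a weaker cooler or faster-growing leakage the loop can keep climbing instead, which is the runaway effect described above.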
 