Soldato
- Joined
- 4 Aug 2007
- Posts
- 22,576
- Location
- Wilds of suffolk
A few reviews seem to make a fuss about heat, and some people, as soon as you mention Fermi, go "oooo, hot / loads of power / noisy".
Now, unless I am missing something, power = heat. Let's ignore cooling initially.
Just like a processor, more MHz requires more power, which means more heat.
Unless I am missing something, there is pretty much a 100% correlation between power consumption and heat generated? Every electrical device complies with the same rules of physics...
Now let's look at cooling. Assuming there is a 100% perfect correlation between power consumption and heat generated, for one card to run a lot hotter than another (similar manufacturing assumed, say between Fermi and the AMD 6xxx series), the difference must come down to the actual cooling versus the power being consumed. Pass more air and you get lower temps because you are moving more of the heat away, or monitor the temperature in a different way and you get a different, lower reading.
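As a rough sketch of what I mean (a simple steady-state thermal resistance model, with made-up numbers rather than real Fermi or 6xxx figures):

```python
# Simple steady-state model: die temp = ambient + power * thermal resistance of the cooler.
# The numbers below are illustrative assumptions, not measured values for any real card.

def die_temp(power_w, ambient_c, r_theta_c_per_w):
    """Steady-state die temperature for a given power draw and cooler."""
    return ambient_c + power_w * r_theta_c_per_w

ambient = 25.0        # case air temperature, deg C
power = 250.0         # both cards dissipate the same 250 W

stock_cooler = 0.28   # deg C per watt, weaker cooler (assumed)
beefy_cooler = 0.18   # deg C per watt, better cooler (assumed)

print(die_temp(power, ambient, stock_cooler))  # 95.0 C
print(die_temp(power, ambient, beefy_cooler))  # 70.0 C
```

Same 250 W of heat in both cases; the reported temperature only tells you how well the cooler moves it away, not how much heat the card is actually putting out.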
So the actual heat created, when expressed as heat energy, would be the same per watt of electricity from every card? Unless I am missing something fundamental? So the temperature of the chips themselves doesn't really mean a lot, other than affecting overclocking potential etc.
So if that's all true, why do people even mention heat? Surely it's a completely irrelevant figure. By measuring power you're measuring the energy being consumed, which is being turned into heat. A card running at 2000°C because it was in a sealed box is no more of an issue than a card running at 30°C because it is liquid-nitrogen cooled, assuming the heat is transferred away from anything that might be affected by it, e.g. the processor. Years ago, when you started to get hot cards that did not vent externally, I could see the issue, as you HAD to have good case cooling. Now, with most cards venting externally, do you really care about the heat generated?
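To put a rough number on the "dumped into the case" scenario (the wattage and airflow figures here are assumptions, just to show the scale of the effect):

```python
# Rough estimate of how much a card that dumps its heat inside the case
# warms the case air, versus venting it straight out the back.
# 250 W and the CFM figures are assumed, not measurements.

CFM_TO_M3S = 0.000471947   # cubic feet per minute -> cubic metres per second
AIR_DENSITY = 1.2          # kg/m^3 at roughly room temperature
AIR_CP = 1005.0            # J/(kg*K), specific heat of air

def air_temp_rise(power_w, airflow_cfm):
    """Temperature rise of the air stream carrying away power_w watts."""
    mass_flow = airflow_cfm * CFM_TO_M3S * AIR_DENSITY   # kg/s
    return power_w / (mass_flow * AIR_CP)

print(air_temp_rise(250.0, 50.0))   # ~8.8 C warmer case air if it all stays inside
print(air_temp_rise(250.0, 100.0))  # ~4.4 C with double the case airflow
```

Which is basically the old argument: if the card exhausts out of the case, that temperature rise is the room's problem rather than your other components'.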