Watts out of the wall vs heat watts

Am I correct in assuming that the draw out of the wall is roughly equal to the amount of heat generated? I have a power consumption monitor; can I use it to estimate the cooling capacity I need?

I did some testing on air and my results are as follows:
Idle: ~160 watts
Guild Wars 2: ~430 watts
Skyrim ultra: ~390 watts
FurMark SLI, default settings: ~730 watts (not really real-world use, but meh...)

i7-2600K stock, 4x4GB RAM, MSI Z64A-GD55-G3 mobo, 2x Inno3D GTX 480, OCZ ZX 1000W PSU... all stock atm...
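
For what it's worth, here is a minimal Python sketch of turning those wall readings into a rough heat-load figure; the ~87% PSU efficiency is an assumption (a typical figure for a unit of this class at these loads), not something measured:

# Rough heat-load estimate from wall readings; the PSU efficiency is assumed, not measured.
PSU_EFFICIENCY = 0.87  # assumed efficiency of the 1000W PSU at these loads

wall_watts = {
    "Idle": 160,
    "Guild Wars 2": 430,
    "Skyrim ultra": 390,
    "FurMark SLI": 730,
}

for scenario, watts in wall_watts.items():
    # Component heat is roughly wall draw minus PSU losses; the PSU's own losses
    # become heat too, but inside the PSU rather than on the blocks.
    component_heat = watts * PSU_EFFICIENCY
    print(f"{scenario}: ~{watts} W at the wall -> ~{component_heat:.0f} W of component heat")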
 
Oh come on, I didn't think it would be necessary to mention energy conservation and the laws of thermodynamics.

What I'm asking is: is it a good way of estimating the cooling capacity I need?
 
I think it's safe enough to use as a guesstimate.

Reading up on it, apparently it has a lot to do with the efficiency of the chips as well.

I based my cooling on power draw +20% at load (for future games/programs).

I didn't use FurMark though. (That program's insane.)

A triple rad like the Hardware Labs Black Ice with some good fans should be fine for that wattage (according to the box I'm looking at).
 
Oh come on, I didn't think it would be necessary to mention energy conservation and the laws of thermodynamics.

Bit confused? I was agreeing with your statement?
You can package the problem any way you want, but as you so surely understand, it does come down to those basic laws.

Of course, the wholly unscientific way is to compare other people's results on PC forums! That is a fun way though!
 
The only thing is, I presume you won't be water cooling the memory, mobo, hard drives and so on, so the wattage from the wall would be more than you actually need to cool.
 
Hard drives consume next to nothing after spin-up; I don't know about RAM and mobos. The mobo probably would account for some significant heat, though...
 
Maybe test the difference between idle at stock and high load while overclocked? As the CPU will be the main thing under load, that'd give you a better idea. Then just add a percentage to be safe.
 
The best way to decide how many rads is how many you can fit in your case!
Once you have spent on the necessary rads, blocks, pump, and fittings, another radiator isn't a great amount more.
The more the better, it always seems!
 
It's all theory at this point; I'll be mounting the rads/fans on the outside anyway...

Now all this made me think: how efficient is today's personal computing? What percentage of the electricity is wasted as heat?
 
Well, it'd be the total power in LESS the power actually used by the components to do their jobs.
If the heat output were nearly equal to the power in at the wall, your room would be like the footplate of a steam engine...
I believe the figure is around 10% for a system running flat out, i.e. benchmarking, dropping to a couple of percent when idling.
No manufacturer would ever release components which convert most of their power to heat. Except heater manufacturers...
 
I thought almost all energy used was radiated as heat, at least from the circuitry.

In the HDDs it is converted to motion, although that creates friction and heat, and in LEDs it is converted to light, but other than that it's mainly heat caused by resistance.

The more efficient the conductor, and the smaller the transistor process, the lower the resistance, so less heat, which means less power needed.

Interestingly, conductors are more efficient when cold (as resistance drops), so technically... a well-cooled PC could use less power, as long as you don't use a lot of power to achieve the cooling. The difference would also be marginal.
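
As a rough illustration of that last point, here is a small Python sketch using copper's textbook temperature coefficient of resistance (about 0.39% per degree C); the 30 degree drop is just an example figure:

# Copper resistance change with temperature: R = R0 * (1 + alpha * dT)
ALPHA_COPPER = 0.0039  # approximate temperature coefficient of resistance, per degree C

r0 = 1.0       # resistance at the 20 C reference point (arbitrary units)
delta_t = -30  # example: running the conductor 30 C cooler than the reference

r_cold = r0 * (1 + ALPHA_COPPER * delta_t)
print(f"Resistance at {20 + delta_t} C is {r_cold:.3f}x the 20 C value "
      f"(about {(1 - r_cold) * 100:.0f}% lower)")

Even a 30 degree drop only takes a bit over 10% off the conductor losses, and those are a small slice of the total power, which is why the saving is marginal in practice.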
 
I may be quite mistaken in my thinking.
Incandescent light bulbs are only about 4% efficient (hence they're slowly being phased out), i.e. 100W in, 4W of light and 96W of heat generated.
I personally just can't see a PC being that inefficient. But I am sometimes wrong!
I just think that if nearly all the power used were radiated as heat, we'd all be sitting at our screens sweating, and central heating would be unnecessary...
 
I can understand what you mean; it's scary to think they output that sort of waste.

The way I look at it is that all the energy has to be transformed into another form, which leads me to believe it must come out as heat, seeing as there's no chemical/movement/sound/light output to speak of.
 
If you think about it, LEDs produce light, hard drives and fans produce motion, but other than that what else does the computer actually create?

All it does is run electricity round in circles inside the computer, from part to part, up and down cables at very high speeds. It doesn't actually create anything from it, so it is purely electrical energy. All it wants to do is find its way to ground, and we force it to go through an obstacle course on the way: switching transistors, powering fans, delivering signals and so on. As it goes it encounters resistance and is forced to give up some of its energy as heat.

This is why so much time and money is spent researching superconductors and materials such as carbon nanotubes - substances which can conduct electricity as efficiently as possible, with very low resistance, carrying huge loads without generating much heat. Use something twice as conductive as copper and you waste half the power. It's big money and alternatives do exist, but unfortunately the only practical way to mass-produce computer parts, given the sheer number required, is to use a cheap, common metal, i.e. copper.

One day they may even make computer parts out of artificial diamond, which is apparently very efficient, or out of carbon nanotubes, but for the moment we're stuck with what we've got ;).

Back to the topic at hand though: you could always use something like CoreTemp, which gives a rough estimate of CPU power usage, and run Intel Burn Test to see how high it goes. Other than that, if you know how much your computer uses as a whole, measure the difference between full load and idle and add about 20-30 watts, maybe... as they're pretty efficient while idling these days.
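
A quick sketch of that "full load minus idle, plus a small margin" estimate, plugging in the wall readings from the first post (the 30 W margin is just the figure suggested above):

# "Load minus idle, plus a margin" heat estimate, using example figures from this thread.
idle_watts = 160     # whole-system wall draw at idle
load_watts = 430     # whole-system wall draw in the heaviest game measured
margin_watts = 30    # allowance for the cooled parts' idle draw plus a little headroom

estimated_loop_heat = (load_watts - idle_watts) + margin_watts
print(f"Plan the loop around roughly {estimated_loop_heat} W of heat")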
 

Good theory about the superconductors and whatnot, but in practice it doesn't work like that. Wouldn't it be wonderful if you could make a computer with no resistance and no waste heat!
But when you think about it, a CPU effectively uses millions of transistors acting as switches to control the current flow. However, they are transistors, not switches, so by their nature they don't stop current by breaking a circuit; they do it by creating very high resistance, hence the problem of heat.
It's sad, really, that current transistor-based chips will never be heat-free or even close to it. So bring on quantum computing! :p
 
All the electrical power going in one end comes out as heat. All of it. You've said "estimate" and "thermodynamics" so I'll provide slightly more detail than "buy the biggest radiator that will fit".

Estimating cooling: you can find Kelvin-per-watt figures for radiators with various fans online, "Martin's Liquid Lab" for example. It's usually around 0.06 K/W per 120mm radiator. With a 200W heat source and one 120mm radiator, you should expect a temperature rise over ambient of about 0.06 * 200 = 12 degrees. From that you can work out how many radiators to attach to the graphics cards and how many to the CPU, and likewise if everything shares one loop.

The critical point is that you have to decide roughly how hot you're willing to let things run when choosing the heatsinks/radiators. A single 120mm radiator will cool everything fine if you don't mind it running at approximately (500W * 0.06 =) 30 degrees over ambient.

Hope that's coherent enough. Ambient is probably 300K, give or take.
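
A short Python sketch of that arithmetic, using the ~0.06 K/W-per-120mm figure quoted above and assuming the sections share the heat evenly (so the effective resistance scales as 0.06/N); the 500 W load is just an example input:

# Radiator sizing from a thermal resistance of about 0.06 K/W per 120 mm section.
KELVIN_PER_WATT_PER_120MM = 0.06

def water_temp_rise(heat_watts, num_120mm_sections):
    # Approximate coolant temperature rise over ambient, assuming the sections
    # share the heat so the effective resistance is 0.06 / N.
    return heat_watts * KELVIN_PER_WATT_PER_120MM / num_120mm_sections

heat_load = 500  # example combined CPU + GPU heat, in watts
for sections in (1, 2, 3):
    print(f"{sections} x 120 mm: ~{water_temp_rise(heat_load, sections):.0f} K over ambient")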
 