If you think about it, LEDs produce light, hard drives and fans produce motion, but other than that what else does the computer actually create?
All it does is run electricity round in circles inside the computer, from part to part, up and down cables at very high speeds. It doesn't actually create anything from that energy, so it stays purely electrical. All the current wants to do is find its way to ground, and we force it through an obstacle course on the way: switching transistors, powering fans, delivering signals, and so on. As it goes it encounters resistance, and is forced to give up some of its energy as heat.
This is why so much time and money goes into researching better conductors such as carbon nanotubes - materials that conduct electricity as efficiently as possible, carrying large currents without generating much heat. Use something twice as conductive as copper and you roughly halve the resistive loss, since heat dissipation scales with resistance (P = I²R) for a given current. It's big money and alternatives do exist, but given the sheer number of parts that have to be mass produced, the only practical option is a cheap, abundant metal, i.e. copper.
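To make that "twice as conductive, half the waste" claim concrete, here's a quick back-of-the-envelope sketch. The current and resistance figures are made up purely for illustration, not real component specs:

```python
# Resistive (Joule) heating: P = I^2 * R.
# All numbers here are hypothetical illustrations.
current = 10.0           # amps flowing through the conductor
r_copper = 0.02          # ohms, an invented trace resistance
r_better = r_copper / 2  # twice the conductivity -> half the resistance

loss_copper = current**2 * r_copper  # watts lost as heat in copper
loss_better = current**2 * r_better  # half the heat, same current

print(loss_copper, loss_better)  # 2.0 1.0
```

Same current, half the resistance, half the power wasted as heat - which is exactly why conductivity matters so much at scale.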
One day they may even make computer parts out of synthetic diamond, which is apparently very efficient, or those carbon nanotubes, but for the moment we're stuck with what we've got.
Back to the topic at hand though: you could always use something like CoreTemp, which gives a rough estimate of CPU power usage, and run IntelBurnTest to see how high it goes. Otherwise, if you can measure how much your computer draws as a whole, take the difference between full load and idle and add maybe 20-30 watts, since CPUs are pretty efficient while idling these days.
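The load-minus-idle method above is easy to sketch out. The wattage readings below are hypothetical example numbers you'd get from a wall power meter, and the idle allowance is just the rough 20-30 W fudge factor mentioned above:

```python
# Rough CPU power estimate from whole-system wall readings.
# All watt figures are hypothetical examples, not measurements.
idle_watts = 90.0      # system draw at idle, read off a power meter
load_watts = 185.0     # system draw under a stress test like IntelBurnTest
idle_cpu_allowance = 25.0  # rough guess at the CPU's own idle draw

cpu_estimate = (load_watts - idle_watts) + idle_cpu_allowance
print(cpu_estimate)  # 120.0
```

This is only ever a ballpark figure - the load/idle delta also includes extra fan and VRM losses under load - but it's usually close enough for sizing a PSU or a cooler.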