Why is idle power usage so high? Where will it end?

Associate · Joined 23 Aug 2005 · Posts: 1,274
I think it's time CPUs and GPUs had an overhaul of their idle power consumption. I can understand crazy usage under load, but when idle I don't understand why the wattage is so high. A simple solution for GPUs would be to go back to the Voodoo card era, when graphics acceleration was either on or off.

I know games have become more demanding, and when playing them I expect to be shoving coal into my powerhouse of a PC to play at 10000 fps... but when I'm reading a web page or sitting in IRC, what exactly is my CPU doing that justifies so many watts? Ideally, on a CPU that uses a maximum of 100 W, 1% CPU usage should need about 1 W. That idle usage is going up too; 5 years ago I was paying less for electricity to read a web page than I am today (and creating less CO2).

This increase in power cannot continue for much longer, or pretty soon our PCs really will be portable heaters. I think the word I'm looking for here is efficiency! :)
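To put a rough number on it (a back-of-the-envelope sketch; the wattages and the price per kWh below are just assumed example figures, not measurements):

[code]
# Rough idle-power running-cost estimate (all figures are assumed examples).
HOURS_PER_YEAR = 24 * 365

idle_watts_old = 60     # assumed idle draw of an older rig, in watts
idle_watts_new = 150    # assumed idle draw of a modern gaming rig, in watts
price_per_kwh = 0.12    # assumed electricity price, GBP per kWh

def yearly_cost(watts):
    """Cost of a constant load left running 24/7 for a year."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh * price_per_kwh

print(f"Old rig at idle:  ~£{yearly_cost(idle_watts_old):.0f}/year")
print(f"New rig at idle:  ~£{yearly_cost(idle_watts_new):.0f}/year")
print(f"Difference:       ~£{yearly_cost(idle_watts_new - idle_watts_old):.0f}/year")
[/code]

Even with made-up numbers like these, the gap between the two idle figures is what ends up on the bill, since the machine spends most of its life idle.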
 
Yup, Intel CPUs can scale back to some 10 watts (even relatively high-end ones), and ATI now has PowerPlay on their graphics cards.
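If you want to see the scaling happen on your own machine, here's a rough sketch (this assumes Linux with the cpufreq sysfs interface exposed; on Windows something like CPU-Z shows the same clock drop at idle):

[code]
# Peek at CPU frequency scaling via sysfs (Linux only; paths can vary by kernel/driver).
from pathlib import Path

cpu0 = Path("/sys/devices/system/cpu/cpu0/cpufreq")

def read_mhz(name):
    return int((cpu0 / name).read_text()) // 1000  # sysfs reports kHz

print("Governor:      ", (cpu0 / "scaling_governor").read_text().strip())
print("Max frequency: ", read_mhz("cpuinfo_max_freq"), "MHz")
print("Current freq:  ", read_mhz("scaling_cur_freq"), "MHz")
[/code]

At the desktop the current frequency should sit well below the maximum, which is where those ~10 watt idle figures come from.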

Modern computer components are relatively efficient, and there is a drive to do better. Penryn is faster and more power efficient than the generation before it, as was Conroe (admittedly the pre-65nm Pentium 4s were rather extreme, but you get my point: we are heading in the right direction).

If you want the computing power, you pay the price; that is the way it is. If you really want to lower the bills, then why don't you buy one of those old rigs and a few sticks of RAM and surf the web on that? Alternatively you could get something like an Intel Atom or a VIA ITX PC. A KVM switch does the job nicely when you want to switch between gaming power and web surfing. There is a limit to how low technology can go and still retain the performance.

To illustrate the point:
That idle usage is going up too; 5 years ago I was paying less electric to read a web page than I am today (and creating less CO2).
Why did you upgrade then?
 
Nvidia GPUs lower their clocks when they are only displaying 2D images, so that must save some juice.
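You can actually watch that happen. A quick sketch (assuming nvidia-smi is installed and the driver is new enough to support these query fields):

[code]
# Query the current graphics/memory clocks from nvidia-smi (Nvidia driver tools required).
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=clocks.gr,clocks.mem", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print("Current core/memory clocks:", out.stdout.strip())
[/code]

Run it at the desktop and then again with a 3D application open and the difference in clocks should be obvious.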
 
The 790GX will switch to the onboard graphics and lower its clocks when it's not doing anything in 3D. But it's not like any GPUs or CPUs are drawing full power all the time.
 
Why did you upgrade then?

Same as most people here: to play modern games. I only have one PC for everything, but I do like the idea of those more efficient Atom/VIA PCs. I think idle power has crept up over the years, but only recently has there been pressure to actually do something about it - less CO2 and rising energy prices.

Yeah, it's improving, and more research is being done and will need to be done. It's just interesting to me that a PC 10 years ago could surf the web, play MP3s, run Notepad and handle other low-CPU tasks, and it would have used less power doing it. The maximum power usage is rising, but so is the minimum.

Yeah, I've read about people turning SpeedStep back on once they've found a stable overclock. Or can it cause crashing?
 