I think it's time CPUs and GPUs got an overhaul in how much power they draw when idle. I can understand crazy consumption under load, but when idle I don't understand why the wattage is so high. A simple solution for GPUs would be to go back to the Voodoo card era, when graphics acceleration was either on or off.
I know games have become more demanding, and when playing them I expect to be shovelling coal into my powerhouse of a PC to hit 10000 fps... but when I'm reading a web page or sitting in IRC, what exactly is my CPU doing that justifies so many watts? Ideally, on a CPU with a 100 W maximum, 1% CPU usage should cost about 1 W. And that idle usage is going up too: five years ago I was paying less for the electricity to read a web page than I am today (and creating less CO2).
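Just to put numbers on the "ideal" I'm talking about, here's a quick back-of-envelope sketch in Python. The 100 W maximum and the 30 W idle figure are made-up illustrative numbers for comparison, not measurements from any real chip.

    # Sketch of the "power scales linearly with load" ideal described above.
    # Both wattage figures are hypothetical, chosen only to illustrate the point.
    MAX_POWER_W = 100.0    # assumed draw at 100% CPU load
    TYPICAL_IDLE_W = 30.0  # assumed real-world idle draw, for contrast

    def ideal_power(cpu_usage_fraction):
        """Power draw if wattage scaled linearly with CPU usage."""
        return cpu_usage_fraction * MAX_POWER_W

    for usage in (0.01, 0.10, 0.50, 1.00):
        print(f"{usage:>4.0%} load -> ideal {ideal_power(usage):5.1f} W "
              f"(vs. an assumed real idle draw of {TYPICAL_IDLE_W:.0f} W)")

Under that linear model, sitting in IRC at 1% load would cost about 1 W, instead of the tens of watts an idle desktop can actually burn.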
This increase in power draw cannot continue much longer, or pretty soon our PCs really will double as portable heaters. I think the word I'm looking for here is efficiency!
