Increasing performance = Increasing bills?

Simply increasing the clock speed already raises power consumption: part of the draw is leakage current, which depends on temperature, but the dynamic (operating) consumption scales directly with clock speed.
As for voltage, power consumption scales with the square of the voltage (P = U·I and I = U/R, so P = U²/R).
And if you disable power-saving modes, that hurts exactly where it matters most, because a typical PC spends most of its time idling or under very light load.
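A minimal sketch of that scaling, assuming the usual simplified CMOS model (dynamic power roughly proportional to C·V²·f) plus a fixed leakage term. The capacitance and leakage numbers below are made-up illustration values, not measurements of any real chip:

```python
# Rough sketch of how power scales with clock and voltage, using the
# simplified model P_dyn ~ C * V^2 * f plus a constant leakage term.
# c_eff and leakage_w are illustrative constants, not real chip data.

def total_power(v_core, freq_ghz, c_eff=30.0, leakage_w=10.0):
    """Return estimated power in watts.

    v_core    -- core voltage in volts
    freq_ghz  -- clock frequency in GHz
    c_eff     -- effective switched-capacitance term (illustrative)
    leakage_w -- static/leakage power, assumed roughly constant here
    """
    dynamic = c_eff * v_core ** 2 * freq_ghz
    return dynamic + leakage_w

stock = total_power(v_core=1.20, freq_ghz=3.0)
overclocked = total_power(v_core=1.35, freq_ghz=3.6)  # +12.5% voltage, +20% clock

print(f"stock:       {stock:.1f} W")
print(f"overclocked: {overclocked:.1f} W  ({overclocked / stock:.2f}x)")
```

With these example numbers a 20% clock bump plus a 12.5% voltage bump lands at roughly 1.5x the power, which is the point: the voltage-squared term is what really drives the increase.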

Also "modern" GPUs are pinnacle of sh*tty engineering and many of them consume more for doing nothing than some quad core CPUs under full load.
http://xbitlabs.com/articles/video/display/evga-geforce-gtx260-216-55nm_5.html#sect0
(five years ago 25W idle draw was high for graphics card)
 
[Attached chart: evga260_power.png — power consumption figures from the xbitlabs review linked above]
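To put the thread title in numbers, here is a quick back-of-the-envelope conversion of a constant idle draw into yearly energy and cost. The 25 W figure is the one mentioned above; the 24/7 uptime and the per-kWh price are assumed example values, so plug in your own:

```python
# Back-of-the-envelope cost of a card that burns power while idling.
# Assumes the machine is on 24/7; price_per_kwh is an example rate.

idle_draw_w = 25           # idle power draw in watts (figure cited above)
hours_per_year = 24 * 365
price_per_kwh = 0.15       # assumed electricity rate, currency-agnostic

energy_kwh = idle_draw_w * hours_per_year / 1000
cost = energy_kwh * price_per_kwh

print(f"{idle_draw_w} W idle, 24/7: {energy_kwh:.0f} kWh/year, about {cost:.0f} per year")
```

That works out to roughly 219 kWh a year, or about 33 in whatever currency your 0.15/kWh is, just for sitting at the desktop.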


So the GTX 280 is more economical than the HD 4870 outside of 3D. That's good... but while gaming it's hungry...
 
What? :confused:

lol
 