Fx-Overlord said:
And iirc power consumption varies linearly with frequency but voltage squared.
Yes it does.
A commonly used expression for power P is P = kLV²F, where L is the load (software activity), V is the core voltage, F is the chip frequency, and k is an empirical constant for the chip.
And for the change in power (delta P), the constants cancel out:
P2 = P1 x (F2/F1) x (V2/V1)²
Or, a little easier to understand:
Overclocked Watts = Default Watts x (Overclocked MHz / Default MHz) x (Overclocked Vcore / Default Vcore)²
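If it helps, here's a minimal Python sketch of that scaling rule (the function name and arguments are just made up for illustration):

def scaled_power(p1_watts, f1, f2, v1, v2):
    # Dynamic power scales linearly with frequency and with the square of voltage:
    # P2 = P1 x (F2/F1) x (V2/V1)^2
    return p1_watts * (f2 / f1) * (v2 / v1) ** 2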
Say a chip uses 130 W (minus about 40 W of average static power) at 575 MHz.
At 500 MHz: 90 x 500/575 = 78 W*, plus the 40 W back = 118 W
Or say a chip at 1.30 V with a 0.10 V voltage decrease:
130 x (1.2/1.3)² ≈ 110 W*
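Quick sanity check of both examples in Python (assuming the same 90 W dynamic / 40 W static split from the first example; the second example applies the voltage scaling to the full 130 W, as above):

dynamic_w, static_w = 90.0, 40.0

# Frequency drop 575 MHz -> 500 MHz, voltage unchanged:
print(dynamic_w * (500 / 575) + static_w)    # roughly 118 W

# Voltage drop 1.30 V -> 1.20 V:
print(130.0 * (1.20 / 1.30) ** 2)            # roughly 110-111 W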
*It's a bit of an approximation: apart from dynamic power there's also static power from leakage and junction temperature, which increases a bit with frequency as well, but not to the same degree. Also, in a graphics card the shader and memory domains obviously run at different frequencies and voltages; if you knew the values for each part, you could get a good idea of the total power change.
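If you did know the per-domain split, a rough way to combine them looks like this (purely illustrative numbers, with static power treated as constant even though, as noted, it isn't quite):

static_w = 40.0
domains = [
    # (watts, old MHz, new MHz, old Vcore, new Vcore)
    (90.0, 575, 500, 1.30, 1.25),   # shader/core domain
    (25.0, 900, 900, 1.50, 1.50),   # memory domain, left unchanged here
]
total = static_w + sum(p * (f2 / f1) * (v2 / v1) ** 2
                       for p, f1, f2, v1, v2 in domains)
print(total)    # estimated total power draw in watts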
Here's an online version; where it says "overclock", just think of a change, positive or negative:
http://newstuff.orcon.net.nz/wCalc.html