
The first "proper" Kepler news Fri 17th Feb?

Soldato
Joined
31 Jul 2004
Posts
3,730
I realize now that my post may not have been very clear. What I meant was, yes, when you increase the load you will see higher power draw. The reason is that there are more circuits active. Modern CMOS designs, especially modular ones like processors, use transistors to switch off chunks of circuitry at low loads.

But for a given circuit, power draw varies with frequency. When you increase the load, more circuit elements switch on, so where perhaps 10% of the chip was drawing power before, 50% of it may be drawing power now.

So that's a different cause altogether from what happens when frequency is varied. But for any given circuit in the on state (i.e. not power-gated or switched off), increasing frequency will increase power draw.
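To put rough numbers on the two effects, here is a quick Python sketch using the usual dynamic-power relation P ≈ activity × C × V² × f. The figures are made up purely for illustration, not measurements of any real GPU:

```python
# Illustrative only: standard CMOS dynamic-power relation with hypothetical numbers.
def dynamic_power(active_fraction, capacitance, voltage, frequency):
    """Approximate switching power (watts) of the un-gated portion of a chip."""
    return active_fraction * capacitance * voltage ** 2 * frequency

C = 1.0e-9   # effective switched capacitance in farads (hypothetical)
V = 1.0      # core voltage in volts (hypothetical)
F = 1.0e9    # clock frequency in hertz (hypothetical)

# Effect 1: higher load wakes up more of the chip (10% -> 50% active).
print(dynamic_power(0.10, C, V, F))        # light load
print(dynamic_power(0.50, C, V, F))        # heavy load, same clock: ~5x the power

# Effect 2: for the same active fraction, power scales with frequency.
print(dynamic_power(0.50, C, V, F * 1.2))  # 20% higher clock: ~20% more power
```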

Thanks for that ;) So if they used all the elements at a lower frequency when maximum performance was not required, would that use less power for a given load?
 
Soldato
Joined
24 Jun 2004
Posts
10,977
Location
Manchester
Not big enough to pull those biased benchmarks!

Do you see what I mean now? Over the past two days stuff has been getting pulled left, right and centre. But when it suits them?

Or they simply don't have any control over certain sites? Only those who are under NDA are subject to Nvidia's wrath...

confucious say said:
"Never attribute to malice that which can adequately be attributed to incompetence"
:p

Besides, it doesn't appear that the early leaked benchmarks were particularly far off the mark... We will need to wait until tomorrow to find out for sure, but it doesn't seem like they showed the GTX680 in a disproportionately good light.

This stuff happens at each and every GPU launch. This release hasn't been particularly out of character...
 
Soldato
Joined
31 Jul 2004
Posts
3,730
Yes :)

(in comparison to using all the elements at the higher clock frequency)

Thanks, it got me thinking back to ATI's "PowerPlay", which did not work properly... I can't even remember which cards it first ran on, but even quite recently they had to lock the memory clocks to avoid flickering on 4800-series cards at power state changes.

So there are plenty of potential power savings for these new high-frequency cards if you have the right control logic.
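Here's a very rough sketch of why that sort of control logic can pay off, assuming (my simplification, nothing vendor-specific) that throughput scales with active units × frequency and that the required voltage scales roughly linearly with frequency:

```python
# Illustrative only: compare two ways of delivering the same throughput.
# Crude assumptions: throughput ~ active_fraction * frequency, and the required
# voltage tracks frequency, so relative dynamic power ~ active_fraction * f^3.

def relative_power(active_fraction, rel_freq):
    """Dynamic power relative to 100% of units at the nominal clock."""
    rel_voltage = rel_freq              # crude assumption: V tracks f
    return active_fraction * rel_voltage ** 2 * rel_freq

def relative_throughput(active_fraction, rel_freq):
    return active_fraction * rel_freq

narrow_fast = (0.5, 1.0)   # half the units at full clock
wide_slow   = (1.0, 0.5)   # all units at half clock (and lower voltage)

for label, (frac, f) in [("narrow/fast", narrow_fast), ("wide/slow", wide_slow)]:
    print(label, relative_throughput(frac, f), relative_power(frac, f))

# Both cases deliver the same relative throughput (0.5), but the wide-and-slow
# one draws roughly a quarter of the dynamic power under these assumptions.
```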
 
Soldato
Joined
30 Mar 2010
Posts
13,050
Location
Under The Stairs!
Why does the Nvidia guy in the video state that 'it's the fastest and most efficient GPU we have ever built' and not claim that it's the fastest GPU, period, if we are to believe the latest rumours?

Just out of curiosity mind, I very much doubt I'm getting either effort this time round.
 