
Disable GeForce GTX 580 Power Throttling using GPU-Z

NVIDIA shook the high-end PC hardware industry earlier this month with the surprise launch of its GeForce GTX 580 graphics card, which extended NVIDIA's long-held lead in single-GPU performance. It also managed some impressive performance-per-Watt improvements over the previous generation. The reference design board, however, uses clock speed throttling logic that reduces clock speeds when an extremely demanding 3D application such as FurMark or OCCT is run. While this is a novel way to protect components, saving consumers from potentially permanent hardware damage, it is a gripe for expert users, enthusiasts and overclockers who know what they're doing.

GPU-Z developer and our boss W1zzard has devised a way to make disabling this protection accessible to everyone (who knows what he's dealing with), and came up with a nifty new feature for GPU-Z, our popular GPU diagnostics and monitoring utility: a new command-line argument, "/GTX580OCP", that disables the clock speed throttling mechanism. Start the GPU-Z executable (within Windows, using Command Prompt or a shortcut) with that argument, for example "X:\gpuz.exe /GTX580OCP". Throttling will stay disabled for the remainder of the session, even after you close GPU-Z, and will be re-enabled on the next boot.

As an obligatory caution, be sure you know what you're doing. TechPowerUp is not responsible for any damage caused to your hardware by disabling this mechanism. Running the graphics card outside of its power specifications may result in damage to the card or motherboard. We have a test build of GPU-Z (which otherwise carries the exact same feature set as GPU-Z 0.4.8). We also ran a power consumption test on our GeForce GTX 580 card demonstrating how disabling this logic affects power consumption.

http://forums.techpowerup.com/showthread.php?t=134458

http://forums.techpowerup.com/attachment.php?attachmentid=38939&d=1289670674
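
If you'd rather script it than type the switch each time, here's a minimal sketch in Python. The gpuz.exe path is just a placeholder - point it at wherever your copy actually lives:

    # Minimal sketch: launch GPU-Z with the /GTX580OCP switch.
    # GPUZ_PATH is an assumption - adjust to your own install location.
    import subprocess

    GPUZ_PATH = r"C:\Tools\gpuz.exe"  # hypothetical path

    # The switch disables the GTX 580 clock throttling for this Windows
    # session; GPU-Z itself can be closed again afterwards.
    subprocess.run([GPUZ_PATH, "/GTX580OCP"], check=True)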
 
From what I can tell it's just FurMark and the like; they stress the card way more than Crysis does, and that's one of the most demanding games out there.

350 W, bejesus!

Yeah, no wonder they wanted to cheat FurMark. So does this mean the 580 has a higher TDP than a 480?
 
I really don't recommend this unless you really, really know what you're doing - not just think you know what you're doing... or don't be surprised when you end up with the smell of burnt electronics and melted power connections.
 
And maybe the enthusiast community... like say the OcUK forums :p

Assuming the throttling really is limited to FurMark and one other burn-in program - which is what NVIDIA says - then the only use of disabling it is seeing how hot the cards get and how much power they draw in those two pieces of software.

If NVIDIA said it also throttled in games and performance benchmarks, then I would see the point for the overclocking community.


Of course, NVIDIA could just be lying about it; hopefully review sites will test it in benchmarks.
 
Ah, so we still do class this as an enthusiast forum? You would think not with all the moaning about power consumption we seem to have had for the last six months or so ;)

With the interest in this thread, and recently covered forum issues, I'll have to say this WAS an enthusiast site :p
 
So who's going to be first to disable it, run an extreme burn loop in FurMark, and report back on how long the card lasted before it sent out smoke signals?

We could start a league table :D
 
I guess the issue you might face if pulling over 300 W is:

PCIe 6-pin is rated at 75 W
PCIe 8-pin is rated at 150 W

Whilst you can pull more load over these cables than that, if you exceed the rated value the cable could melt. It entirely depends on the PSU maker what the cables are actually rated for.

I also wonder if it will start to pull more than 75 W from the PCIe slot itself; if it does, that sounds very bad.
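
A quick back-of-the-envelope in Python for the GTX 580's stock layout (slot + one 6-pin + one 8-pin); the 350 W figure is roughly what the TPU test above showed:

    # Combined PCIe power budget for a GTX 580 (slot + 6-pin + 8-pin),
    # using the spec ratings quoted above.
    budget_w = {
        "pcie_slot": 75,   # W from the motherboard slot
        "6_pin":     75,   # W
        "8_pin":     150,  # W
    }
    total_rated = sum(budget_w.values())   # 300 W

    measured_draw = 350                    # W, ballpark from the TPU test
    excess = measured_draw - total_rated   # 50 W over the combined rating

    print(f"Rated: {total_rated} W, measured: {measured_draw} W")
    print(f"Something has to supply the extra {excess} W")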
 

Wouldn't the PSU's overload protection kick in before you started a small in-case fire?

EDIT: Actually, thinking back, probably not, as I recall once turning a floppy power cable into a light bulb filament by accidentally shorting it on the case, then watching it glow red and disintegrate in front of my eyes. My old Enermax PSU didn't care one bit!
 
From what I can tell it's just FurMark and the like; they stress the card way more than Crysis does, and that's one of the most demanding games out there.



Yeah, no wonder they wanted to cheat FurMark. So does this mean the 580 has a higher TDP than a 480?

Would appear so; not too surprising with the very similar load power in games, faster memory, a more heavily loaded memory controller (due to the memory speed increase), and more SPs.

Thing is, does the throttling only happen in those two bits of software? That's what I'm not clear on: does it just stop those two programs pushing a stock GTX 580 past 300 W, or does it stop the card going beyond 300 W at all - which wouldn't matter for any other software at stock, but would limit how far you can overclock?

I.e., is Crysis only left unthrottled because at stock the GTX 580 gets nowhere near 300 W, while an overclocked card would get throttled in Crysis?
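
To make the distinction concrete, here's a rough sketch of the two possible behaviours in Python - pure speculation on my part, since we don't know which policy NVIDIA actually implemented, and the blacklist names are made up:

    # Two possible throttling policies - speculation only.
    BLACKLIST = {"furmark.exe", "occt.exe"}  # assumed detection list
    POWER_CAP_W = 300

    def throttle_by_app(running_app: str) -> bool:
        # Policy 1: only throttle known burn-in programs. An overclocked
        # card could still pull well past 300 W in a game, unthrottled.
        return running_app.lower() in BLACKLIST

    def throttle_by_power(board_power_w: float) -> bool:
        # Policy 2: hard cap on board power. Games are untouched at stock
        # (nowhere near 300 W), but a heavy overclock would hit the cap.
        return board_power_w > POWER_CAP_W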


Either way, would I complain or have a go at NVIDIA (except about the stupid power usage :p ) over FurMark? No, because for MONTHS I've been telling people it's a completely worthless piece of software.

Who cares what you're stable at in FurMark? If EVERY GAME EVER can be run at higher settings, why do you care what temperature your overclock hits in FurMark, when EVERY GAME EVER will run cooler, giving you more headroom for your overclock, etc, etc, etc.

FurMark was for a year described as a power virus, nothing more or less. It doesn't mimic the power usage of any known software, including GPGPU/heavy-workload stuff, because it's designed with no intention other than putting an unnatural load on the GPU.


AMD were forced to add circuitry to stop the 4870 crashing in FurMark (well, to protect future cards), and NVIDIA have now been forced to add circuitry to prevent insane power draw in FurMark.

The VERY simple answer, all round, is: don't use FurMark. Then neither company would be spending time trying to stop people frying their cards with a piece of software that offers you literally nothing - no useful information other than the stable limits you can run FurMark at.
 
I like how ATI were savaged for doing this on a mid-range card, yet this is the first I'm hearing of it on the 580, an enthusiast card eyed by extreme benchers and competitive overclockers.
 
Worthless? lol, OK, so a piece of software that can work the processor so hard it goes up in flames is worthless?

You better hope the next car you buy doesn't use that approach to crash tests!

;)
The difference is, it's your graphics card that 'goes up in flames' - they don't crash your car in crash tests.
 
The difference is, it's your graphics card that 'goes up in flames' - they don't crash your car in crash tests.

No need to crash 'your' car, because the manufacturers have already been legally obliged to do so using one of their own cars.
The point is that it IS a worthwhile test, because loading a processor to 100% should never destroy a GPU or make it massively unstable (it doesn't matter whether it's YOUR processor or one of the manufacturer's samples).

If it can't handle 100% load then it is NOT stable at that clock speed.
 