8800GTX OC'ing

My spec is as per my signature. I've never OC'd graphics before and fancy trying to boost my 3DMark06 score, which currently sits at 12,308.

I've downloaded RivaTuner and had a look through all the settings last night, and it all seemed very straightforward. In fact, too straightforward.

I've searched the forums for a guide and there are several useful posts, but most pre-date the ability to modify the shader clock separately. Am I right in saying that I should:

1. Find max GPU clock
2. Find max Memory clock
3. Find max Shader clock

In that order?
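Just to check I've understood the process, here's a rough Python sketch of the search I have in mind (purely illustrative; apply_clock and passes_stress_test are made-up stand-ins for the manual RivaTuner and stress-test steps, not real APIs):

```python
# Illustrative only: apply_clock() and passes_stress_test() are made-up
# placeholders for the manual RivaTuner / stress-test steps, not real APIs.

def find_max(start_mhz, step_mhz, apply_clock, passes_stress_test):
    """Raise one clock in steps until the stress test fails, then
    back off to the last value that passed and return it."""
    stable = start_mhz
    while True:
        candidate = stable + step_mhz
        apply_clock(candidate)
        if not passes_stress_test():
            apply_clock(stable)  # revert to the last known-good clock
            return stable
        stable = candidate

# Probe each clock in turn, holding the other two at stock.
# Stock 8800 GTX clocks for reference: 575 core / 1350 shader / 900 memory (MHz).
```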

I'm not interested in squeezing every last drop out of it, just looking to up the spec. I'll be OC'ing on stock (but effective) cooling and don't want to alter voltage, PCI-e bus speed, etc.

Would I be better off keeping the GPU and shader clocks linked? The only stress tester I've got is 3DMark06. What would be best for telling whether the OC is stable? ATITool?

Cheers.
 
I'd leave them linked if you're not fussed about squeezing every last drop out.

Once you find your max core you can always try upping the shader separately.

ATITool has a 3D stress/error-detection test built in, but I find it shows up errors too soon. Take my card: it shows errors from 631MHz upwards and freezes straight away at 650, yet I can set it to 650 with RivaTuner and loop 3DMark06 all day long.

But ATITool will find you your "safe" maximum overclocks. Whether you choose to push further is up to you.
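If you want to leave a loop running unattended, this is the idea in a rough Python sketch (the executable path is a placeholder, and treating a crash as a nonzero exit code is my assumption, not documented 3DMark06 behaviour):

```python
# Rough sketch of looping a stress test unattended. The path below is a
# placeholder, and treating a crash as a nonzero exit code is an
# assumption -- not documented 3DMark06 behaviour.
import subprocess
import time

STRESS_CMD = [r"C:\path\to\your_stress_test.exe"]  # placeholder binary
RUNS = 50

for run in range(1, RUNS + 1):
    started = time.time()
    result = subprocess.run(STRESS_CMD)
    elapsed = time.time() - started
    print(f"run {run}: exit code {result.returncode}, {elapsed:.0f}s")
    if result.returncode != 0:  # assumed to mean a crash or driver reset
        print("Test failed - back the clocks off a step and retry.")
        break
```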
 
Yeah, I had picked up on the fact that ATITool can show "false" errors that wouldn't show up in other 3D apps/benchmarks.

I guess it's a case of seeing how far you want to go with RivaTuner/ATITool and then actually testing in games, right?

I'm assuming that if everything looks OK in an intensive game such as BioShock, that would be a decent benchmark for other games?

Ta.
 
World in Conflict seems to stress a graphics OC to breaking point. People have posted on here with a perfectly stable OC in every other game but needing to turn it down in WiC, so you may want to try the demo of that.

But yeah, test in stressful games and look for graphical glitches.
 
I have a GTX OC. Can I ask what speeds people are clocking their cards to?

Including shader?

Mine's:

642 - Core
1660 - Shader
1003 - Memory

Giving me 12,327 in 3DMark06.
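For context, here's my rough arithmetic on those speeds (a quick Python sketch, assuming reference 8800 GTX clocks of 575/1350/900 MHz, which won't match a factory-OC'd card):

```python
# Percentage overclock vs assumed reference 8800 GTX clocks (575/1350/900 MHz).
stock = {"core": 575, "shader": 1350, "memory": 900}
mine = {"core": 642, "shader": 1660, "memory": 1003}

for name in stock:
    gain = (mine[name] / stock[name] - 1) * 100
    print(f"{name}: {mine[name]} MHz (+{gain:.1f}%)")
# core: 642 MHz (+11.7%), shader: 1660 MHz (+23.0%), memory: 1003 MHz (+11.4%)
```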

Thanks
 