
Bizarre 8800GTS Underclock Experiments

Associate · Joined 1 Jan 2006 · Posts 252 · Location Sheffield
I've been experimenting with UNDERclocking my 8800GTS, so it uses less power and therefore generates less heat and noise when I'm not gaming. I use nTune to change the memory and core speeds, and have set up profiles for overclock, default and underclock.

I normally OC to 600 core and 900(1800) memory from 500 and 800(1600). For the underclock I've gone to 400 core and 400(800) memory for boring desktop stuff.
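
For anyone curious how much the underclock might actually save: dynamic power scales roughly linearly with clock at a fixed voltage (P ≈ C·V²·f), so here's a quick back-of-envelope in Python. The linear-scaling assumption is mine; real savings will be smaller because of static leakage and because nTune isn't dropping the voltage.

```python
# Rough relative dynamic power at each core clock profile, assuming
# power scales linearly with frequency at constant voltage (P ~ C*V^2*f).
# Ignores static leakage, so treat it as an upper bound on the savings.

STOCK_MHZ = 500  # 8800GTS stock core clock, per the post above

def relative_dynamic_power(clock_mhz, stock_mhz=STOCK_MHZ):
    """Dynamic power as a fraction of stock, voltage held constant."""
    return clock_mhz / stock_mhz

profiles = {"overclock": 600, "default": 500, "underclock": 400, "extreme": 300}
for name, mhz in profiles.items():
    print(f"{name:10s} {mhz} MHz -> {relative_dynamic_power(mhz):.0%} of stock dynamic power")
```

So even on this optimistic estimate, the 400MHz profile only trims core dynamic power to about 80% of stock.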

Earlier I forgot to overclock before I loaded up Oblivion. It ran slower, and I realised what I'd done, but then I thought: hang on... it's not that much slower! Considering the core was running a good third slower than my OC, and the memory at less than half, my average FPS wandering around Cheydinhal was 25-30 with everything maxed at 1680x1050, HDR and 2xAA!

So I tried lowering further, to 300 and 300(600), and it still ran really well, just a few FPS slower, occasionally going sub-20. And no, my drivers weren't kicking in with proper 3D settings when the game started; that was disabled, I checked.

With the memory and core of the 8800GTS clocked more than 50% lower than my old 7600GT's, it's still streaming through the game with everything maxed.

It got me thinking, really: these new graphics cards have a lot more to them than just raw core and memory speed, but that's the thing they are sold on, with people paying quite a premium for cards with factory-OC'd core and memory, usually just 25, 50 or 100MHz increments that can't make that much difference.
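
As a rough illustration of why the underclocked GTS still beats the old card: memory bandwidth is bus width times effective data rate, and the GTS's much wider bus does a lot of the work. A sketch in Python; the 320-bit GTS and 128-bit 7600GT bus widths, and the 7600GT's 700(1400) stock memory, are my assumptions rather than figures from this thread.

```python
# Memory bandwidth = (bus width in bytes) x (effective DDR data rate).
# Bus widths assumed: 8800GTS 320-bit, 7600GT 128-bit.

def bandwidth_gb_s(bus_bits, effective_mhz):
    # bytes/transfer * million transfers/sec = MB/s; /1000 -> GB/s
    return bus_bits / 8 * effective_mhz / 1000

print(f"8800GTS underclocked 400(800): {bandwidth_gb_s(320, 800):.1f} GB/s")
print(f"8800GTS stock 800(1600):       {bandwidth_gb_s(320, 1600):.1f} GB/s")
print(f"7600GT stock 700(1400):        {bandwidth_gb_s(128, 1400):.1f} GB/s")
```

On those assumed numbers the GTS at its desktop underclock still has roughly 40% more memory bandwidth than the 7600GT at stock.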

I know this is probably old news really, but it was quite a revelation to me. I guess it's the same as the move to Core 2 Duo heaven from Pentium 4 hell: everything runs quicker, even though it's clocked slower :confused:
 
Ouch! I couldn't play a game with such low FPS, I really notice the lag.

Then again, some people can play ultra laggy and really not mind it... me, I have to play at 60fps + :p

But you're right, graphics cards aren't just clock speeds. It's also about the pipelines, shaders etc. :)
 
Well, normally with the OC I get 30 to 50 fps wandering around the same place, sometimes dipping to 25 during load glitches. Don't forget this is a relatively slow-paced RPG, not a rootin'-tootin' shooter.

Still, it's nice to run Oblivion at those sorts of framerates at that res with all settings maxed. Especially after a year of 18-25fps at 1280x800 with much less than max settings and no AA with the 7600GT ;)

Anything more than 50fps is a waste really, the human eye surely can't really discern the difference after that, though it is fun when I go into a cave tunnel and the fps display shoots up to 150!
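
One way to see the diminishing returns: frame time shrinks non-linearly with framerate, so the jump from 20 to 30fps buys far more milliseconds per frame than the jump from 50 to 150. Quick arithmetic on the framerates mentioned in this thread:

```python
# Time per frame in milliseconds for a given framerate.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (20, 30, 50, 60, 150):
    print(f"{fps:3d} fps = {frame_time_ms(fps):5.1f} ms/frame")
# Going 20 -> 30 fps saves ~16.7 ms per frame;
# going 50 -> 150 fps saves only ~13.3 ms in total.
```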
 
When I underclocked my 8800GTS to have it run cooler on the desktop and to save power, I ran at 150/300; any lower and the screen would go grey and I'd need to turn the PC off and back on.
 
I remember when I underclocked my Voodoo Graphics from 50MHz to 25MHz using that SET SST stuff. Started getting a load of sparklies/artifacts in Carmageddon!
 
I'd guess you'd see a much bigger hit from underclocking the shader units (which are clocked separately from the core/memory).

The GTS absolutely flies when you ramp the shader units up to near-8800GTX speeds (GTS 1200MHz vs GTX 1350MHz); after all, it has 25% fewer units than the GTX, so I'd say the 8800GTS is heavily limited by its shader units at stock speeds.
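
A crude way to put numbers on that: use stream processors times shader clock as a throughput proxy. The SP counts (96 for the GTS, 128 for the GTX) are my assumption; the 1200/1350MHz clocks are the ones quoted above.

```python
# Crude shader throughput proxy: stream processors x shader clock (MHz).
# SP counts assumed: 8800GTS = 96, 8800GTX = 128 (25% fewer on the GTS).

def shader_throughput(sps, shader_mhz):
    return sps * shader_mhz

gtx    = shader_throughput(128, 1350)  # GTX shader domain, per the post
gts    = shader_throughput(96, 1200)   # GTS shader domain at stock
gts_oc = shader_throughput(96, 1350)   # GTS ramped to GTX shader speed

print(f"GTS stock:  {gts / gtx:.0%} of GTX shader throughput")
print(f"GTS @ 1350: {gts_oc / gtx:.0%} of GTX shader throughput")
```

On this proxy the stock GTS sits at about two thirds of the GTX's shader throughput, and ramping the shader clock to 1350MHz closes the gap to the 75% set by the unit count.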
 
Is it true that GTSs overclocked to 700+ core and 2.2GHz memory perform close to a stock-clocked GTX?
 
How do you monitor the memory temperature? I've only seen GPU and ambient temps.

There is no memory temperature monitoring.

ATITool is the best way to find memory problems when overclocking; then try games, and if it's unstable, lower the clock slightly.
 
How do I actually clock the shader units?
I've been using ATITool and am waiting for the signed Vista driver. When I get it I'll try overclocking the shader units to see the effect, but that doesn't seem possible with ATITool at least, so what should I use?
 
I think it has to be done through a BIOS flash: either flash a non-reference (pre-overclocked) GTS BIOS such as the BFG OC2, or dump your current BIOS and edit it with a program called NiBiTor.

At GTX clock speeds a GTS isn't much slower than a GTX, to be honest, unless it's a really shader-heavy game.
 
As you up the core I think the shader clock increases too.

No, I don't think so mate; I'm pretty sure it isn't linked (unless software overclockers increase it in parallel with the core).

When editing the BIOS they are definitely separate; you can use RivaTuner to monitor the core/shader/memory clock speeds.
 