
Shader clock not sticking (BIOS overclock)

Hi folks,

I've got a GTS 250 that I've overclocked, with the new clocks saved into the BIOS. The problem is that even though the overclock is saved, it still appears to revert to a lower setting. Click here to see an illustration of my problem.

Of the several models offered by Gigabyte, I own this one.

I've managed to increase the clocks from the original values:
Core Clock: 675 → 770
Shader Clock: 1000 → 1150
Memory Clock: 1620 → 1850 (this one won't stick despite being a safe value: I set 1850, but it reverts to 1836 in the BIOS)

Has anyone had any experience with the Shader clock being locked? Is this a safety or security mode?
 
Anyone?

I've been using the NiBiTor + NVFlash combo. Something interesting as well: GPU-Z has detected my card as 65nm, when I believe it's meant to be 55nm. I thought the GTS 250s were based on the 9800 GTX+, which is 55nm, not the 9800 GTX (65nm).

It would make sense, though, because my stock clocks are VERY similar to those of a 9800 GTX.

Could it be that the range offered by Gigabyte (there are 3 cards in the 1GB GTS 250 range) is a mixture: a 9800 GTX for the lower-end GTS 250 (the one I own), while the Zalman-cooled and OC versions are based on the 9800 GTX+?
 
I think I know why I am unable to obtain a higher overclock.

It seems I have been sold a GTS 250 which is actually a rebadged 65nm 9800 GTX, not the 55nm 9800 GTX+. Some deeper searching turns up that I may not be the only one, which leads me to ask: if you own a GTS 250, can you please post whether your card uses a 55nm or a 65nm fab process?

Techgage are saying GPU-Z doesn't read the fab process correctly. I have my doubts, and here's why: GPU-Z has been updated since that article, so the version I ran should correctly detect whether my card is 65nm or 55nm.

Furthermore, my stock clocks correspond perfectly to those of a 65nm 9800 GTX rather than the 55nm 9800 GTX+ that my GTS 250 is supposed to be. I have doubts about whether all GTS 250 graphics cards are 55nm. There are even some ASUS GTS 250 cards which need 2 PCIe connectors, as shown here.

I thought one of the requirements for a GTS 250 was that it only needs 1 PCIe power connector, yet this one clearly requires 2. So much for the smaller, leaner, more power-efficient GTS 250.

I thought my GTS 250 would be a 55nm 9800 GTX+, but it seems it's actually an older, lower-spec part: a 65nm 9800 GTX.
 
Thanks for your contribution to the thread. In regards to voltage, I noticed my card runs at 1.15V as opposed to the normal 1.2V (ASUS cards run at 1.3V). Deep down I'm happy with the card because I got it for a great price and it's a brand name with a long warranty behind it. I just wish it could overclock to match the others in the range.

I noticed some of the "Green" versions of the GTS 250 too.

In regards to the set ratio, can you provide more info on that? The reason I ask is that 1836 seems to be the max some cards are allowed to run at. Is this perhaps a safety setting? Looking at the original stock clocks and comparing them to what I've got, I suppose it's not so bad. I just wanted more.
 
No problem, had a lot of info on the subject as I was optimizing my GTS 250 E-Green a few weeks ago.

I've never heard of the shader being limited via the BIOS, but it does only go up in steps of 54MHz, and if you set a value in between these fixed steps, it will round up or down to the nearest one. So if you want higher than 1836 (which I doubt will give you any performance gain) you'd need to set 1836 + 54 = 1890. Try that.
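That rounding behaviour can be sketched in a few lines. This is just an illustration of the idea described above (shader clocks snapping to the nearest 54MHz step); the helper name `snap_shader_clock` is made up, and the "nearest multiple of 54" rule is an assumption based on observed behaviour, not anything documented by NVIDIA:

```python
STEP = 54  # MHz; observed shader clock step size on G92 cards

def snap_shader_clock(requested_mhz: int) -> int:
    """Round a requested shader clock to the nearest 54 MHz step."""
    return round(requested_mhz / STEP) * STEP

# 1850 sits between the 1836 and 1890 steps and rounds down,
# which would explain a setting of 1850 "reverting" to 1836.
print(snap_shader_clock(1850))  # 1836
print(snap_shader_clock(1870))  # 1890
```

If this model is right, any value you type between 1810 and 1862 will end up back at 1836, which matches the "won't stick" symptom in the first post.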

The core clock does limit the range of what the shader will do. Increasing the core increases the range you can set the shaders to, I think, but your core is already quite high at 775, so it shouldn't limit your shader.

Ideally, if your card can do 738/1836/1150, I'd be happy with that; it's almost as fast as a 9800 GTX+. Pushing too hard might be bad for the board: it won't be designed to run at the same overclock level as the 9800 GTX+, which had beefier voltage regulation, a larger PCB and two PCI-E sockets. It's possible the shader clock is limited for this reason, to prevent damage, but usually you just get crashing if the card can't maintain voltages.

Personally, I'd run it at 725/1782/1150, as I think the core and shader will be memory-limited anyway, but that's just a guess going by what I found.

Firstly I want to commend you on your post.


You know, normally I'd be happy with a card like this. Yes, I did get it to overclock and the overclock is good, but I want something GREAT. With Gigabyte offering a 3-year warranty as opposed to a generic brand's 1-year warranty, I'm happy to take the risk.

Looking at my card, it has a chunky heatsink and a fairly good fan, so I'm confident cooling will be no issue. The components look fairly solid too. I'm going to up the voltage from 1.15V to 1.2V, increase the clocks a little, see whether I get any performance gain, and report back.

I remember reading a fair bit about the green card (Palit, Gainward or Sparkle?) but I'm not sure why anyone would want it. If you wanted something for an HTPC, wouldn't you prefer a fanless solution which uses less power, like a GT 220?
 
Okay, some news. You are right about the 54MHz shader increments. I have been getting some minor artifacting when I increase the shader clock from 1836 to 1890. I'm not sure if increasing the voltage will prevent that, as I was always of the belief that increasing the voltage only helped with the core clock, not the shader clock.

So, much like overclocking a CPU, all the clocks here must work in harmony. I guess this is why the shaders are linked to the core clock, right?
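The core-to-shader linkage mentioned above can be sketched as a ratio. This is a rough model, not how the hardware is documented to work: it assumes the shader domain tracks the core at roughly the stock GTS 250 ratio (1836 / 738 ≈ 2.49) and still snaps to 54MHz steps, and the helper name `linked_shader_clock` is made up for illustration:

```python
STEP = 54  # MHz; observed shader clock step size
STOCK_RATIO = 1836 / 738  # ~2.49 shader:core on a stock GTS 250

def linked_shader_clock(core_mhz: float) -> int:
    """Estimate the shader clock a given core clock would imply,
    snapped to the nearest 54 MHz step (assumed behaviour)."""
    return round(core_mhz * STOCK_RATIO / STEP) * STEP

print(linked_shader_clock(738))  # 1836 (stock core gives stock shader)
print(linked_shader_clock(770))  # 1890
```

Under this assumption, raising the core from 738 to 770 would naturally pull the shader up one step from 1836 to 1890, which fits the "clocks in harmony" intuition.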

But yeah, I echo your thoughts on the G92. This is actually my spare PC, but I love the G92; it's such a great card, even now. It evolved from its original card into later, more efficient revisions. It copped a lot of flak, unfairly so, because this was basically a high-end 8800 series card at an 8600 price which could run on a 450W PSU.

The 8800 series is one of the best-supported card lines, and I've never run into any driver issues. Loved the card. Same with my CPU, which is based on a Wolfdale. The reason I bought these parts is that they're based on more premium parts but are dirt cheap and can be overclocked to increase performance.

So as of right now, I'm increasing the voltage from 1.15V to 1.2V and seeing if I can get more stable clocks without artifacting.

EDIT:

Results are in, and I've been able to match the higher clocks seen online (GTX+ territory). Temps have shot up but are still safe (maxing out at 80°C). That's an increase of 500+ 3DMarks over my previous result, and 2015 3DMarks compared to the stock card settings. The voltage increase may have helped, but the key factor was probably balancing the clock settings so they all complement each other. Thanks so much for your advice; it's greatly appreciated.

Just how did you know about the 54MHz incremental brackets?
 
No problem, and thanks to you for getting back with your results; so many posters never bother returning to a thread with their conclusions! :)

My card had an app called vtune, which allowed me to play with the clocks in Windows on the fly. Very useful, as I could find the optimal settings quite quickly before flashing the BIOS.

In vtune, I simply noticed it would only jump up in 54MHz increments on the shader. :D

Nice, I'd never heard of vtune. I used Afterburner + FurMark to find what was suitable for me. I might start folding on my machine; I just hope my power bills don't go through the roof.

Heads up if you haven't heard of Afterburner: it also lets you adjust clocks on the fly in Windows, and it seems compatible with ATI and nVidia solutions from all card manufacturers.
 