
ForceWare Vista 163.69 BETA

Lost over 1k 3DMarks with the XP drivers compared to the 163.67s (and yes, I did sort my shader clock out).

Went back to the 163.67s after the performance loss... and I've still got the loss. Gah.
 
Unwinder claims the Core/Shader relationship only affects Vista, not XP. Can't say, as I'm not on XP right now; or he may mean his app now works differently in XP and Vista.
 
LOL

Well, I ignored these drivers because of the same static-shader issue as the last beta ones, but as RivaTuner 2.04 came out today I can enter -1 and get my Core/Shader ratio back as before. I'm not going to mess about with it, as I don't have a clue what the limit of the shader is.

According to one guide I read where they flashed the BIOS, the max shader is 1780.

And although that was just the one GTS they had, they found it could clock to that speed stably as well, but the core wouldn't go over 650, which only gives you about 1600 on the shader clock.

I'd be quite interested in altering the ratio, or in the next RivaTuner having a separate slider, as I'm sure I can get more out of my shader core.

They also stated that every % increase in shader clock gave a bigger boost to fps than the same increase in core or memory clock. Obviously with the old beta drivers your shaders went up with your core, which is where most of the fps gain came from. Once the shader stayed static with these (and the last) betas, that's why most people are showing a big drop in benchmarks even if they overclock their core as high as it will go.
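The linked-vs-static behaviour described above can be sketched as a toy calculation. The 1188 MHz stock shader (8800 GTS reference) and the 2.5 ratio are illustrative assumptions, not values from this thread:

```python
def shader_clock(core_mhz, ratio=2.5, linked=True, stock_shader=1188.0):
    """Toy model: with the older (linked) drivers the shader clock
    scales with the core; with the static-shader betas it stays at
    its stock value no matter how far the core is pushed."""
    return core_mhz * ratio if linked else stock_shader

# Same 650 MHz core OC under both driver behaviours:
print(shader_clock(650))                # linked: 1625.0
print(shader_clock(650, linked=False))  # static: 1188.0
```

This is why a big core OC on the static-shader betas still benchmarks worse: the shader domain, which matters most, never moves.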
 
Yeah, I did some testing in 3DMark06 with the previous two sets of drivers; the ones where the shaders were locked scored around 300 points less on my GTS 320.

Would be good to have asynchronous clocks for core, mem and shaders to help us get the max possible performance.
 
Yeah, in the Unigine benchmarks in another thread I did a couple of tests and gained 10-13 fps in the D3D9 and OpenGL benchmarks with a shader OC rather than just a core OC.
 
Been fiddling around with RivaTuner and the shader OC ratios. I've got mine set to 2.60, so with a 25MHz OC on the core my core is at 575MHz and my shader rises to 1512MHz, a shade off what it would be if I had 650MHz on the core.
This resulted in a 1 degree change in core temps, which are sitting at 56C.
Gonna try a benchmark and see what sort of performance this offers, then fiddle around and see what can be done with this.
 
Well, as I set my core to 700MHz to game, it is really 702MHz and the shader is at 1782MHz (same ratio as it was on the older drivers, as I have it set to -1 on these new drivers).
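The clocks quoted above pin down the effective ratio; a quick check (pure arithmetic on the reported values, nothing driver-specific assumed):

```python
# Implied core/shader ratio from the clocks reported above
# (702 MHz core, 1782 MHz shader). This is just the observed
# ratio; the actual divider steps the G80 clock generator
# uses are not known here.
core_mhz = 702
shader_mhz = 1782
ratio = shader_mhz / core_mhz
print(f"effective ratio: {ratio:.3f}")  # ~2.538
```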

So how high can I push the shader? It's kind of a grey area.
 
There isn't a heat issue with it; at full load with mine at the settings above it's touching 72C with the fan at stock, so it's just a case of its own physical limitations. Will it manifest itself sharply and suddenly, or let you know it's there like a normal OC pushed too far?
 
Hopefully the next build of ATITool will let us see artifacts when OC'ing the shader alone (unless it does now), and will work fully in Vista64.
 
Hmm, it does, but when I tested it, instead of throttling back when it discovered artifacting it carried on OC'ing. My desktop locked up and turned to squares, then my screen went black. Needless to say I ripped the power cord from the back and restarted the PC, and thankfully all was well.
You probably guessed it, but I'm a little dubious about using ATITool with my 8800 at the mo.
 