
Gainward's Expertool Now Clocks Shaders :D

I'm guessing that overclocking the core, shader or memory will all increase heat, but which is the most important to overclock? Would sacrificing some shader speed for some more core speed be good, or is it best to overclock all three evenly?

That's exactly what I'm wondering :p
 
Does this save your settings so they're the default at startup, or not? I've found my GTS IMPOSSIBLE to overclock in Vista and keep stable. Will a higher shader clock give a large performance boost?


Yes it does.

First attempt:

Gainward Bliss 8800GTX (Stock Speeds = GPU 575MHz, RAM 1800MHz, Shaders 1350MHz)

Overclocked:

GPU Core: 600MHz
Memory: 2000MHz
Shader: 1500MHz

3DMark 06 Result:

http://service.futuremark.com/orb/resultanalyzer.jsp?projectType=14&XLID=0&UID=12206623

I used to get around 1250 dead, so it's gone up by around 900 marks.

Just a note: I dropped my Q6600 (G0) from its overclocked 3.6GHz (450 x 8) back to stock to make comparisons, so the 3DMark score will be much higher once I clock the CPU back up :)
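
Quick back-of-the-envelope on those clocks, for anyone who wants the percentages (just Python arithmetic using the stock and overclocked figures above, nothing Expertool-specific):

```python
# Percentage overclock per clock domain for the Bliss 8800GTX figures above.
stock = {"core": 575, "memory": 1800, "shader": 1350}   # MHz, stock speeds
oc    = {"core": 600, "memory": 2000, "shader": 1500}   # MHz, overclocked

for domain in stock:
    gain = (oc[domain] / stock[domain] - 1) * 100
    print(f"{domain:>6}: {stock[domain]} -> {oc[domain]} MHz (+{gain:.1f}%)")
```

So that's roughly +4.3% core and +11.1% on both memory and shaders.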
 
I'm guessing that overclocking the core, shader or memory will all increase heat, but which is the most important to overclock? Would sacrificing some shader speed for some more core speed be good, or is it best to overclock all three evenly?

The release notes say that the new Expertool has overheating protection alarms or something like that:

http://www.gainward.com/news/news_detail.php?news_id=13

Should be fine up to about 90 degrees anyway, mate. Take a look at the fan control tab in Expertool and set it to a fixed 100% speed if you're worried :)

In other words......clock 'em all!!!! :D:D
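
If you'd rather run your own overheat alarm than rely on the tool's, here's a rough sketch of the idea in Python. It polls nvidia-smi, which modern NVIDIA drivers ship but 8800-era drivers didn't, so treat it as the shape of the thing rather than what Expertool actually does:

```python
# Minimal overheat-alarm sketch: poll the GPU core temperature and shout
# when it crosses a threshold. Assumes a driver that ships nvidia-smi
# (modern cards; the 8800 era predates it, so adapt to your own tool).
import subprocess
import time

THRESHOLD_C = 90     # the "fine up to about 90 degrees" figure above
POLL_SECONDS = 5

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    temp = int(out.stdout.strip().splitlines()[0])   # first GPU only
    if temp >= THRESHOLD_C:
        print(f"WARNING: GPU at {temp}C - back the clocks off!")
    time.sleep(POLL_SECONDS)
```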
 
Yep, what Dave said.

Gone from 500 core, 1350 shader and 1600 memory
to
670 core, 1675 shader and 2000 memory. Fan set to 69% (70% becomes audible); temps are mid-60s when under stress.
 
I believe shaders are the best, followed by core, then memory.

This depends on the game, I think; however, in newer games where the shaders are a lot more complex, it's best to do these first. I know that for me the shaders are the main limiting factor in Crysis, and this will probably be true for other games as well.
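
A rough way to check whether a game really is shader-bound: run the same benchmark at two shader clocks with core and memory fixed, then see how much of the clock gain shows up as an FPS gain. A quick Python sketch; the FPS numbers below are placeholders, not real measurements:

```python
# Rough shader-bound check: benchmark at two shader clocks (core/memory
# fixed) and compare scaling. Substitute your own measured FPS values.
def scaling(clock_a, fps_a, clock_b, fps_b):
    """Fraction of the clock increase that showed up as an FPS increase."""
    clock_gain = clock_b / clock_a - 1
    fps_gain = fps_b / fps_a - 1
    return fps_gain / clock_gain

# Example: shaders 1350 -> 1500 MHz (+11.1%), placeholder FPS values.
s = scaling(1350, 30.0, 1500, 32.8)
print(f"Scaling factor: {s:.2f}")   # near 1.0 = shader-bound,
                                    # near 0.0 = bottleneck elsewhere
```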
 
Another useful tool for you, seeing as everyone reading this is now hell-bent on clocking the nuts off their GFX card: :D

[GPU-Z screenshot]


Link:

http://www.techpowerup.com/downloads/843d/GPU-Z.0.1.1.exe

:D
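
If you turn on GPU-Z's sensor logging while you game, you can summarise the log afterwards with a few lines of Python. The file name and column header below are assumptions - they vary between GPU-Z versions, so check your own log's header row:

```python
# Summarise a GPU-Z sensor log (written by its "Log to file" option as
# comma-separated text). Headers vary between versions, so TEMP_COLUMN
# is an assumption - match it to your own log's header row.
import csv

LOG_FILE = "GPU-Z Sensor Log.txt"     # adjust to your log's actual name
TEMP_COLUMN = "GPU Temperature [C]"   # assumption: check your header row

temps = []
with open(LOG_FILE, newline="") as f:
    for row in csv.DictReader(f, skipinitialspace=True):
        value = (row.get(TEMP_COLUMN) or "").strip()
        if value:
            temps.append(float(value))

if temps:
    print(f"{len(temps)} samples, min {min(temps):.0f}C, max {max(temps):.0f}C")
else:
    print("No samples found - check TEMP_COLUMN against the log header.")
```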
 
All very interesting... but what I want to see are some benchies for shader overclock vs core overclock, and the effect on heat that each of them has.

Crysis seems the perfect tool for this test.

If no one has time, I'll do it when I get home later in the week. :rolleyes:
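
Here's roughly the test matrix I have in mind, as a sketch: hold one clock at stock while stepping the other, then fill in FPS and peak temp for each run. The step values are just examples, not recommendations:

```python
# Sketch of the shader-vs-core test matrix: step one clock while holding
# the other at stock, and record FPS plus peak temp for each run.
import csv

STOCK = {"core": 575, "shader": 1350}          # MHz, GTX stock speeds
CORE_STEPS   = [575, 600, 625, 650]            # example steps only
SHADER_STEPS = [1350, 1425, 1500, 1575]

runs = [{"core": c, "shader": STOCK["shader"]} for c in CORE_STEPS] + \
       [{"core": STOCK["core"], "shader": s} for s in SHADER_STEPS]

with open("crysis_runs.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["core", "shader", "fps",
                                           "peak_temp_c"])
    writer.writeheader()
    for run in runs:
        # fps and peak_temp_c get filled in by hand after each bench run
        writer.writerow({**run, "fps": "", "peak_temp_c": ""})
print(f"Wrote {len(runs)} runs to crysis_runs.csv")
```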
 
Yep, what Dave said.

Gone from 500 core, 1350 shader and 1600 memory
to
670 core, 1675 shader and 2000 memory. Fan set to 69% (70% becomes audible); temps are mid-60s when under stress.

670 on the core!?!? Blimey, what have you done to the card to manage that (if anything)?

Matthew
 
Nope, nothing, Scougar. It idles at 58C and only goes up to about 65C after a few hours of gaming (using RivaTuner to follow the temps). Fan is at 69% fixed, stock cooling. Just a plain old EVGA GTS 320MB version (not pre-overclocked).
 
My GTX gets about 650/1000 stable; I can't remember the shader clock. Is it better to keep the shaders linked with the core or raise the shader clock on its own?
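
On the linked-vs-separate question, one starting point is to keep the stock shader:core ratio when you raise the core. Quick arithmetic using the stock GTX figures from earlier in the thread (the target core clocks are just examples):

```python
# Keep the shaders moving with the core at the stock ratio.
# Stock GTX figures from earlier in the thread: 575 core, 1350 shader.
STOCK_CORE, STOCK_SHADER = 575, 1350          # MHz
ratio = STOCK_SHADER / STOCK_CORE             # ~2.35

for core in (600, 625, 650):                  # example target core clocks
    print(f"core {core} MHz -> linked shader ~{core * ratio:.0f} MHz")
```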
 
I wonder what the fascination is with hardware manufacturers attaching some ugly skin to the program so it looks "space age", instead of realising that 90% of the people who use these programs would rather have something that looks like a bog-standard Windows program and is bug-free. It's like "EasyTune" from Gigabyte and the "uGuru" software from Abit - I'm sure they aren't that bad, but they look ridiculous.
 
I've been using this for a few weeks now and it's great. It seems to get me more stable clocks than RivaTuner or ATITool.

My GTS 640 is running @ 670/1560/2000, idles at about 58C and hits 76C under load.

I have two gripes with this tool, though.

1) It won't let me clock any higher. The sliders are maxed out at the above settings, but I reckon I can get more out of the shaders and memory.

2) It won't save my settings and load them at startup. The shader and memory speeds will, but the core clock won't - it always defaults back to 513 and has to be set again once Windows has loaded. :confused:
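
Until gripe 2 gets fixed, one workaround idea is a startup check that warns you when the core clock has fallen back to the 513 default. The sketch below queries nvidia-smi, which modern drivers ship but 8800-era ones didn't, so it's the shape of the fix rather than a drop-in:

```python
# Startup sanity check: warn if the core clock has fallen back to the
# 513 MHz default so you know to re-apply your profile. Uses nvidia-smi
# (modern drivers only; 8800-era drivers didn't ship it).
import subprocess

EXPECTED_CORE_MHZ = 670   # the clock set in Expertool
DEFAULT_CORE_MHZ = 513    # what it keeps falling back to

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=clocks.max.graphics",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
)
core = int(out.stdout.strip().splitlines()[0])   # first GPU only
if core <= DEFAULT_CORE_MHZ:
    print(f"Core back at {core} MHz - re-apply the {EXPECTED_CORE_MHZ} MHz profile!")
else:
    print(f"Core clock OK at {core} MHz.")
```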
 
My XFX 8800GTS 320 gets 665/990/1620 stable.

Stock cooling, stock volts, stock card.

Just sold it on the bay though. :)
 
I've been using this for a few weeks already :P

Mine is on
Core clock: 630MHz
Memory clock: 1950MHz
Shader clock: 1500MHz

Pretty nice if you ask me :) Stock cooling
 
Oh God, not again :p..

[Repost meme image]


;)


1.21 Jigawatts!!!!!!!!!!!!!

lol.

I know it's been posted before - I just thought it would be relevant to this thread and, more to the point, useful.

Never understood the slatings for reposts, though. Even with search it can take days to make sure no one's posted something before, and it also assumes that absolutely everybody either knows everything or has read every single post ever. :D

And don't even start me on "**THE OFFICIAL**" threads :D
 
Mmm, mine's coming up with a "this is not the correct display card" error. Would running dual monitors have anything to do with this? Really wanna get this working. I'm using Vista 64-bit also. Any ideas?

Anyway, back to topic....:D

My GTX is a Gainward, but I've heard that Expertool works with most cards. Try unplugging/disabling one of your monitors.

It definitely is not a 64-bit issue.

Some Asus cards are reported not to work with Expertool, but that's all I can think of, mate. :)

Can anyone help Fullhouse out?;)
 