It clocks pro rata with the core speed: increase the core by 10% and the shader clock will increase by 10%.
That is not strictly true, though, since on 8800 cards both the core and shader clocks jump in discrete steps.
See the following thread:
http://forums.overclockers.co.uk/showthread.php?t=17756439&highlight=8800+shader+clock
And here are my own results:
Setting in ATITool (MHz)   Actual core (MHz)   Actual shader (MHz)
623                        621                 1539
624                        626                 1539
627                        626                 1548
630                        630                 1548
631                        630                 1548
632                        630                 1555
633                        632                 1555
637                        634                 1566
640                        637                 1577
There is some kind of trend going on. Sometimes increasing the core setting has no effect on the actual core or shader speed; when it does take effect, the shader speed jumps in steps of 7, 9 or 11 MHz. It also seems that not every core speed setting is achievable.
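If you want to poke at the numbers yourself, here is a minimal Python sketch that replays my table above and prints the step between consecutive rows. The readings are just my measurements typed in; the script doesn't touch the card.

# Replay the measurements above and print the step sizes, to make the
# quantisation easier to see. Nothing here is read from the hardware.

# (requested core setting, actual core MHz, actual shader MHz)
readings = [
    (623, 621, 1539),
    (624, 626, 1539),
    (627, 626, 1548),
    (630, 630, 1548),
    (631, 630, 1548),
    (632, 630, 1555),
    (633, 632, 1555),
    (637, 634, 1566),
    (640, 637, 1577),
]

for (s0, c0, sh0), (s1, c1, sh1) in zip(readings, readings[1:]):
    print(f"setting {s0}->{s1}: core {c1 - c0:+d} MHz, shader {sh1 - sh0:+d} MHz")

Running it shows the shader deltas come out as 0, 9, 0, 0, 7, 0, 11, 11 MHz, which is where the 7/9/11 observation comes from.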
And more details with an 8800 GTS here:
http://www.xbitlabs.com/articles/video/display/msi8800gts-640_7.html
I'm not sure whether I get different results because mine is pre-overclocked and may run a different shader/core ratio than the card at xbitlabs. Overall, though, it is best to use whatever tool you like to alter the core clock (ATITool, RivaTuner, the NVIDIA Control Panel, etc.) but use another program to report what you actually get in real life.
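As a sketch of the "set with one tool, verify with another" idea: on a modern card and driver you could read the clocks back with nvidia-smi, something like the snippet below. 8800-era cards predate these query options, so back then you'd watch the clocks in RivaTuner's hardware monitor or GPU-Z instead; the principle is the same.

# Ask nvidia-smi what the GPU is really running at, instead of trusting
# the overclocking tool's slider. Assumes nvidia-smi is installed and
# the driver supports clock queries (modern cards only).
import subprocess

out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=clocks.gr,clocks.sm,clocks.mem",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print("graphics, shader (SM), memory clocks:", out.stdout.strip())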
Also, you can alter the shader clock separately from the core in the BIOS, but once done, the shader speed will still change roughly in proportion to the core, following the rules above, for which nobody seems to have come up with an exact formula yet.
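For what it's worth, here is a tiny sketch of why a single formula is so elusive: divide each measured shader speed by the measured core speed and the implied ratio drifts around instead of staying fixed. The pairs are taken straight from my table above.

# Why a simple shader = core * ratio formula doesn't quite work: the
# implied ratio drifts from row to row of my measurements. A true
# fixed-ratio card would print the same number every time.
readings = [(621, 1539), (626, 1539), (626, 1548), (630, 1548),
            (630, 1555), (632, 1555), (634, 1566), (637, 1577)]

for core, shader in readings:
    print(f"core {core} MHz -> shader {shader} MHz, ratio {shader / core:.4f}")

On my numbers the ratio wanders between roughly 2.457 and 2.478, which is "in proportion (sort of)" but clearly not a fixed multiplier.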