
Hybrid Sli 8500GT + 8200 chipset (8600)

Associate · Joined 19 Jan 2009 · Posts: 20
I'm a very light PC gamer, so I'm asking these questions in a fairly general sense :)

Basically I ran 3DMark06 using my 8500GT + 8200 onboard gfx (recognised as an 8600) under GeForce Boost mode. This is supposed to boost the discrete card's general performance by 'up to 30%'. The 3DMark score was 2152 across all the tests it automatically selected.

I then rebooted, switched to the 8500GT only and re-ran 3DMark. It performed the same tests, yet the score was 2441. Although performance was still very poor, this was surprising considering GeForce Boost was supposed to improve gfx performance, not hinder it.

Under the Nvidia control panel the 8600 showed a core clock of 500 and a shader clock of 1200, but no option to tweak the memory clock. With the 8500GT on its own it shows a core clock of 450, shader at 900 and memory at 333. Stranger still, the 3DMark score was higher with the seemingly weaker option of the 8500GT on its own.

What's going on? Would I be better off using the 8500GT solo, or is there something else at play that I'm missing? Also, the 8200 chipset is supposed to support Pixel Shader 4.0, but in GeForce Boost mode it only shows 3.0 capability, yet it retains most other specs like DX10 compatibility. Lastly, in Boost mode the gfx card is reported as an 8600. The problem is that it isn't recognised by anything other than Nvidia products for tweaking or gaming: RivaTuner, for example, doesn't know what it is, and games report an 'unrecognised card' and default to basic/medium settings.
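For what it's worth, those two runs put Boost mode meaningfully behind the solo card. A quick sanity check on the two scores quoted above:

```python
# Relative drop in 3DMark06 score going from solo 8500GT to Boost mode,
# using the two scores reported in the thread.
solo_score = 2441   # 8500GT alone
boost_score = 2152  # GeForce Boost ("8600") mode

drop_pct = (solo_score - boost_score) / solo_score * 100
print(f"Boost mode scored {drop_pct:.1f}% lower than the solo card")
```

So 'Boost' mode is costing roughly 12% here rather than adding the advertised 'up to 30%'.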
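For a rough sense of why the lower score is surprising, the theoretical shader throughput at the two reported clock configurations can be compared. This is only a sketch: the stream-processor counts (16 for the 8500GT, 8 for the 8200 mGPU) and the 3-flops-per-clock factor are typical G8x figures I'm assuming, not numbers from this thread.

```python
# Rough theoretical shader throughput (GFLOPS) at the clocks reported above.
# Assumed: 8500GT has 16 stream processors, the 8200 mGPU has 8, and each
# SP issues ~3 flops/clock (MADD + MUL) -- typical figures for G8x parts.
def shader_gflops(sp_count, shader_mhz, flops_per_clock=3):
    return sp_count * shader_mhz * flops_per_clock / 1000

solo = shader_gflops(16, 900)    # 8500GT alone: 450/900 clocks
boost = shader_gflops(16, 1200)  # "8600" Boost mode: 500/1200 clocks
print(f"8500GT solo: {solo:.1f} GFLOPS")
print(f"Boost mode:  {boost:.1f} GFLOPS")
```

On raw shader numbers Boost mode should be ahead, so the lower 3DMark score points at something else, e.g. the 8200's slow shared system memory or the overhead of splitting work between mismatched GPUs.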
 
Try OCing the 8500GT instead... both of the ones I've had could do over 60% increase on the core - scoring well over 3000 in 06.
 
Try OCing the 8500GT instead... both of the ones I've had could do over 60% increase on the core - scoring well over 3000 in 06.

I've been trying gradual overclocks with the card on its own and did indeed get over 3000 on my last test, with the GPU at 52°C.

I switched back to the dual setup and got only 2100-ish. I went back into the Nvidia control panel, then Performance. This time I noticed a drop-down box that says '8600 (8200+8500)', with a second option of '8600 (8500+8200)', as if the second option prioritised the 8500 over the 8200. Choosing the second option opens up overclocking options for the 8500 instead of the 8200's, but nothing seems to make any difference. I can't be 100% sure, but it seems that GeForce Boost mode locks the clocks somehow.

For now I'm going back to the 8500GT on its own because it does what I want it to, but I'm seriously stumped/annoyed about the promised 'Boost' mode. Maybe 3DMark just doesn't recognise the dual setup and defaults to the 8200, which might explain the low scores. As one last stab I'm going to remove the 8500GT and run a quick 8200-only test to see what it throws up.

/rant
 
The 8200 GPU seems to offer only about 40% of an 8500GT's performance, with hideously slow memory, so I'd imagine most of the performance is lost because it just can't keep up. Couple it with an 8400 and you might see bigger gains over the add-in card - but even then it would be slower than an 8500GT on its own.

I'd go with OCing the 8500GT. After unlinking the shader/core on mine, I got over 760 on the core and around 1500 on the shaders, IIRC.
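Those figures work out to a hefty overclock relative to stock. A quick back-of-the-envelope check, taking stock as the 450 core / 900 shader clocks quoted earlier in the thread:

```python
# Percentage overclock relative to the stock 8500GT clocks quoted
# earlier in the thread (450 core / 900 shader).
stock = {"core": 450, "shader": 900}
oc = {"core": 760, "shader": 1500}

gains = {d: (oc[d] - stock[d]) / stock[d] * 100 for d in stock}
for d, g in gains.items():
    print(f"{d}: {stock[d]} -> {oc[d]} MHz (+{g:.0f}%)")
```

That's roughly a two-thirds increase on both domains, which squares with the 'over 60% on the core' claim earlier in the thread.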

Mine had nice fans/heatsinks on them tho, which meant the additional overclocking made zero difference to temperatures.

EDIT: Have a pic of one of them here actually: http://aten-hosted.com/images/DSC00264S.jpg
 
I don't trust the joint performance at all now. I've gone with your advice and stuck with the solo 8500GT. I managed to get 3099 in 3DMark with a decent overclock, almost a full 1000 higher than I used to have. Oblivion runs very nicely now :)
 