I'm a very light PC gamer, so I'm just asking these questions in a more general sense.
Basically, I ran 3DMark06 using my 8500GT + 8200 onboard graphics (recognised as an 8600) under GeForce Boost mode. This is supposed to boost the general performance of the discrete card by 'up to 30%'. The 3DMark score was 2152 across all the tests it automatically selected.
I then rebooted, switched to the 8500GT only, and re-ran 3DMark. It performed the same tests, yet the score was 2441. The performance was still very poor, but the result was surprising considering that GeForce Boost is supposed to improve graphics performance, not hinder it.
In the Nvidia Control Panel, the '8600' showed a core clock of 500 MHz and a shader clock of 1200 MHz, but no option to tweak the memory clock. With the 8500GT on its own, it shows a core clock of 450 MHz, shader at 900 MHz and memory at 333 MHz. Stranger still, the 3DMark score was higher with the seemingly weaker option of the 8500GT on its own.
What's going on? Would I just be better off using the 8500GT solo, or is there something else at play that I'm missing? Also, the 8200 chipset is supposed to support Pixel Shader 4.0, but when it's in GeForce Boost mode it only shows 3.0 capability, yet it retains most other specs like DX10 compatibility. Lastly, in Boost mode the graphics card is shown as an 8600. The problem is that it isn't recognised by anything other than Nvidia products as far as tweaking or gaming go. RivaTuner, for example, doesn't know what it is, and games report 'unrecognised card' and default to basic/medium settings.
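In case it helps anyone diagnose the renaming issue, here's a minimal sketch (assuming Python on Windows with the third-party 'wmi' package installed) that prints the adapter name and PCI device ID that Windows itself reports, which can be compared against what the Nvidia Control Panel, RivaTuner and games claim to see:

```python
# Minimal sketch: list the display adapter(s) as Windows reports them.
# Assumes Python on Windows with the third-party 'wmi' package
# (pip install wmi). Not an official Nvidia tool, just a quick check.
import wmi

c = wmi.WMI()
for gpu in c.Win32_VideoController():
    print("Name:          ", gpu.Name)          # e.g. the Boost-mode '8600' vs the plain 8500 GT
    print("PNP device ID: ", gpu.PNPDeviceID)   # contains the PCI vendor/device IDs games match against
    print("Driver version:", gpu.DriverVersion)
    print("-" * 40)
```

If the PNP device ID changes between Boost mode and 8500GT-only mode, that would explain why third-party tools and games fail to recognise the card.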
