If we take the Arcadia level from BioShock as an example... running at 1920x1200 with all settings maxed and 16x AF, no FSAA.
I just quickly ran around the level with RivaTuner Statistics Server active, so these aren't strictly accurate, but they're good enough to go by...
I don't have a 320 MB 8800 GTS to test, but it's going to struggle at 1920x1200 with max settings due to the lack of VRAM (340 MB was in use on the Arcadia level), so you're going to get some swapping out to system memory with a performance hit.
With the 640 MB 8800 GTS the average framerate was around 25-30 fps.
With an 8800 GTX at stock clocks you get around 40 fps average; when overclocked to Ultra speeds (630 MHz) the average went up to around 45-50 fps.
As a side note, my 7950 GX2 was slightly better than the 8800 GTS, but only by a couple of fps or so.
