Read this over at XS; it reads like empirical evidence that SLI/CF/X2 cards just don't work properly.
-----------------------
Microstuttering makes all multi-GPU solutions to date, including the 3870X2 and 9800GX2, WORTHLESS.
I did quite a bit of research on this, benching five games on both my rig (G92 SLI) and a friend's (9800 GX2).
Assassin's Creed and World in Conflict aren't affected by alternate frame rendering (AFR) microstutter.
Crysis, Lost Planet and Call of Juarez ABSOLUTELY are affected by it.
I have written a few articles about this, which caused a massive uproar and got me banned from several forums. Here are some frame-time benchmark results from Call of Juarez:
Frame #   Timestamp (ms)   Frame time (ms)   Instantaneous FPS
  48           753               10                 103
  49           769               16                  61
  50           784               15                  67
  51           790                6                 174
  52           814               24                  41
  53           832               19                  54
  54           838                6                 178
  55           859               21                  47
  56           877               18                  57
  57           881                4                 235
  58           906               25                  40
  59           921               15                  65
  60           928                7                 142
What a FRAPS timedemo would show you: 13 frames rendered over a 175 ms span = 75 FPS.
However, the real fluidity is very different from that number. You are basically playing at 50-something FPS, with a frame rendered at 150+ FPS inserted every third frame. Those super-fast third frames mean NOTHING for your gaming experience. They do not make your game more fluid; they make microstutter. In that scene you are playing at, at most, 55 FPS, yet the benchmark tells you that you are playing at 75 FPS.
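If you want to verify the arithmetic, here is a minimal Python sketch using the timestamps from the table above. (The instantaneous FPS values come out slightly different from the table's because the printed timestamps are rounded to whole milliseconds; the pattern is the point.)

# Why the FRAPS-style average hides microstutter.
# Timestamps are the "Timestamp (ms)" column from the table above.
timestamps = [753, 769, 784, 790, 814, 832, 838, 859, 877, 881, 906, 921, 928]

# FRAPS-style average: total frames divided by the total time span.
span_s = (timestamps[-1] - timestamps[0]) / 1000.0   # 175 ms
avg_fps = len(timestamps) / span_s                   # ~74 FPS, the "75 FPS" above

# Per-frame view: gap to the previous frame, and its instantaneous FPS.
frame_times = [b - a for a, b in zip(timestamps, timestamps[1:])]
inst_fps = [round(1000.0 / ft) for ft in frame_times]

print(f"Average: {avg_fps:.0f} FPS")
print("Frame times (ms):", frame_times)   # ~15-25 ms alternating with ~4-7 ms
print("Instantaneous FPS:", inst_fps)     # ~40-67 FPS with ~140-250 FPS spikes
# The slow frames set the perceived fluidity; the fast ones arrive right on
# the heels of the previous frame and only inflate the average.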
This is why you should NEVER, EVER compare single-GPU scores with AFR'ed (SLI/CF) scores. NEVER. And this is also why ALL of the review sites are massively misleading. "A second 8800GT bumped our score from 40 FPS to 65 FPS!" REALLY? But you forgot to mention that the 65 FPS is not comparable to the 40 FPS, didn't you? By benchmark logic it wouldn't even matter if the scenario looked like this:
Frame 1 rendered at T = 0ms
Frame 2 rendered at T = 1ms
Frame 3 rendered at T = 40ms
Frame 4 rendered at T = 41ms
Frame 5 rendered at T = 60ms
Frame 6 rendered at T = 61ms
6 frames rendered in 61 ms, so 100 FPS, huh? And the 20-40 ms gaps where you receive no frames from the cards, which put your real frame rate at something like 50 FPS, should just be ignored, because they don't show up in the benchmark, and the benchmark is all anyone cares about.
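Here is the same two-line calculation applied to this made-up scenario, just to make the gap between the benchmark number and what you actually see explicit:

# The hypothetical six-frame scenario above, in numbers.
timestamps = [0, 1, 40, 41, 60, 61]   # ms

avg_fps = len(timestamps) / ((timestamps[-1] - timestamps[0]) / 1000.0)
gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]

print(f"Benchmark average: {avg_fps:.0f} FPS")   # ~98, i.e. the "100 FPS"
print("Gaps between frames (ms):", gaps)         # [1, 39, 1, 19, 1]
# A frame landing 1 ms after its twin adds nothing you can see; a genuinely
# new image reaches your eyes roughly every 20-40 ms, i.e. more like 25-50 FPS.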
I'm amazed that, even after this microstuttering situation broke out, all the major review sites keep ignoring it and still compare AFR FPS numbers to non-AFR FPS numbers.
DO NOT BUY SLI, DO NOT BUY CROSSFIRE, DO NOT BUY THE 3870X2, DO NOT BUY THE 9800GX2, AND (PROBABLY) DO NOT BUY THE 4870X2. YOU ARE GAINING ABSOLUTELY NOTHING IN AT LEAST HALF OF ALL GAMES. YOU ARE PAYING TWICE FOR A 1.4X INCREASE IN REAL PERFORMANCE IN THE OTHER HALF. THAT DOES NOT MAKE SENSE.
-----------------
Thoughts?