Multi GPU solutions worthless?

Associate
Joined
16 Nov 2006
Posts
753
Read this over at XS; it's empirical evidence that SLI/CF/X2 cards just don't work properly.

-----------------------

Microstuttering makes all multi GPU solutions to date, including 3870x2 and 9800GX2, WORTHLESS.

I did quite a bit of research on this, using both my scores (G92 SLI) and a friend's (9800 GX2), with five games benched.

Assassin's Creed and World in Conflict aren't affected by asynchronous frame rendering (microstutter)
Crysis, Lost Planet and Call of Juarez are ABSOLUTELY affected by asynchronous frame rendering.

I have written a few articles about this which caused some massive uproar and got me banned from several forums. Here are some frame-time benchmark results from Call of Juarez.

Frame # / Time frame is rendered (ms) / Frame render time (ms) / Momentary FPS

48 753 10 103
49 769 16 61
50 784 15 67
51 790 6 174
52 814 24 41
53 832 19 54
54 838 6 178
55 859 21 47
56 877 18 57
57 881 4 235
58 906 25 40
59 921 15 65
60 928 7 142

What a FRAPS timedemo would show you: 13 frames rendered in 175 ms = ~75 FPS

However, the real fluidity is very different from this. You are basically playing at 50-something FPS, with frames rendered at 150+ FPS being inserted every third frame. Those every-third frames with super-high momentary FPS mean NOTHING for your gaming experience. They do not make your game more fluid; they make micro-stutter. In that scene you are playing at, at most, 55 FPS, yet what the benchies show you is that you are playing at 75 FPS.
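
Here's a rough sketch of that arithmetic (Python, purely to show the working; the timestamps are the ones from the table above):

Code:
# Frame timestamps (ms) taken from the Call of Juarez table above (frames 48-60).
timestamps = [753, 769, 784, 790, 814, 832, 838, 859, 877, 881, 906, 921, 928]

# What a FRAPS-style timedemo reports: frames rendered divided by elapsed time.
elapsed_ms = timestamps[-1] - timestamps[0]           # 175 ms
avg_fps = len(timestamps) / elapsed_ms * 1000.0       # ~74 FPS

# What you actually feel: the gap between consecutive frames.
frame_times = [b - a for a, b in zip(timestamps, timestamps[1:])]
momentary_fps = [1000.0 / ft for ft in frame_times]
slow_frames = [f for f in momentary_fps if f < 60]

print(f"FRAPS-style average: {avg_fps:.0f} FPS")
print(f"Momentary FPS per frame: {[round(f) for f in momentary_fps]}")
print(f"{len(slow_frames)} of {len(momentary_fps)} frame gaps are below 60 FPS")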

This is why you should NEVER, EVER compare single-GPU scores with AFR'd (SLI/CF) scores. NEVER. And this is also why ALL of the review sites are massively misleading. "A second 8800GT bumped our score from 40 FPS to 65 FPS!!" REALLY? But you forget to mention that the 65 FPS is not comparable to that 40 FPS, don't you? And it wouldn't matter even if the scenario were like this, would it:

Frame 1 rendered at T = 0ms
Frame 2 rendered at T = 1ms
Frame 3 rendered at T = 40ms
Frame 4 rendered at T = 41ms
Frame 5 rendered at T = 60ms
Frame 6 rendered at T = 61ms

6 frames rendered in 61 ms, so 100 FPS, huh? Those 20-40 ms gaps where you receive no frames from the cards, which put your real FPS at something like 50 FPS, should apparently be ignored because they don't show up in the benchmark, and all you care about are the benchmarks.
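
The same quick sketch (Python again, using the made-up timestamps from the scenario above) shows how far apart the "benchmark FPS" and the worst gap really are:

Code:
# Hypothetical frame timestamps (ms) from the scenario above.
timestamps = [0, 1, 40, 41, 60, 61]

elapsed_ms = timestamps[-1] - timestamps[0]             # 61 ms
benchmark_fps = len(timestamps) / elapsed_ms * 1000.0   # ~98 FPS "on paper"

gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]   # [1, 39, 1, 19, 1]
worst_gap_fps = 1000.0 / max(gaps)                           # ~26 FPS during the longest stall

print(f"Benchmark says: {benchmark_fps:.0f} FPS")
print(f"Frame-to-frame gaps (ms): {gaps}")
print(f"During the worst gap you are effectively at {worst_gap_fps:.0f} FPS")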

I'm amazed that, even after this microstuttering situation broke out, all the major review sites are ignoring it and still comparing AFR FPS numbers to non-AFR FPS numbers.

DO NOT BUY SLI, DO NOT BUY CROSSFIRE, DO NOT BUY 3870X2, DO NOT BUY 9800GX2, (PROBABLY) DO NOT BUY 4870x2. YOU ARE GAINING ABSOLUTELY NOTHING IN AT LEAST HALF OF ALL GAMES. YOU ARE PAYING TWICE FOR A 1.4X INCREASE IN REAL PERFORMANCE IN HALF OF GAMES. THAT DOES NOT MAKE SENSE.

-----------------
Thoughts? :D
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
What frame numbers did you come up with from a single card, then? Did it ever occur to you that, because you chose three of the toughest, most demanding games, they can't always keep up a maximum framerate, and that if you compared the same numbers from a single card you might see the same situation, just with the minimum and maximum numbers dramatically lower? What exactly makes you instantly think this same issue doesn't occur with single cards? In fact, it's fairly silly to claim that a single card gives a completely stable framerate all the time; it too will have at least some degree of ups and downs averaged into the framerate you see at the end.

If you want to do a mini review, fine, but you have no comparative evidence here; you have one side of the story and not the other.

Now frankly, there's much more to this. Some games are badly coded, some aren't; so what, that's life. Some will work better with SLI/Crossfire, some won't, and anyone who didn't know that before buying a multi-GPU setup should never have bought one in the first place.

Secondly, from testing an Nvidia card, it's very, very economical with the truth to SAY that ATi have the same problem. You can suppose, you can suspect, but you can't clinically say they do. In fact, if you did get a 3870X2, you'd have to make sure you fixed the 3D power scaling so it didn't drop to 2D-mode clocks at low-stress points, as that does introduce a stutter of its own; my card before and after a BIOS flash to make all clocks the same was like night and day, a completely different card. Are you sure it's not a driver problem (it is Nvidia, after all) causing that?

In all likelihood there is an issue and you have brought up some good points, but again, very few people didn't know there are issues with SLI/Crossfire; it's new tech and incredibly hard to implement. But you also claimed, with zero proof, that people shouldn't get a 4870X2, even though it (AFAIK) doesn't use Crossfire at all but a brand new GPU-to-GPU interconnect (hopefully running at much lower latency/higher speed; all indications are that it is), which should, in theory, massively reduce the latency issues.

The likely cause of the big gaps between new frames is one core not having enough power to do its share of the work and needing to talk to the other core to change the plan, across a fairly crude interface. With that largely eliminated, it could in fact massively affect those large "lag/microstutter" issues... if, that is, you could prove they exist, and on ATi hardware too ;)


People would take you more seriously with better-qualified information. I get your point, and there's probably something to it, but you need more than a fairly arbitrary single test to draw such sweeping conclusions as you have and claim them as fact.
 
Associate
Joined
17 Jun 2008
Posts
325
CF/SLI does give you a really visible increase in performance (in a lot of games anyway); the game feels faster and can be played at higher settings. I have played games on single cards and in CF/SLI with the same cards, and I could really notice the difference in FPS and in how fluidly the game played.

I do get what you are saying and I don't disagree with you, but does the human eye really pick it up, or would it only start to be noticeable at higher res?
 
Associate
Joined
3 Jul 2008
Posts
378
I have been saying this for the past 2 years.

Performance increases with multi-GPU solutions, currently, are something of an illusion. Yes there *may* be some overall improvement from adding the second card, but nowhere near what average-FPS benchmarks imply.

The R700 4870X2 *could* change this, with its high-bandwidth CrossFire sideport, but I doubt it. In order to eliminate 'microstuttering' (I hate that phrase, as it's so undescriptive), two things must happen:

1 - Both GPUs must work on the *same* frame, rather than working on alternate frames [essential]
2 - Both GPUs must access a common data store [required for efficient inter-communication and scalability]

Unfortunately, it does not look like the 4870X2 will do either of these, although the faster inter-GPU connections should allow for better performance scaling, which they have already implied is the case.

This is the reason I went for a GTX280 instead of xfire 4870s.
 
Associate
Joined
3 Jul 2008
Posts
378
What frame numbers did you come up with from a single card, then? Did it ever occur to you that, because you chose three of the toughest, most demanding games, they can't always keep up a maximum framerate, and that if you compared the same numbers from a single card you might see the same situation, just with the minimum and maximum numbers dramatically lower?

I think you're missing the point. These aren't FPS values varying over the course of a benchmark with highly variable scenes; they are taken from consecutive frames, over the course of less than half a second. These dramatic variations are taking place at a frame-by-frame level: they are "noisy data", to use a scientific term. With a single GPU the results vary smoothly, with a much smaller variance in the local-level fluctuations (usually less than 10%).
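
If you want to put a number on "noisy", here's a quick sketch (Python, using the Call of Juarez frame times quoted earlier; the sub-10% single-GPU figure is the rule of thumb mentioned above, not something measured here):

Code:
import statistics

# Frame times (ms) from the Call of Juarez numbers quoted earlier (AFR multi-GPU).
afr_frame_times = [10, 16, 15, 6, 24, 19, 6, 21, 18, 4, 25, 15, 7]

mean_ft = statistics.mean(afr_frame_times)
stdev_ft = statistics.pstdev(afr_frame_times)
cv = stdev_ft / mean_ft * 100   # coefficient of variation, in percent

print(f"Mean frame time: {mean_ft:.1f} ms (~{1000 / mean_ft:.0f} FPS average)")
print(f"Std deviation:   {stdev_ft:.1f} ms -> {cv:.0f}% frame-to-frame variation")
# A single GPU in the same scene would typically sit well under 10% by this measure.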


The likely cause of the big gaps between new frames is one core not having enough power to do its share of the work and needing to talk to the other core to change the plan, across a fairly crude interface. With that largely eliminated, it could in fact massively affect those large "lag/microstutter" issues... if, that is, you could prove they exist, and on ATi hardware too ;)


No - the problem is more fundamental.

* You have two GPUs each working on a separate frame.
* The frame is output as soon as it is completed.
* 'Smooth' performance requires an equal space of time between the output of these frames, which requires each GPU to pause for an appropriate length of time before beginning its render.
* Not only would this reduce average FPS (and so the appearance of performance in average-FPS benchmarks), but it is *impossible* to accurately predict this delay time. The delay is related to the frame-render time, and the only way to know that is to render the frame!

In short, the only way to avoid these issues is to have both GPUs work on the same frame. This does not give the same [apparent] performance increases as alternate frame rendering, so is not commonly done.
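
To make that concrete, here's a toy simulation (Python, with invented render times purely for illustration) of why two GPUs alternating frames, each outputting as soon as it finishes, deliver frames at uneven intervals even when each GPU is individually perfectly steady:

Code:
# Toy AFR model: each GPU takes 20 ms per frame, but the CPU hands GPU 1 its
# frame only ~4 ms after GPU 0 starts, so completed frames come out in pairs.
RENDER_MS = 20.0   # time either GPU needs to render one frame (invented figure)
OFFSET_MS = 4.0    # how soon after GPU 0 the next frame reaches GPU 1 (invented figure)

output_times = []
for frame in range(10):
    gpu = frame % 2                                      # even frames -> GPU 0, odd -> GPU 1
    start = (frame // 2) * RENDER_MS + gpu * OFFSET_MS   # each GPU starts a new frame every 20 ms
    output_times.append(start + RENDER_MS)

gaps = [b - a for a, b in zip(output_times, output_times[1:])]
print("Frame delivery gaps (ms):", gaps)  # alternates ~4, ~16, ~4, ~16: micro-stutter
print("Average FPS over the run:", round(1000 * len(gaps) / (output_times[-1] - output_times[0])))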
 
Associate
OP
Joined
16 Nov 2006
Posts
753
What frame numbers did you come up with from a single card, then? Did it ever occur to you that, because you chose three of the toughest, most demanding games, they can't always keep up a maximum framerate, and that if you compared the same numbers from a single card you might see the same situation, just with the minimum and maximum numbers dramatically lower? What exactly makes you instantly think this same issue doesn't occur with single cards? In fact, it's fairly silly to claim that a single card gives a completely stable framerate all the time; it too will have at least some degree of ups and downs averaged into the framerate you see at the end.

If you want to do a mini review, fine, but you have no comparative evidence here; you have one side of the story and not the other.

Now frankly, there's much more to this. Some games are badly coded, some aren't; so what, that's life. Some will work better with SLI/Crossfire, some won't, and anyone who didn't know that before buying a multi-GPU setup should never have bought one in the first place.

Secondly, from testing an Nvidia card, it's very, very economical with the truth to SAY that ATi have the same problem. You can suppose, you can suspect, but you can't clinically say they do. In fact, if you did get a 3870X2, you'd have to make sure you fixed the 3D power scaling so it didn't drop to 2D-mode clocks at low-stress points, as that does introduce a stutter of its own; my card before and after a BIOS flash to make all clocks the same was like night and day, a completely different card. Are you sure it's not a driver problem (it is Nvidia, after all) causing that?

In all likelihood there is an issue and you have brought up some good points, but again, very few people didn't know there are issues with SLI/Crossfire; it's new tech and incredibly hard to implement. But you also claimed, with zero proof, that people shouldn't get a 4870X2, even though it (AFAIK) doesn't use Crossfire at all but a brand new GPU-to-GPU interconnect (hopefully running at much lower latency/higher speed; all indications are that it is), which should, in theory, massively reduce the latency issues.

The likely cause of the big gaps between new frames is one core not having enough power to do its share of the work and needing to talk to the other core to change the plan, across a fairly crude interface. With that largely eliminated, it could in fact massively affect those large "lag/microstutter" issues... if, that is, you could prove they exist, and on ATi hardware too ;)


People would take you more seriously with better-qualified information. I get your point, and there's probably something to it, but you need more than a fairly arbitrary single test to draw such sweeping conclusions as you have and claim them as fact.

If you had the patience to read the first line of my post, you would know that I did not perform any of the tests. I just cut and pasted a post from a guy over at the XS forums.

Nvidia and ATi both use AFR to achieve the higher framerates, but at the cost of smoothness, which the user has shown. Of the sample of games he chose, three tough games are affected and two aren't.

Why is it silly to say a single card is smoother than an SLI/CF setup? If a single card can output 60 fps (e.g. GTX280) and a multi-GPU setup produces the same 60 fps (e.g. 4850 CF), which do you think would be smoother?

Personally I think it's a driver problem, and the issue I have is that all the review sites fail to mention this, and the two powerhouses aren't doing enough to resolve it for all the people who laid down the cash to buy their products. They're just happy they made the sale, never mind what the actual end-user experience is.
 
Caporegime
Joined
20 Jan 2005
Posts
45,761
Location
Co Durham
Why is it silly to say a single card is smoother than an SLI/CF setup? If a single card can output 60 fps (e.g. GTX280) and a multi-GPU setup produces the same 60 fps (e.g. 4850 CF), which do you think would be smoother?

Although I agree with your statement, most educated people look at both average and minimum framerates before considering their purchase.

Minimum framerates will take account of your results.

And your analogy breaks down on cost, though.

Here's another one for you.

If a single card can output 60 fps with a minimum of 30 fps, and a multi-GPU setup produces 90 fps with a minimum of 50 fps, which would be smoother? Would you notice the difference? I would say most people would be happier with the multi-GPU setup.
 
Soldato
Joined
22 Nov 2003
Posts
2,950
Location
Cardiff
Talk about over-analysis - who cares..

Here's the bottom line: I get a much smoother gameplay experience when I enable SLI or CF on either of my PCs than I do using a single card.

We can debate the value of the cost (x2 cost vs < x2 performance) but to call them worthless is just crazy.
 
Soldato
Joined
7 May 2006
Posts
12,183
Location
London, Ealing
Worthless? lol
Come play on my 2560x1600 at a silky smooth 60 FPS with Vsync and tell me that the first thing that hit you was microstutter (nowhere to be seen) and not the sheer beauty of such detail, size and girth.
 
Associate
Joined
3 Jul 2008
Posts
378
Personally I think it's a driver problem..

Unfortunately not. It's a lower-level issue, based on the hardware-level implementation.

I agree it is annoying that review sites rarely, if ever, mention it, but doing so would force them to acknowledge that their average-FPS benchmarks are not always an appropriate measure of performance. Only HardOCP have made the 'bold' leap towards dropping raw average-FPS numbers.
 
Associate
OP
Joined
16 Nov 2006
Posts
753
Although I agree with your statement, most educated people look at both average and minimum framerates before considering their purchase.

Minimum framerates will take account of your results.

And your analogy breaks down on cost, though.

Here's another one for you.

If a single card can output 60 fps with a minimum of 30 fps, and a multi-GPU setup produces 90 fps with a minimum of 50 fps, which would be smoother? Would you notice the difference? I would say most people would be happier with the multi-GPU setup.
Thing is, most multi-GPU solutions have lower minimum framerates even when the average FPS is the same or higher, and it also depends on the game.

The cost for multi-GPU has changed, inasmuch as people are getting better value now than, say, buying two 8800 Ultras and running them in SLI.

Talk about over-analysis - who cares..

Here's the bottom line: I get a much smoother gameplay experience when I enable SLI or CF on either of my PCs than I do using a single card.

We can debate the value of the cost (x2 cost vs < x2 performance) but to call them worthless is just crazy.
Good for you...

I care, people who are thinking about purchasing care, and people with 20/20 vision care.
 
Associate
OP
Joined
16 Nov 2006
Posts
753
Worthless? lol
Come play on my 2560x1600 at a silky smooth 60 FPS with Vsync and tell me that the first thing that hit you was microstutter (nowhere to be seen) and not the sheer beauty of such detail, size and girth.

What game is that?

Vsync would default to 30 fps if the output is less than 60, and down to 20 or 15 fps if less than 30. So it would be no good for the most demanding game engines, which, let's face it, is what we bought the damn things for! :D
 
Associate
OP
Joined
16 Nov 2006
Posts
753
Two threads that go into further detail.

http://www.hardforum.com/showthread.php?t=1317582
http://forums.anandtech.com/messageview.aspx?catid=31&threadid=2178762

Here's a graph to show the timing of each frame, which leads to microstutter.

[Image: microstutter2ls7.png (frame timing graph)]
 
Associate
Joined
3 Jul 2008
Posts
378
Worthless? lol
Come play on my 2560x1600 at a silky smooth 60 FPS with Vsync and tell me that the first thing that hit you was microstutter (nowhere to be seen) and not the sheer beauty of such detail, size and girth.

With vsync enabled and the framerate over the maximum refresh rate of the monitor, the entire issue of 'microstutter' disappears. It is only an issue for framerates *below* the monitor refresh rate, and in that case vsync can actually exaggerate the effect, as the differing frame-render times can cause rapid switching between the different regimes (30 fps / 60 fps etc.).

...thinking about it though, using triple-buffered vsync (as I know you do) should provide a 'damping' effect on the microstutter. There will be more occasions where the output of a frame is skipped, but the inherent microstutter should be reduced somewhat.
 
Associate
Joined
3 Jul 2008
Posts
378
What game is that?

Vsync would default to 30 fps if the output is less than 60, and down to 20 or 15 fps if less than 30. So it would be no good for the most demanding game engines, which, let's face it, is what we bought the damn things for! :D

Only if you use the default vsync setting.

You should use triple buffering, to avoid any framerate drops :) click me
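
For anyone who wants the arithmetic, here's a rough sketch of the difference (assuming a 60 Hz refresh, with triple buffering modelled crudely as "keep whatever rate the card manages, capped at the refresh rate"):

Code:
import math

REFRESH = 60.0  # assumed monitor refresh rate (Hz)

def double_buffered_vsync(render_fps):
    """Plain (double-buffered) vsync: each frame waits for a whole number of
    refresh intervals, so output snaps down to refresh / n: 60, 30, 20, 15..."""
    if render_fps >= REFRESH:
        return REFRESH
    return REFRESH / math.ceil(REFRESH / render_fps)

def triple_buffered_vsync(render_fps):
    """Crude model of triple buffering: the GPU keeps rendering into a spare
    buffer, so you keep roughly your real framerate, capped at the refresh rate."""
    return min(render_fps, REFRESH)

for fps in (75, 59, 45, 25, 16):
    print(f"card renders {fps:2d} fps -> "
          f"double-buffered vsync: {double_buffered_vsync(fps):.0f} fps, "
          f"triple buffered: {triple_buffered_vsync(fps):.0f} fps")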


...Okay, no more pimping of my thread, but it just seems that many people are unaware that they can kill off tearing without losing their framerate!
 
Soldato
Joined
7 May 2006
Posts
12,183
Location
London, Ealing
What game is that?

Vsync would default to 30 fps if the output is less than 60, and down to 20 or 15 fps if less than 30. So it would be no good for the most demanding game engines, which, let's face it, is what we bought the damn things for! :D

No.
60/30/20/15... etc., but with triple buffering on it can also do 45 and so on if I don't get 60.
But the thing is it really does do 60 FPS, and because I hate tearing I don't have to worry about the lower figures; with a single card at my res I'm pretty much stuck at 30 FPS.
 