
Multi GPU solutions Worthless?

Read this over at XS and it's empirical evidence that SLI/CS/X2 cards just don't work properly.

-----------------------

Microstuttering makes all multi GPU solutions to date, including 3870x2 and 9800GX2, WORTHLESS.

I did quite a bit of research about this, with both my (G92 SLI) and a friend's (9800 GX2) scores, five games benched.

Assassin's Creed and World in Conflict aren't affected by asynchronous frame rendering (microstutter)
Crysis, Lost Planet and Call of Juarez are ABSOLUTELY affected by asynchronous frame rendering.

I have written a few articles about this which caused some massive uproar and got me banned from several forums. Here are some frame-time benchmark results from Call of Juarez.

Frame  Rendered at (ms)  Frame time (ms)  Momentary FPS
48     753               10               103
49     769               16               61
50     784               15               67
51     790               6                174
52     814               24               41
53     832               19               54
54     838               6                178
55     859               21               47
56     877               18               57
57     881               4                235
58     906               25               40
59     921               15               65
60     928               7                142

What FRAPS timedemo would show you: 13 frames rendered in 175ms = 75 FPS

However the real fluidity is quite different. You are basically playing at 50-something FPS, with frames "rendered at 150+ FPS" inserted every third frame. Those third frames with super high momentary FPS mean NOTHING for your gaming experience. They do not make your game more fluid. They make micro-stutter. In that scene you are playing at 55 FPS at most, while the benchies show you playing at 75 FPS.
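The arithmetic in the quote can be reproduced with a short script (a sketch using the timestamps from the log above; the "worst frame" figure is my own illustration of why the average hides the stutter, not something FRAPS reports):

```python
# Frame render timestamps in ms, taken from the Call of Juarez log above
# (frames 48-60, second column).
timestamps = [753, 769, 784, 790, 814, 832, 838, 859, 877, 881, 906, 921, 928]

# What a FRAPS-style timedemo reports: total frames over elapsed time.
elapsed_s = (timestamps[-1] - timestamps[0]) / 1000          # 0.175 s
avg_fps = len(timestamps) / elapsed_s                        # ~74 FPS ("75" in the post)

# Per-frame intervals; the long ones are what the player actually feels.
deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
worst_fps = 1000 / max(deltas)                               # 25 ms gap -> 40 FPS

print(f"average: {avg_fps:.1f} FPS, worst frame: {worst_fps:.0f} FPS")
# -> average: 74.3 FPS, worst frame: 40 FPS
```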

This is why you should NEVER, EVER compare single-GPU scores with AFR'ed (SLI/CF) scores. NEVER. And this is also why ALL of the review sites are massively misleading. "A second 8800GT bumped our score from 40FPS to 65FPS!!" REALLY? But you forgot to mention that the 65 FPS is not comparable with the 40 FPS, didn't you? The numbers would be misleading even if the scenario were like this:

Frame 1 rendered at T = 0ms
Frame 2 rendered at T = 1ms
Frame 3 rendered at T = 40ms
Frame 4 rendered at T = 41ms
Frame 5 rendered at T = 60ms
Frame 6 rendered at T = 61ms

6 frames rendered in 61ms, so 100 FPS huh? Those ~20ms gaps where you receive no frames from the cards put your real FPS at something like 50 FPS, but they should be ignored because they don't show in the benchmark, and all you care about are the benchmarks, right?
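The toy scenario above can be checked numerically (a sketch; counting each near-simultaneous AFR pair as one "useful" frame is my own way of illustrating the point):

```python
# Render timestamps (ms) from the hypothetical scenario above.
times = [0, 1, 40, 41, 60, 61]

# Headline number a timedemo reports: 6 frames in 61 ms.
avg_fps = len(times) / ((times[-1] - times[0]) / 1000)

# With AFR each frame arrives 1 ms after its twin, so each pair carries
# roughly the visual information of one frame. Count pairs instead:
effective_fps = (len(times) / 2) / ((times[-1] - times[0]) / 1000)

print(f"benchmark: {avg_fps:.0f} FPS, effective: {effective_fps:.0f} FPS")
# -> benchmark: 98 FPS, effective: 49 FPS
```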

I'm amazed that, after this microstuttering situation broke out, all the major review sites are ignoring it and still comparing AFR FPS figures to non-AFR FPS figures.

DO NOT BUY SLI, DO NOT BUY CROSSFIRE, DO NOT BUY 3870X2, DO NOT BUY 9800GX2, (PROBABLY) DO NOT BUY 4870x2. YOU ARE GAINING ABSOLUTELY NOTHING IN AT LEAST HALF OF ALL GAMES. YOU ARE PAYING TWICE FOR A 1.4X INCREASE IN REAL PERFORMANCE IN HALF OF GAMES. THAT DOES NOT MAKE SENSE.

-----------------
Thoughts? :D
 
What frame numbers did you come up with from a single card, then? Did it ever occur to you that, because you chose three of the toughest, most demanding games, they can't always keep up a maximum framerate, and that if you compared the same numbers from a single card you might see the same situation, just with the minimum and maximum numbers dramatically lower? What exactly makes you instantly think this same issue doesn't occur with single cards? In fact, it's fairly silly to claim that a single card gives a completely stable framerate all the time; it too will have at least some degree of ups and downs averaged into the framerate you see at the end.

If you want to do a mini review, fine, but you have no comparative evidence here; you have one side of the story, and not the other.

Now frankly, there's much more to this. Some games are badly coded, some aren't; so what, that's life. Some will work better with SLI/Crossfire, some won't, and anyone who didn't know that before buying a multi GPU setup should never have bought one in the first place.

Secondly, from testing an Nvidia card, it's very economical with the truth to SAY that ATI have the same problem. You can suppose, you can suspect, but you can't clinically say they do. In fact, if you did get a 3870X2, you'd have to make sure you fixed the 3D power scaling so it didn't drop to 2D-mode speeds at low stress points, as that introduces a stutter of its own; my card before and after a BIOS flash to make all clocks the same was like night and day, a completely different card. Are you sure it's not a driver problem (it is Nvidia, after all) causing that?

In all likelihood there is an issue and you have brought up some good points, but again, very few people didn't know there are issues with SLI/Crossfire; it's new tech and incredibly hard to implement. But you also claimed, with zero proof, that we shouldn't get a 4870X2. Consider that it doesn't use Crossfire at all (AFAIK) but a brand new GPU-to-GPU interconnect (hopefully running at much lower latency / higher speed; all indications are it is), which should, in theory, help massively reduce the latency issues.

The likely cause of the big gaps between new frames is one core not having enough power to do its share of the work and needing to talk to the other core to change the plan, across a fairly crude interface. With that largely eliminated, it could in fact massively reduce those "lag/microstutter" issues... if, that is, you could prove they exist, and on ATI hardware too ;)


People would take you more seriously with more qualified information. I get your point, and there's probably something to it, but you need more than a fairly arbitrary singular test to draw such sweeping conclusions as you have and claim them to be fact.

If you had the patience to read the first line of my post, you would know that I did not perform any of the tests. I just cut and pasted a post from a guy over at the XS forums.

Nvidia and ATI both use AFR to achieve the higher framerates, but at the cost of the smoothness the user has shown. Of the sample of games he chose, three tough games are affected but two aren't.

Why is it silly to say a single card is smoother than an SLI/CF setup? If a single card can output 60fps (e.g. GTX280) and a multigpu setup produces the same 60fps (e.g. 4850 CF), which do you think would be smoother?

Personally I think it's a driver problem, and the issue I have is that all the review sites fail to mention this, and the two powerhouses aren't doing enough to resolve it for all the people who laid down the cash to buy their products. They're just happy they made the sale, never mind what the actual end user experience is.
 
Although I agree with your statement, most educated people look at both average and minimum framerates before considering their purchase.

Minimum framerates will take account of your results.

And your analogy breaks down on cost, though.

Here's another one for you.

If a single card can output 60fps with a minimum of 30fps and a multigpu setup produces 90fps with a minimum of 50fps, which would be smoother? Would you notice the difference? I would say most people would be happier with the multi gpu setup.
Thing is, most multigpu solutions have lower minimum framerates even when the average fps is the same or higher; it also depends on the game.

The cost for multigpu has changed, in so much as people are getting better value now than, say, buying two 8800 Ultras and SLIing them.

Talk about over-analysis - who cares..

Here's the bottom line. I get a much smoother game play experience when I enable SLI or CF on either of my PC's than I do using a single card.

We can debate the value of the cost (x2 cost vs < x2 performance) but to call them worthless is just crazy.
Good for you...

I care, people thinking about purchasing care, and people with 20/20 vision care.
 
Worthless? lol
Come play on my 2560x1600 at a silky smooth 60FPS with Vsync & tell me that the first thing that hit you was microstutter (nowhere to be seen) & not the sheer beauty of such detail & size & girth.

What game is that?

Vsync would drop you to 30fps if the output falls below 60, and lower still if it falls below 30. So it would be no good for the most demanding game engines, which, let's face it, is what we bought the damn things for! :D
 
Two threads that go into further detail.

http://www.hardforum.com/showthread.php?t=1317582
http://forums.anandtech.com/messageview.aspx?catid=31&threadid=2178762

Here's a graph to show the timing of each frame, which leads to microstutter.

[Image: microstutter2ls7.png — per-frame timing graph]
 
[ui]ICEMAN;12063025 said:
From all my time using multi GPU setups, even back to my old Voodoo 2 12mb SLI setup I've never encountered micro stutter. I used to work as a hardware reviewer as many of you know and even then, with pre release tech never had an issue.

I've been running a 30" 2560x1600 screen for the last 3/4 years now which pretty much means I have to rely on multi GPU setups to give me the performance I require.

Recently, before my 3 GTX280's, I owned 2 9800GX2's and I had no end of issues with those cards; the micro stutter rendered quad SLI unusable for me on my system, and even with just one card in, I still suffered from it a little. Many others got away just fine with no micro stutter, but it does exist; it's not a myth.

However, out of the 10+ SLI/Xfire setups I own and the hundreds I have tested/reviewed, I've only experienced this problem on one set of cards, it is not as common as some people will say and generally is not a reason for people to be scared of going multi GPU. The chances of experiencing it are greater with the more cards you add but my 3 GTX280's don't suffer from it at all. Be aware that it *can* happen but also don't be put off by such a small number of people that suffer from it on 2 card configurations.
You never encountered microstutter? Then you contradict yourself?!

Microstutter is a problem and it is out there. Personally speaking, SLI/CF is not worthless, but, and it's a big but, the cons outweigh the merits. If it was as good as everyone says it is, I would have already jumped on that boat.

I've had a 30" monitor for over 2 years, so saying you need multigpu solutions is frankly ridiculous. I've used the X1800XT, X1900XT, then 8800GTX, 8800GT, 8800GTS and soon the 4870. While potentially underpowered, these cards ran 90% of the games I played no problem. The only one that's having issues is Crysis, and AoC to a lesser extent (but I uninstalled that so no worries :D)

The people who are proponents of SLI either have it already or have invested too much in it for it not to work out. When I had SLI I wanted it to work properly so badly, but it just didn't deliver, and when it did, it was stuttery.
 
[ui]ICEMAN;12064099 said:
I'm not contradicting myself at all. If you read my post you'll see that I was talking about my other cards *PRIOR* to the 9800GX2's.

I'm sorry but to you an x1800xt might cut it, for me, until this latest generation of GPU's, I have not had acceptable performance at 2560x1600 in many of the latest titles.

I have no need to justify my dual GPU purchases due to money invested, for me it is necessary to enjoy what I do to the fullest. There is no denying SLI/Xfire offers more performance *most* of the time, however it is not cost efficient for most people so perhaps it is you who feels the need to justify why you won't spend that much money on a dual or multi GPU solution.

Also, before you start writing anything, there's nothing wrong with not wanting to spend that much money on cards at all. It's entirely your own choice, and I would agree with you; multi GPU solutions are most definitely not cost effective! However it's also my choice to spend $2100 on graphics cards, not yours, and to me it *is* worth it.
I think if you read it again properly you'll find you do contradict yourself.
"From all my time using multi GPU setups, even back to my old Voodoo 2 12mb SLI setup I've never encountered micro stutter."

I didn't say an x1800xt would cut it for me now, but back then when games weren't so intensive and that was the best on the market it would cut it.

You have no need to justify your GPU purchases, that's fine and dandy. More money than sense, some would say. I do feel the need to justify purchases, as would the majority of people; if there's no discernible benefit then why bother, right? I mean, the fact that Nvidia have been able to drive prices up far too high is partly down to people like you who bought tri-SLI 280s when the prices were sky high. Thank goodness for ATI ;)
 
...
How about this - tonight I will look into the best and most reliable way of recording frame output times. I'll write up a mini-guide on how to do it, and make a thread. Users on this forum with both single- and multi-GPU setups can then report back their results in a variety of games, and we will be able to quantify the magnitude of this effect. Sound good?

That sounds fair, though I'm willing to bet numerous people will be surprised at their findings. At least Fraps is objective; people looking at the screen for something they don't want to see is subjective, and a bit of psychology is also in play. It very much reminds me of the rainbow effect with single-chip DLP projectors: some people see it, some people don't.
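For anyone taking up that offer, here is a rough way to score a Fraps frametimes log (a sketch: the two-column CSV layout of frame number and timestamp, and the "stutter ratio" metric itself, are my assumptions, not an official Fraps format or measure):

```python
import csv

def stutter_ratio(path):
    """Read a frametimes CSV (frame number, timestamp in ms) and return
    the average ratio of the longer to the shorter interval in each
    consecutive pair of frames: ~1.0 = even pacing, much greater than
    1.0 = the alternating short/long cadence of microstutter."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    times = [float(r[1]) for r in rows[1:]]              # skip header row
    deltas = [b - a for a, b in zip(times, times[1:])]   # frame intervals
    pairs = zip(deltas[::2], deltas[1::2])               # consecutive pairs
    ratios = [max(a, b) / min(a, b) for a, b in pairs if min(a, b) > 0]
    return sum(ratios) / len(ratios)
```

A single card with even pacing should score near 1.0; an AFR setup showing the 6ms/27ms kind of alternation in the logs above would score around 4.5.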
 
Well, the great Finnish overclocker Sampsa has his 4870X2; here is his post (copied from XS):

-----------------

Good news! I can confirm that based on my own tests microstuttering is gone on R700!

I've tested with R700 (ATI Radeon HD 4870 X2) and R680 (ATI Radeon HD 3870 X2) in Crysis (1600x1200 and High settings). I used Fraps and enabled frametimes logging. I recorded 2 seconds from exactly the same point in the game (loaded from a save game). Based on my recorded data, with the ATI Radeon HD 3870 X2 frames are rendered after ~21.5 ms and every other frame after ~49.5 ms. With the ATI Radeon HD 4870 X2 all frames are rendered after ~21.9 ms.

-------------------

He notices the discrepancy too, but fortunately it seems to be gone from the 4870X2 :)
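Sampsa's figures can be turned into the same kind of comparison (a sketch; the synthetic interval streams below are built from the ~21.5/49.5 ms and ~21.9 ms numbers he quotes, not from his raw logs):

```python
# Synthetic frame-interval streams (ms) built from the figures quoted above.
r680 = [21.5, 49.5] * 10      # HD 3870 X2: alternating short/long intervals
r700 = [21.9] * 20            # HD 4870 X2: evenly paced

def avg_fps(deltas):
    """Average framerate implied by a stream of frame intervals."""
    return 1000 / (sum(deltas) / len(deltas))

def pacing_spread(deltas):
    """Longest minus shortest interval: 0 means perfectly even pacing."""
    return max(deltas) - min(deltas)

for name, d in [("3870 X2", r680), ("4870 X2", r700)]:
    print(f"{name}: {avg_fps(d):.0f} FPS average, spread {pacing_spread(d):.1f} ms")
# -> 3870 X2: 28 FPS average, spread 28.0 ms
# -> 4870 X2: 46 FPS average, spread 0.0 ms
```

The spread, not the average, is what shows up as microstutter: the R680 stream has a 28 ms gap between its best- and worst-paced frames, while the R700 stream is flat.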
 