
Multi-GPU solutions worthless?

I have had microstutter and, get this, it was a terrible problem, and the best part is that the card was a single HD2900XT.
Now I'm on tri-crossfire with a 3850 and a 3870X2 at equal clocks, and I've never had a problem.
Can't wait for the 4870X2 though, as I will be adding a 4870 to it, since tri-crossfire seems to work great for me.

I was going to add a 30" and single cards don't add up to me. You must play some rubbish games, certainly not the most stressful ones of that period. I wonder if Far Cry was out around that time; if you were playing at that res with that card smoothly, you're having a laugh.
 
I have had microstutter and, get this, it was a terrible problem, and the best part is that the card was a single HD2900XT.

This is why I hate the term "microstutter" - it's so misleading.

"Microstutter" is caused by uneven frame output, which is a uniquely multi-GPU phenomenon [see post#15 by Tdream]

Now, when most people hear the word "microstutter" they think of very small pauses in gameplay. It is, after all, what the name suggests... However this is not "microstutter", it is in fact GPU-RAM to system-RAM paging.
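To illustrate what I mean by uneven frame output, here's a rough Python sketch with made-up cumulative frame timestamps (not measurements from any real card). Both traces finish the same number of frames in the same time, so an FPS counter reports the same average, but the AFR-style trace delivers frames in short/long pairs:

Code:
# Made-up cumulative frame timestamps in ms - purely illustrative.
single_gpu = [0, 33, 66, 99, 132, 165, 198]
afr_pair   = [0, 10, 66, 76, 132, 142, 198]

def gaps(timestamps):
    """Frame-to-frame intervals in ms."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

for name, ts in [("single GPU", single_gpu), ("AFR multi-GPU", afr_pair)]:
    avg_fps = 1000 * (len(ts) - 1) / (ts[-1] - ts[0])
    print(f"{name}: {avg_fps:.0f} FPS average, gaps = {gaps(ts)}")

# single GPU: 30 FPS average, gaps = [33, 33, 33, 33, 33, 33]
# AFR multi-GPU: 30 FPS average, gaps = [10, 56, 10, 56, 10, 56]

An average FPS counter only sees the total time, which is why this doesn't show up in ordinary FPS graphs; GPU-RAM to system-RAM paging, by contrast, shows up as occasional long pauses you can actually feel.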
 
This is why I hate the term 'microstutter' - it's so misleading.

"Microstutter" is caused by uneven frame output, which is a uniquely multi-GPU phenomenon [see post#15 by Tdream]

Now, when most people hear the word 'microstutter' they think of very small pauses in gameplay. It is, after all, what the name suggests... However this is not "microstutter", it is in fact GPU-RAM to system-RAM paging.

I have been saying for a long time that people get confused by the two, and that GPU-RAM to system-RAM paging will of course still happen on multi-GPU setups as well.
 
So, waiting for this 4870X2 is a waste of time?

I wouldn't say that... It's likely to be a beast of a card, and should offer better overall performance than the GTX280 in my opinion, even taking into account the 'microstutter' issue. If nothing else, it should be faster than a pair of 1GB 4870s in crossfire.

I was hopeful that this particular implementation (using the crossfire sideport) could eliminate the problem entirely, but it looks like they are still going for a parallel memory method and alternate frame rendering, which will still lead to microstutter. It's possible that the effect could be reduced somewhat through some clever inter-communication between the GPUs, but any way you look at it this will reduce raw FPS and so hurt the appearance of superiority over a crossfire setup.

But for me, learning that the 4870x2 would still in effect behave as a traditional dual-GPU card was enough to push me to a GTX280.
 
I would have thought that in today's systems multi-core CPUs are also exaggerating the issue, as not only could the multiple GPUs be rendering frames unevenly, but the cores feeding them the data could be out of balance too.

Looks like it's going to be an inherent issue on PCs, and probably only fully curable on a fully integrated system with a shared memory/resource pool - more like the consoles. Well, perhaps not the Xbox/360, but at least the older proprietary designs. PCs seem to try and get over most things with pure grunt.

If that makes any sense? :D
 
I think I'd go by the testimonies of those who actually have multi-GPU setups rather than some graphs which show why they theoretically could be worthless. IMO the only thing worthless is the theory if most people seem to be happy with the performance boost of two cards.
 
Just installed crossfire 4850s, my first multi-GPU rig, and I can't say I've noticed what I would call microstuttering. I heard it was most noticeable on racing games, but GRID is awesome and silky smooth.

The only thing I've noticed seems to be lower minimum FPS, probably the driver overhead showing.
 
I wouldn't say that... It's likely to be a beast of a card, and should offer better overall performance than the GTX280 in my opinion, even taking into account the 'microstutter' issue. If nothing else, it should be faster than a pair of 1GB 4870s in crossfire.

I was hopeful that this particular implementation (using the crossfire sideport) could eliminate the problem entirely, but it looks like they are still going for a parallel memory method and alternate frame rendering, which will still lead to microstutter. It's possible that the effect could be reduced somewhat through some clever inter-communication between the GPUs, but any way you look at it this will reduce raw FPS and so hurt the appearance of superiority over a crossfire setup.

But for me, learning that the 4870x2 would still in effect behave as a traditional dual-GPU card was enough to push me to a GTX280.

You make it sound like all multi-GPU platforms suffer from this, which they plainly do not.
 
That was the point of my post: this does *not* affect all multi-GPU platforms, though it does have the potential to. I've been fine in 10+ SLI/crossfire setups, barring one issue with the 9800GX2s that rendered Quad SLI useless for me. Other than that, I've always had great success with multi-GPU setups and would recommend them to anyone who wants more performance and has the money to justify it.
 
And I guess you sit on your old Pentium 4 overclocked to 6GHz because it's better than your 3GHz dual core.

Why do you think they make multi-core CPUs, multi-channel memory, multi-GPU setups and multi-monitor setups?

Single GPUs just don't cut it anymore at some point. And they can't just make a card twice as fast as, let's say, a 4870 straight away.
 
I've been looking into 'microstuttering' for a while now, since weighing up my last upgrade. My opinion, based on what information is available on the net so far, is that it's FUD. The OP has linked to forums which link to or paste info from the same German article that absolutely every other mention of microstuttering on the net seems to reference: an article on pcgameshardware.de, from February.

I've searched long and hard for some better analysis of this phenomenon, and every mention of it, pretty much all of which are from forum posts, either links back to this same article or links to a forum which in turn links back to the article. There are only two sets of test results that exist backing up this idea. One has already been pasted in this thread: the graph compiled from results on another German hardware site. The other is the very artificial-looking perfectly-straight-line-versus-uniformly-wriggly-line graph in the article on the first German hardware site.

It strikes me as quite odd that, for a problem this simple to understand and apparently very easy to demonstrate with a copy of FRAPS, every forum thread discussing it is just linking the same results over and over. I've looked through the threads, and the only pasted results from a user in a thread did not back up the original microstuttering claims at all. The hardforum.com post links a graph that a user 'found' - found on the previously mentioned German article. Where's the Anandtech or similar respected hardware site investigation into this, if this is a genuine issue on multi-GPU systems?

Have another look at those Call of Juarez figures on the previous page. The numbers show at what time, in ms, the GPU(s) rendered each particular frame. Note that the multi-GPU configuration in this graph is actually putting out a lower framerate than the single-GPU setup: it takes 1.2 seconds to reach 30 frames, versus 1 second for the single GPU. I can tell you from my best attempt at German that these figures are allegedly one 8800 Ultra versus three 8800 Ultras, the fastest cards available at the time of the article. If that doesn't start ringing alarm bells as to the validity of the benchmark, take a look at the variance of the single-GPU results. It's too perfect, it looks nothing like the results pasted by users on the other forums, and it's putting out almost bang on 30 fps. I'd take a wild stab in the dark and say it's running at a framerate cap.
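If anyone wants to sanity-check that kind of graph themselves, here's a quick Python sketch (illustrative numbers only, not lifted from the article): work out the frame gaps from the cumulative timestamps and look at how much they vary. A trace that sits at an almost perfectly constant 33.3 ms per frame looks a lot more like a framerate cap than a real in-game measurement.

Code:
from statistics import mean, pstdev

def summarise(timestamps_ms):
    """Return FPS, mean gap and gap standard deviation from cumulative frame times."""
    gaps = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    seconds = (timestamps_ms[-1] - timestamps_ms[0]) / 1000
    return len(gaps) / seconds, mean(gaps), pstdev(gaps)

# Illustrative 'too perfect' trace: 30 frames, each exactly 33.3 ms apart.
flat_trace = [i * 33.3 for i in range(31)]
fps, avg_gap, sd_gap = summarise(flat_trace)
print(f"{fps:.1f} FPS, {avg_gap:.1f} ms average gap, {sd_gap:.3f} ms deviation")
# -> 30.0 FPS, 33.3 ms average gap, ~0 ms deviation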

The original article describes microstuttering as an inherent problem with all AFR multi-GPU solutions, and though they only have one questionable graph of one card in one game, they are happy to say that this is a problem with all hardware in all games and has no solution. It's safe to say, just from the responses in this thread, that this is a load of FUD.

As far as I can tell, the only reason this article is still being propagated is because it's far too easy to associate the word 'stuttering' with absolutely any type of game-related performance problem you may have. So anyone having any type of performance problem on multi-GPU who reads the article suddenly has their answer. Low on video memory and getting little pauses as textures are loaded from RAM? Microstuttering! Low on RAM and paging from disk? Microstuttering! Framerate under 60 in Crysis? Whoa, hang on, you're using SLI, that must be microstuttering. This is why you get things like this: http://www.youtube.com/watch?v=-5SXeR0torc pasted as an example of the phenomenon, when the slightest bit of thought would tell you that short noticeable pauses every couple of seconds, visible in a YouTube video, cannot possibly be the microstutter issue the German articles are talking about.

That said, even after seeing it in action running very nicely, I'm not going for a crossfire setup. Nothing to do with 'microstuttering', I just like my double monitors. Multiple GPUs have many issues, but I'm not at all convinced that this is one of them.
 
TeoH
To add to that: the new PowerPlay tech can at times have issues with the GPU clocks dropping and rising rapidly in some games, which also gets confused with microstuttering but gets fixed with driver updates.
 
You make it sound like all multi-GPU platforms suffer from this, which they plainly do not.

When running in alternate frame rendering mode, they do - to a greater or lesser extent. It's a hardware implementation based phenomenon. Very low-level.

Like I said earlier in the thread though, enabling vsync or a moderate framerate cap will circumvent the problem somewhat.
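As a toy illustration of why a cap helps (made-up numbers again, and obviously a real limiter lives in the driver or the game, not in a script like this): if each frame is held back until a minimum interval has passed since the previous one, the short/long AFR pattern flattens out.

Code:
def apply_cap(timestamps_ms, min_gap_ms):
    """Delay frames so that consecutive frames are at least min_gap_ms apart."""
    capped = [timestamps_ms[0]]
    for t in timestamps_ms[1:]:
        capped.append(max(t, capped[-1] + min_gap_ms))
    return capped

afr_pair = [0, 10, 66, 76, 132, 142, 198]    # the same sort of 10/56 ms pairing as earlier in the thread
capped = apply_cap(afr_pair, 33)             # roughly a 30 FPS cap
print([b - a for a, b in zip(capped, capped[1:])])
# -> [33, 33, 33, 33, 33, 33]: even pacing, at the cost of a few ms of added lag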



edit - I agree that the available data is lacking. I would gladly produce some myself had I access to an SLI or x-fire setup. I wish more reputable review sites would investigate the phenomenon. Perhaps we could petition anandtech or [H] to write an article?
 
When running in alternate frame rendering mode, they do - to a greater or lesser extent. It's a hardware implementation based phenomenon. Very low-level.

Like I said earlier in the thread though, enabling vsync or a moderate framerate cap will circumvent the problem somewhat.



edit - I agree that the available data is lacking. I would gladly produce some myself had I access to an SLI or x-fire setup. I wish more reputable review sites would investigate the phenomenon. Perhaps we could petition anandtech or [H] to write an article?

Sorry, but neither of my rigs suffers from it (nor any I have built in the past - at least four PCs I can think of), with or without vsync. Are you saying it is there and I just can't see it?

Are you basing your arguments on the data posted in this thread or from elsewhere, because as TeoH has pointed out, the sources are dubious at best.
 
I've been banging on about this problem for a long time. Aside from running Voodoo 2s, I first tried SLI properly with 2x 7800GTXs a few years back. Apart from the extra power, heat and noise issues you get running two cards, I got microstuttering. At the time I tried to fix it, waited for drivers, etc. I tried SFR, but SFR seems to be slower and you get tearing. In the end I gave up trying and dropped to a single card.

Over the years I've seen (but not owned) a number of crossfire and SLI setups and haven't seen a single one which doesn't exhibit microstuttering when stressed.

The best way I can describe the difference between microstuttering and a normal drop in frame rate is that microstuttering feels like your computer has suddenly and briefly lost contact with your game controller or mouse. For me, at least, this is really annoying.

Regardless of microstuttering, the other issue with SLI (as pointed out by others) is minimum FPS. I don't care about average framerate or maximum framerate; all I care about is the minimum. It is common to see SLI and crossfire systems dropping to single-card (or even lower) speeds at certain points in many games... so why on earth would I want more than a single card?

As another poster said, the 4870X2 sounds promising, but given its architecture and the nature of AFR I don't see how it will remove microstuttering; there is still no way to know how long the next frame will take to render without rendering it. However, I guess it's possible it might improve the issue with minimum FPS.
 
Sorry, but neither of my rigs suffers from it (nor any I have built in the past - at least four PCs I can think of), with or without vsync. Are you saying it is there and I just can't see it?

Are you basing your arguments on the data posted in this thread or from elsewhere, because as TeoH has pointed out, the sources are dubious at best.

I'm not basing my arguments on any data. I'm not convinced much of what has been posted is reliable either.

I'm basing my arguments on the mode of operation of the hardware itself. I like to operate from a low-level understanding of hardware implementation methods (I'm just a geek that way...). The 'what' follows directly from the 'how' and the 'why'. That's not to say I wouldn't like more data from reliable sources to quantify the effect, though.

I've used two SLI and one x-fire rig in the past, and noticed the effect on all three - the first SLI rig was before I investigated multi-GPU implementation methods in depth, so the reason for it was a mystery at the time.

As for your rigs - I would wager that it's there, but that you're not noticing it. It's a very subtle effect, as it happens on timescales too small for us to see directly. We only notice it indirectly, in that you need more raw frames per second to fool the eye (and brain) into thinking it is witnessing a moving image rather than a series of static images. If you have played on a multi-GPU rig for a while and switch to a single-GPU rig which achieves roughly the same framerate, you will be surprised at how much smoother it 'feels'.


How about this - tonight I will look into the best and most reliable way of recording frame output times. I'll write up a mini-guide on how to do it, and make a thread. Users on this forum with both single- and multi-GPU setups can then report back their results in a variety of games, and we will be able to quantify the magnitude of this effect. Sound good?
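For the curious, the analysis half of that mini-guide could be as simple as the Python sketch below. It assumes a FRAPS-style frametimes CSV with a header row and two columns (frame number, cumulative time in ms); the file names are just placeholders.

Code:
import csv

def load_times_ms(path):
    """Cumulative frame timestamps from a FRAPS-style frametimes CSV."""
    with open(path, newline="") as f:
        rows = csv.reader(f)
        next(rows)                                   # skip the header row
        return [float(row[1]) for row in rows]

def report(path):
    times = load_times_ms(path)
    gaps = sorted(b - a for a, b in zip(times, times[1:]))
    avg = sum(gaps) / len(gaps)
    p99 = gaps[int(0.99 * (len(gaps) - 1))]          # worst 1% of frame gaps
    print(f"{path}: average {avg:.1f} ms/frame ({1000 / avg:.0f} FPS), "
          f"1% of frames slower than {p99:.1f} ms")

report("frametimes_single_gpu.csv")                  # placeholder file names
report("frametimes_crossfire.csv")

Similar averages but a much bigger worst-1% figure on the multi-GPU log would be the signature of the effect we're arguing about.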
 