
Micro-Stuttering And GPU Scaling In CrossFire And SLI - Tom's Hardware

Ha!

Finally a review site takes the time to look at this.

...Perhaps one day we may even see driver-level fixes from nvidia and AMD.
 
I'd say statistically there is a correlation between micro-stuttering and VRAM shortage as well; though what I probably mean there is "stuttering" rather than "micro-stuttering".
 
Also nice that they back up claims I was making a while ago - and took a lot of flak for - based on similar data:

http://www.tomshardware.co.uk/radeon-geforce-stutter-crossfire,review-32256-5.html

Some good conclusions too. Also interesting that 3-way with higher mid-range cards can actually work out well, but I suspect that reduction in micro-stutter comes at an increased cost in input lag.
 
Hopefully if more people make a noise about this, it'll persuade BOTH companies to focus more on the situation.

I mean, admittedly both companies have made big improvements over the last 4/5 generations of cards in terms of how obvious it is, but it's still not a 'fixed' issue, if it ever CAN be fixed.
 
TBH I rarely if ever notice it on my GTX470 SLI setup aside from a small number of titles and/or the odd situation.

Though I do tend to use the SLI setup to try and get as close to 120fps or higher for my 120Hz panel, so it's mitigated quite a bit.
 
Hmm, tri cards are better than dual for stuttering. Wonder why.

Could be due to a lot of things, one potentially being that it's CPU limited with that much rendering power, which always reduces micro-stutter. Also, due to the way tri and quad rendering works, you can be trading micro-stutter for increased input latency, so it's not as black and white as it seems and may not actually be a good thing.

The graphs would tend to indicate CPU limit being the factor in the tests they ran as the tri-card results are pretty much pegged at the peak dual card performance.
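The CPU-limit point can be shown with a toy model. This is purely illustrative (a naive queue, not how the real drivers schedule AFR; `present_times` and `frame_gaps` are made-up names): when the CPU submits work much faster than the GPUs finish it, frames come out in uneven pairs, but once the CPU itself is the bottleneck, presentation settles to its regular cadence.

```python
# Toy AFR model (illustrative only, not real driver scheduling):
# two GPUs render alternate frames; a frame is presented when its GPU
# finishes, and the CPU needs cpu_ms between frame submissions.

def present_times(n_frames, gpu_ms, cpu_ms, num_gpus=2):
    """Return presentation timestamps (ms) for a naive AFR pipeline."""
    gpu_free = [0.0] * num_gpus          # when each GPU next becomes idle
    cpu_ready = 0.0                      # when the CPU can submit again
    times = []
    for i in range(n_frames):
        g = i % num_gpus                 # alternate-frame rendering
        start = max(gpu_free[g], cpu_ready)
        done = start + gpu_ms
        gpu_free[g] = done
        cpu_ready = start + cpu_ms       # CPU moves on after submitting
        times.append(done)
    return times

def frame_gaps(times):
    return [round(b - a, 2) for a, b in zip(times, times[1:])]

# GPU-limited: CPU submits every 2 ms, each GPU needs 20 ms per frame.
# Both GPUs start almost together, so frames arrive in uneven pairs
# (gaps alternate 2 ms / 18 ms - classic micro-stutter pattern).
print(frame_gaps(present_times(8, gpu_ms=20, cpu_ms=2)))

# CPU-limited: the CPU needs 15 ms per frame, more than half the GPU
# time, so presentation locks to the CPU's even 15 ms cadence.
print(frame_gaps(present_times(8, gpu_ms=20, cpu_ms=15)))
```

The same logic is why vsync helps: anything that forces the GPUs to wait between frames re-syncs their output to a regular clock.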
 
Very good & I always said 3-way was the sweet spot.

Another thing: they should have done 3-way SLI as well, because that may have reduced M/S like it did for CF.

And lastly, that explains why I have not seen major M/S: I have been exclusively on TriFire & Quadfire, which gives minimum M/S, & on top of that V-synced.
 
Could be due to a lot of things, one potentially being that it's CPU limited with that much rendering power, which always reduces micro-stutter. Also, due to the way tri and quad rendering works, you can be trading micro-stutter for increased input latency, so it's not as black and white as it seems and may not actually be a good thing.

The graphs would tend to indicate CPU limit being the factor in the tests they ran as the tri-card results are pretty much pegged at the peak dual card performance.

Which means a slight overkill & leeway on the GPUs is a good thing.
 
Source: the comments on the review page

alangeering 24/08/2011 12:19

I'd like to first thank Igor and Greg for a very insightful article, and for discussing the not-often-talked-about phenomenon of stuttering.

There's one thing I'd like to expand upon.

A few times in the article the observation is made that while dual-GPU scaling is good, the stuttering effect is bad. No real point is made that when scaling is poor, stuttering is less pronounced. It's precisely because three cards aren't as efficient that stuttering is reduced. Bear with me and I'll explain.

For the following thought experiment I've used the data from the Call of Juarez graph on the page called "Step 2: Crossfire with three GPUs".

Three situations:
A: 1 card @ 70 fps average
B: 2 cards @ 135 fps average
C: 3 cards @ 160 fps average

In other words:
A: The card takes an average of 14.3 ms to produce each frame.
B: Each card has 14.8 ms to produce its frame to maintain the average.
C: Each card has 18.8 ms to produce its frame to maintain the average.

Look again at the data from Call of Juarez. The lowest frame rate recorded for the single card is 60 fps, or 16.7 ms per frame. That is longer than the 14.8 ms budget needed to avoid delaying/stuttering the pipeline in situation B, but well within the 18.8 ms time frame for the 3-card setup in situation C. As frames are now arriving in time for use, the evidence of stuttering is reduced.

So efficiency is good; but inefficiency in scaling allows each card a little longer to provide its frame, and the eventual combined frame rate is less variable.
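That budget arithmetic is easy to check. A minimal sketch (`per_gpu_budget_ms` is my own naming for illustration, not anything from the article):

```python
# Per-GPU frame-time budget under alternate frame rendering (AFR):
# with N GPUs averaging F fps overall, each GPU must finish its frame
# within N / F seconds to keep frames arriving on schedule.

def per_gpu_budget_ms(num_gpus: int, avg_fps: float) -> float:
    return 1000.0 * num_gpus / avg_fps

# The Call of Juarez numbers quoted above:
for n, fps in [(1, 70), (2, 135), (3, 160)]:
    print(f"{n} card(s) @ {fps} fps -> {per_gpu_budget_ms(n, fps):.1f} ms per GPU")
# 1 card(s) @ 70 fps -> 14.3 ms per GPU
# 2 card(s) @ 135 fps -> 14.8 ms per GPU
# 3 card(s) @ 160 fps -> 18.8 ms per GPU

# The slowest single-card frame took 16.7 ms (60 fps minimum): over the
# 14.8 ms two-card budget, but inside the 18.8 ms three-card budget.
```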

A quote from the article:
"This phenomenon manifests itself even more seriously in CoJ. While CrossFire scales well under load, it becomes even more susceptible to micro-stuttering."

And another:
"For some reason, the third GPU almost always eliminates micro stuttering and has a less-pronounced effect on performance."

You got so close; it just needed another jump of statistical thinking. Efficiency correlates with stuttering (NVIDIA and AMD) and there is a logical reason why.
alangeering 24/08/2011 12:37

The above post isn't trying to explain why microstuttering occurs - only why it's more pronounced as multi-GPU scaling efficiency increases (and less so as it decreases).

And on top of that the scaling on the ATI 5xxx is less efficient than the AMD 6xxx.
 
Thanks, nice read. Also, as we can see, it still is a problem, and it's why I still avoid CrossFire and SLI and always buy the top card I think is worth buying at the time. Seems I will never bother with SLI or CrossFire until they cure these problems 100%. Any form of stuttering would drive me nuts.
 
And on top of that the scaling on the ATI 5xxx is less efficient than the AMD 6xxx.

I thought that was known already.

I can't say I have noticed any micro-stuttering when using Crossfire/SLI. What constitutes a micro-stutter as opposed to just a normal framerate dip as you would get with a single card?
 
I can't say I have noticed any micro-stuttering when using Crossfire/SLI. What constitutes a micro-stutter as opposed to just a normal framerate dip as you would get with a single card?

Microstutter is badly named: really it's just irregular framerate output. Unless you're running at low framerates (< 30 fps) then it doesn't appear as a stutter at all. It simply makes your game seem less smooth than you would expect from the framerate. For example, suppose you were playing at 40fps with a single GPU card; to get the same in-game smoothness with a multi-GPU setup experiencing moderate amounts of microstutter, you might need 50fps. So it's not something you notice by just looking at your game - in essence it silently eats away performance :p

I suppose I might as well post my microstutter summary at this point:

* Microstutter is the name given to irregular frame output. With two or more GPUs working in the "alternate frame rendering" (AFR) mode that both Nvidia and AMD use, frames are not always output evenly. You can get one frame, then a very short gap until the next frame, then a longer gap until the next one, and so on.

* This effect makes the game seem less smooth for a given framerate than if the frames were output evenly.

* Unless the framerate is very low, so that you can see individual frames, you won't be able to look at a game scene and say "hey this is microstuttering!" - it simply makes your game scene appear less smooth for a given framerate. Or, conversely, you need a slightly higher framerate for the game to seem as smooth as with a single GPU. This is the main reason for so much misunderstanding about microstutter.

* The amount of microstutter can vary significantly from game to game. But, in most circumstances that it occurs, you're looking at an effective reduction in smoothness equivalent to around 10-25% in comparison to a regular frame output (see the thread I linked to for quantitative details).

* Bear in mind that the amount of performance you will gain from adding a second card will almost always be much larger than the 10-25% effective drop from microstutter. In almost every case, you DO get an improvement in smoothness from adding a second card. So, it's well worthwhile.

* ...The real question comes when considering dual-GPU setups of low-end cards when a single high-end card can offer similar performance. In these cases, the effective value of the dual-card setup is somewhat less than benchmarks may lead you to believe (since they only measure the raw number of frames output and take no account of "smoothness" effects caused by irregular frame output). In these cases it is often better to consider a single higher-end GPU.

* Microstutter disappears almost entirely whenever the GPU is made to wait between frames. In these circumstances the output from the GPUs syncs up to the regular output of whatever is holding them back. The two most common circumstances where this occurs are: 1) when vsync is enabled, 2) when the CPU is limiting the framerate (again, see that thread for more quantitative details).

For more detail, and a program to measure the amount of microstutter in a FRAPS benchmark, see this thread.
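To make "irregular frame output" concrete, here is a simplified, hypothetical metric in the same spirit. It is not the actual measuring program from the linked thread; `effective_fps` and the average-plus-deviation penalty are my own illustration of the idea that perceived smoothness tracks the longer gaps.

```python
# Hypothetical microstutter metric: compare raw average fps against an
# "effective" fps that penalises irregular per-frame times.

def effective_fps(frame_times_ms):
    """frame_times_ms: per-frame render times in ms, FRAPS-style."""
    n = len(frame_times_ms)
    avg = sum(frame_times_ms) / n
    # Mean absolute deviation of frame times measures irregularity.
    mad = sum(abs(t - avg) for t in frame_times_ms) / n
    raw_fps = 1000.0 / avg
    # Treat the effective frame time as average + deviation, since the
    # longer gaps dominate how smooth the output feels.
    eff_fps = 1000.0 / (avg + mad)
    return raw_fps, eff_fps

# Perfectly even 20 ms frames: no penalty.
print(effective_fps([20.0] * 6))        # -> (50.0, 50.0)

# Classic AFR pattern: alternating 10 ms / 30 ms frames. Same raw
# 50 fps, but the irregularity costs about a third of the smoothness.
print(effective_fps([10.0, 30.0] * 3))  # -> (50.0, ~33.3)
```

This matches the summary above: identical average framerate, noticeably lower effective smoothness once the frame gaps become uneven.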
 
I thought that was known already.

Yes, it's known in the context of being worse, but in light of these findings it's a good thing.
So if you're building right now and able to pick up some 5xxx cards (2GB versions preferably), that may be the better idea; don't worry about someone saying "no, get the 6xxx, they have better scaling".
 