Microstutter is not something that you will "notice" just by looking at the screen unless the framerate is already low (low enough that you can catch the irregularity in frame output), but that doesn't mean it has no effect. At higher framerates, irregular frame output shows up as an apparent reduction in smoothness, since the key factor in fooling the eye into believing a series of pictures is a fluid moving image is the maximum gap between consecutive frames. The net effect is similar to simply dropping the framerate a little.
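To make that concrete, here's a toy illustration in Python (the numbers are invented, purely for illustration): with an uneven alternating frame pattern, the average framerate looks healthy, but the longer gaps are what set the perceived smoothness.

```python
# Toy illustration with invented numbers: an AFR setup producing a short
# gap, then a long gap, repeatedly. The average framerate looks healthy,
# but perceived smoothness tracks the longest gaps between frames.
frame_times_ms = [5, 28, 5, 28, 5, 28]  # hypothetical per-frame gaps

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst_gap_fps = 1000 / max(frame_times_ms)

print(f"average:   {avg_fps:.1f} fps")     # ~60.6 fps on paper
print(f"worst gap: {worst_gap_fps:.1f} fps")  # feels closer to ~35.7 fps
```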
Anyway, I've written loads of stuff on these forums and others about microstutter (see this thread) if you want to search for it. I also wrote a program to quantify the amount of microstutter in a benchmark taken with FRAPS, so you can find out for yourself easily enough (there's a sketch of that kind of analysis at the end of this post). But anyway, in short:
* Microstutter is the name given to irregular frame output. With two or more GPUs working in the "alternate frame rendering" (AFR) mode that both Nvidia and AMD use, frames are not always output evenly: you can get one frame, then a very short gap until the next, then a longer gap until the one after that, and so on.
* This effect makes the game seem less smooth for a given framerate than if the frames were output evenly.
* Unless the framerate is very low, so that you can see individual frames, you won't be able to look at a game scene and say "hey this is microstuttering!" - it simply makes your game scene appear less smooth for a given framerate. Or, conversely, you need a slightly higher framerate for the game to seem as smooth as with a single GPU. This is the main reason for so much misunderstanding about microstutter.
* The amount of microstutter can vary significantly from game to game. But, in most circumstances where it occurs, you're looking at an effective reduction in smoothness of around 10-25% compared to a perfectly regular frame output (see the thread I linked to for quantitative details).
* Bear in mind that the amount of performance you will gain from adding a second card will almost always be much larger than the 10-25% effective drop from microstutter. In almost every case, you DO get an improvement in smoothness from adding a second card. So, it's well worthwhile.
* ...The real question arises when considering dual-GPU setups of low-end cards where a single high-end card can offer similar performance. In these cases, the effective value of the dual-card setup is somewhat less than benchmarks may lead you to believe, since benchmarks only measure the raw number of frames output and take no account of the "smoothness" effects caused by irregular frame output. Here it is often better to consider a single higher-end GPU.
* Microstutter disappears almost entirely whenever the GPUs are made to wait between frames. In these circumstances their output syncs up to the regular cadence of whatever is holding them back. The two most common circumstances where this occurs are: 1) when vsync is enabled, 2) when the CPU is limiting the framerate (again, see that thread for more quantitative details; the toy model below shows the vsync case).
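Here's a quick toy model of the vsync case (invented numbers; it assumes the GPUs keep pace with a 60 Hz refresh): once frames can only be shown on the fixed refresh grid, uneven completion times collapse into perfectly even output.

```python
# Toy model of why vsync suppresses microstutter (invented numbers).
# Frames can only be displayed on the fixed refresh grid, so uneven GPU
# completion times collapse onto evenly spaced vsync intervals, provided
# the GPUs keep up with the refresh rate.
REFRESH_MS = 1000 / 60  # 60 Hz display

gpu_done_ms = [5, 33, 38, 66, 71, 99]  # uneven AFR completion times
displayed = []
next_vsync = 0.0
for t in gpu_done_ms:
    while next_vsync < t:          # frame waits for the next refresh
        next_vsync += REFRESH_MS   # after it's finished rendering
    displayed.append(next_vsync)
    next_vsync += REFRESH_MS       # at most one frame per refresh

gaps = [b - a for a, b in zip(displayed, displayed[1:])]
print([f"{g:.1f}" for g in gaps])  # every gap is now exactly 16.7 ms
```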
My tests found microstutter in a wide range of multi-GPU setups. The amount varies, but it typically takes an effective 10-25% off the "true" framerate. Unsurprisingly, triple and quad GPU setups suffer from considerably more microstutter.
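For anyone who wants to try this on their own logs, here's a rough sketch of that kind of analysis. It assumes FRAPS's "frametimes" CSV format (frame number, cumulative time in ms), and the metric shown, the mean difference between consecutive gaps relative to the mean gap, is just one plausible way to quantify gap irregularity rather than a definitive method. The file name is hypothetical.

```python
# Rough sketch of frametime analysis, assuming FRAPS's "frametimes" CSV
# format (frame number, cumulative time in ms). The metric below is one
# plausible way to quantify microstutter, not a definitive method.
import csv
import statistics

def frame_gaps(path):
    """Return per-frame gaps (ms) from cumulative FRAPS timestamps."""
    with open(path, newline="") as f:
        rows = csv.reader(f)
        next(rows)  # skip the header row
        times = [float(row[1]) for row in rows]
    return [b - a for a, b in zip(times, times[1:])]

def microstutter_index(gaps):
    """0 means perfectly even output; larger means more microstutter."""
    mean_gap = statistics.mean(gaps)
    return statistics.mean(abs(b - a) / mean_gap for a, b in zip(gaps, gaps[1:]))

gaps = frame_gaps("benchmark_frametimes.csv")  # hypothetical file name
print(f"average framerate:  {1000 / statistics.mean(gaps):.1f} fps")
print(f"microstutter index: {microstutter_index(gaps):.3f}")
```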