
Is micro stuttering still a problem?


I currently have a GTX 560 Ti and I'm wondering whether I should upgrade to a GTX 580 or buy another GTX 560 Ti. Do SLI setups still suffer from microstutter? I am very sensitive to that sort of thing and would prefer to give up 10-20 FPS if it meant having a smooth frame rate. So please, no fanboy rubbish, just straight answers. If there are any articles proving it is gone, that would be good too.

Thanks.
 
Custom PC have an article in the September 2011 issue looking for the fastest hardware.

As part of the GPU test they looked at the HD 6990 and GTX 590.

They concluded by saying "Unfortunately neither dual-GPU card could play Bad Company 2 smoothly; both cards still exhibited lots of annoying micro-stuttering despite good average frame rates".

They then tested a pair of GTX 580s to see if that would make a difference. These also suffered from lots of micro-stuttering.
 
Is micro-stuttering something that only some people can see?

I never notice it while gaming with my SLI setups :confused:

I don't notice it with my Crossfire setup either.

Perhaps, as the OP says, some people are very sensitive to it.

I just quoted the article so that the OP is aware of the situation.
 
It also depends on the setup.

Two setups with the same cards can have different microstutter characteristics.

I did notice some in BFBC2 on the 10.5 drivers, but checking with Afterburner they didn't seem to be using QuadFire properly. It was a bit like when COD4: MW came out and had microstutter with multiple 3870s until the drivers fixed it.
But now I don't notice any microstutter in BFBC2, and I even play it with Vsync off at times.
 
Microstuttering happens when the FPS drops way below 40.
If you run above that, it works just fine.
I run 6850/6870 Crossfire in BC2 at 5040x1050, averaging around 100 FPS or so, and I've never seen it.
I would likely see it if I cranked up the settings, but why do that?
FPS games aren't about eye candy.
 
^^ Microstutter happens all the time; it's just below human perception most of the time, until you drop below ~30 FPS.

I've been using multi-GPU setups since forever, understand microstutter (sort of), am the kind of person who sees a benefit from 120Hz panels, and am a former semi-pro competitive Quake player, so I'm fairly sensitive to inconsistent rendering. Even so, I've not seen much in the way of true, noticeable microstutter in a long time with properly set up SLI systems, and it's mostly not an issue with properly set up Crossfire either, though I've seen some issues with 58xx CF in some games like Metro 2033. A lot of things are incorrectly attributed to microstutter when they're not actually classic microstutter. The only times I've noticed microstutter with my GTX 470 SLI setup are in the Heaven benchmark and when using the SLI setup to do both rendering and PhysX in Mafia 2 - using a dedicated GPU for PhysX, or CPU PhysX, resulted in a uniform enough framerate output that microstutter wasn't an issue.

Microstutter was only really an issue with early 38xx CF setups and some CrossfireX setups with badly mismatched cards; both scenarios were fixed with driver updates. Unfortunately, a certain part of the community felt the need to tar all multi-GPU setups with the microstutter brush at the time, which wasn't helped by some designs of the 9800GX2 and early 8800GT SLI suffering stuttering issues due to cooling problems. The only setup since then that has had real issues with microstutter is the 4870X2, which I think was down to AMD trying to do something "new" with multi-GPU rendering early on and then deciding it wasn't worth it; the hardware still suffers some legacy issues because of it.
 
^^ Not all the time. The only obvious microstuttering I saw was in the opening scene of the Napoleon: Total War land battle tutorial. It happened whenever I upgraded Catalyst to 10.5 or later (so if I stick with 10.4 or earlier, I don't have that problem).
 
Microstutter is not something that you will "notice" just by looking at the screen, unless the framerate is already low (low enough that you can catch the irregularity in frame output), but that doesn't mean it has no effect. When the framerate is higher, the effect of irregular frame output is an apparent reduction in smoothness (since the key factor in fooling the eye into believing a series of pictures is a fluid moving image is the maximum gap between frames). The end result is similar to just dropping the framerate a little.
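To put some made-up but representative numbers on that: frames arriving at alternating gaps of 8ms and 25ms average 16.5ms per frame - about 60 FPS on a frame counter - but the eye keys on the 25ms gaps, which feel closer to a steady 40 FPS.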

Anyway, I've written loads of stuff on these forums and others about microstutter (see this thread) if you want to search for it. I also wrote a program to quantify the amount of microstutter in a benchmark taken with FRAPS, so you can find out for yourself easily enough. Anyway, I'll just quote one of my previous posts but there is plenty more around if you're interested.


I've written more than I ever intended about microstutter (see this thread for example, or search for any number of lengthy discussions on these forums). But, in short:

Microstutter is the name given to irregular frame output. With two or more GPUs working in the "alternate frame rendering" (AFR) mode that both Nvidia and AMD use, frames are not always output evenly. You can get one frame, then a very short gap until the next frame, then a longer gap until the next one, and so on.

* This effect makes the game seem less smooth for a given framerate than if the frames were output evenly.

* Unless the framerate is very low, so that you can see individual frames, you won't be able to look at a game scene and say "hey this is microstuttering!" - it simply makes your game scene appear less smooth for a given framerate. Or, conversely, you need a slightly higher framerate for the game to seem as smooth as with a single GPU. This is the main reason for so much misunderstanding about microstutter.

* The amount of microstutter can vary significantly from game to game. But, in most circumstances that it occurs, you're looking at an effective reduction in smoothness equivalent to around 10-25% in comparison to a regular frame output (see the thread I linked to for quantitative details).

* Bear in mind that the amount of performance you will gain from adding a second card will almost always be much larger than the 10-25% effective drop from microstutter. In almost every case, you DO get an improvement in smoothness from adding a second card. So, it's well worthwhile.

* The real question comes when considering dual-GPU setups of low-end cards, when a single high-end card can offer similar performance. In these cases, the effective value of the dual-card setup is somewhat less than benchmarks may lead you to believe (since they only measure the raw number of frames output and take no account of "smoothness" effects caused by irregular frame output). In these cases it is often better to consider a single higher-end GPU.

* Microstutter disappears almost entirely whenever the GPU is made to wait between frames. In these circumstances the output from the GPUs syncs up to the regular output of whatever is holding them back. The two most common circumstances where this occurs are: 1) when vsync is enabled, 2) when the CPU is limiting the framerate (again, see that thread for more quantitative details).

My tests found microstutter in a wide range of multi-GPU setups. The amount of microstutter varies, but typically takes an effective 10-25% from the "true" framerate. Unsurprisingly, triple and quad GPU setups suffer from much higher amounts of microstutter.
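Duff-man's actual program isn't posted in this thread, but for anyone who wants to try the idea on their own logs, below is a minimal sketch of one way to do it in Python. It assumes a FRAPS frametimes CSV (a header line, then one cumulative timestamp in ms per frame); the pair-wise "longest gap" metric is an illustrative assumption, not necessarily the same maths as his tool.

Code:
# microstutter_sketch.py - a rough illustration, not duff-man's actual tool.
# Assumes a FRAPS "frametimes" CSV: one header line, then "frame, cumulative ms".
import csv
import sys

def load_gaps(path):
    """Return the per-frame gaps in ms from a FRAPS frametimes CSV."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    stamps = [float(row[1]) for row in rows[1:]]  # skip the header line
    return [b - a for a, b in zip(stamps, stamps[1:])]

def microstutter_percent(gaps):
    """Estimate the effective framerate loss from irregular frame output.

    Perceived smoothness tracks the longer gap in each adjacent pair of
    frames (the classic AFR short/long pattern), so compare the plain
    average gap against the average of those worst-case gaps.
    """
    worst = [max(a, b) for a, b in zip(gaps, gaps[1:])]
    avg = sum(gaps) / len(gaps)
    avg_worst = sum(worst) / len(worst)
    return 100.0 * (1.0 - avg / avg_worst)

if __name__ == "__main__":
    gaps = load_gaps(sys.argv[1])
    fps = 1000.0 / (sum(gaps) / len(gaps))  # recorded average FPS
    loss = microstutter_percent(gaps)
    print(f"recorded: {fps:.1f} fps, microstutter: {loss:.1f}%")
    print(f"perceived: roughly {fps * (1 - loss / 100):.1f} fps")

A perfectly even log gives 0%; the more the gaps alternate between short and long, the higher the figure climbs. It's the same flavour of "perceived vs recorded" comparison as the 4-7% figures quoted further down.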
 

Nice explanations there! Thanks!
 
Yeah, I used duff-man's program on my setup, testing some of the games I play regularly. Aside from Heaven and a couple of others, in most games the level of microstutter was in the 4-7% range, which really isn't noticeable when you're talking a perceived 65 FPS against a recorded 68 FPS.
 
Well, I have done some reading around, and as much as I would rather save my money and get more "performance" from two 560s in SLI, I just don't want to risk it. I am extremely sensitive to any sort of lag, slow refresh rate, etc. - always have been. I think I will just get the fastest single-GPU card that I can. Seeing as I am mainly upgrading for BF3, and BC2 appears to suffer from a lot of microstutter problems, I don't want to risk it considering it's the same "engine" so to speak (Frostbite > Frostbite 2).

On top of that, I can't be bothered with all the profiles and late drivers. I love taking part in game betas and so on, and it would annoy me having to wait weeks or months for new drivers to get my SLI setup working properly with every new game that is released.

A single-GPU setup appears to be a safer, hassle-free option. It might cost more overall, but if I spend another 170 quid on a 560 Ti and I do notice microstutter, it will be money down the drain; at least if I buy a GTX 580 I am on a safe bet.

Thanks greatly for the replies guys :).
 
Running BFBC2 on SLI 470s here, no problems with microstutter. Been playing the BF3 alpha trial; SLI isn't utilised in it, and I'm getting 40 FPS or so.
 

Interesting.

I have one GTX 460 and an i7-2600K at 4.8GHz. Most games run smoothly at 60 FPS, and some even up to 140 FPS at 1920x1200, Vsync off.

Should I get another GTX 460 or upgrade to a 580? My PC is water-cooled, so I need to think about the cost of the water blocks as well :/
I think 2 x 460 is slightly better than 1 x 580 on paper, but what will it be like in reality? :D
 

Only do so if you can find another 460 for cheap; with only 1GB of VRAM per GPU you would see only partial benefit, with lag spikes/stuttering from VRAM shortage.
 
GTX 460 SLI user here! Benchmarks and experiments aside, two cards in SLI absolutely smoke a single card in general. Don't let this put you off; adding a second card is a great-value upgrade. I haven't been aware of any issues so far. I think some people blame other common issues on microstuttering, such as bad mouse acceleration, driver problems, etc. It's not a well-understood phenomenon.
 
Hey, I have the 2GB version, so I'm watching one on Amazon for about £155. Since March the price hasn't changed much. The water block will be some extra cost, about £80, and I will need the backplate as well, for the RAM :/ The total cost will be around £250-300 :/
 
"Should I get another GTX 460 or upgrade to a 580?"

If you're getting good performance right now, I'd be tempted to hold off until the 28nm cards arrive. Should be a pretty big jump in performance this time around.

But if you want to upgrade now... well, in terms of raw performance, 460 SLI vs. a single 580 is pretty similar [see here], but you will get smoother gameplay with a single 580. It's a more expensive option, though, so it's hard to say if it's worth it. If it were me, I would just wait for the next gen - if you're going to watercool a card, you want it to have a decent shelf-life.
 