Tempted by some cheap GTX460 SLI action? Might want to check this out first before laying down your cash.

This. I see no complaints of stuttering with 460 SLI here or anywhere else, so the thread title has no relevance other than to bait people. He picks a video off YouTube using a setup and resolution that 99% of 460 users will never use, and who's to say the 460s weren't over their 1GB buffer, causing the hitching? Here's a video showing 460 SLI in action with Crysis, butter smooth. Fail thread.
Agreed. It's old news that multi-GPU can have stuttering problems compared to a single GPU, and the problem isn't limited to the GTX460; it affects ATI cards too (if not more so), so he should stop trying to make it sound like an "Nvidia issue".

Also, by using an unrealistically high-res settings example that nobody games at with that level of card to exaggerate the problem, and by dropping subtle messages of "Nvidia sucks balls, ATI owns its arse", he's just making himself seem like a troll/attention seeker. Going by his logic, nobody should bloody get a 5970 or CF 5850/5870... unless he thinks ATI cards are immune to the micro-stuttering problem.
 
Tbh I can see a tiny amount of stutter in that video. If that's what all the fuss is about then I'm kind of shocked... it's hardly a deal breaker... lol.

At 60 FPS it's not much of an issue, sure, but it's still present and gets much uglier when the FPS drops. Much better to get a faster single GPU that's smoother and cheaper, like a 470 or 5850, IMHO.
 
Agreed. It's old news that multi-GPU can have stuttering problems compared to a single GPU, and the problem isn't limited to the GTX460; it affects ATI cards too (if not more so), so he should stop trying to make it sound like an "Nvidia issue".

Also, by using an unrealistically high-res settings example that nobody games at with that level of card to exaggerate the problem, and by dropping subtle messages of "Nvidia sucks balls, ATI owns its arse", he's just making himself seem like a troll/attention seeker. Going by his logic, nobody should bloody get a 5970 or CF 5850/5870... unless he thinks ATI cards are immune to the micro-stuttering problem.

WTF? Have you actually read my posts?
Looks like I've got a militant Nvidiot biting on the end of the line...
 
WTF? Have you actually read my posts?
Looks like I've got a militant Nvidiot biting on the end of the line...
You've got nobody to blame but yourself for wording the title like that, deliberate or not. Regardless of the content of the thread, it is troll bait, as simple as that. Like a few people have already mentioned, you should have titled it as a general micro-stutter issue rather than making it sound like a problem specific to GTX460 SLI.

Regardless of your intention, the title and the video in the first post pretty much kill the meaningfulness/usefulness of this thread, as they don't reflect the actual conditions and settings that people use for gaming. Also, no matter how good the presentation and its contents are, a misleading title that doesn't reflect the root of the issue pretty much makes it pointless.
 
You've got nobody to blame but yourself for wording the title like that, deliberate or not. Regardless of the content of the thread, it is troll bait, as simple as that. Like a few people have already mentioned, you should have titled it as a general micro-stutter issue rather than making it sound like a problem specific to GTX460 SLI.

Regardless of your intention, the title and the video in the first post pretty much kill the meaningfulness/usefulness of this thread. No matter how good a presentation is, a misleading title pretty much makes it pointless.

It's not a misleading title, for the simple fact that it's an issue that DOES affect 460 SLI, which is currently a very popular choice for the uninformed.
My first post mentioned the issue being with both SLI and CF setups, so I think you and your pal Raven need to take a step back for a moment.

And please, try not to be so brand sensitive...
 
And please, try not to be so brand sensitive...
I do apologise if I came down on you a bit hard, but while you might not have noticed, lots of ATI fanboys on this forum like to trash Nvidia every chance they get. So instead of thinking I'm oversensitive, it might do you well to word your posts as neutrally as possible and not give them the chance/excuse to troll.
 
It's not a misleading title, for the simple fact that it's an issue that DOES affect 460 SLI, which is currently a very popular choice for the uninformed.
My first post mentioned the issue being with both SLI and CF setups, so I think you and your pal Raven need to take a step back for a moment.

And please, try not to be so brand sensitive...

GTX 460 SLI is better than single cards like the 5870/470/5850 even with the micro-stuttering considered. You are not convincing anybody here.
 
I do apologise if I came down on you a bit hard, but while you might not have noticed, lots of ATI fanboys on this forum like to trash Nvidia every chance they get. So instead of thinking I'm oversensitive, it might do you well to word your posts as neutrally as possible and not give them the chance/excuse to troll.

Wow, stop sounding so butthurt. Even if people are "trashing" nVidia, it doesn't make your cards suddenly crap, does it? I seem to recall you being quite happy to do the same about ATi. Just, for a moment, compare the type of "bashing" you get from either side. From the nVidia boys it's generally "nVidia is the best, I love nVidia so much, they never have any problems, their drivers just work and they're great, ATi sucks and their drivers don't work, ever, under any circumstances, PhysX is amazing, it's so realistic, I love 3D because it's nVidia, and if you don't like 3D it's because you're against nVidia, CUDA is amazing too, even though I don't quite get what it's for".

When you get people trashing nVidia, you never see them saying "I love ATi so much, they're so great and amazing, nVidia suck, their cards are crap and their software is rubbish". What you do see is people making fun of the nVidia boys who go on about how much they love nVidia or how nVidia "just works", or saying they don't like nVidia as a company for the various dodgy things they've been involved with, and/or for treating their customers like idiots.

And if you think I'm exaggerating, it was only the other day that I saw someone going on about how much they love nVidia and how nVidia "just works". So yeah, in basic terms, when a large group of people go on about things that simply aren't true, it's to be expected that they'll be made fun of. I find the ATi bashing tiresome because it's always the same: "drivers don't work, ever", "inconsistent frame rates", "vague statement of how much better nVidia is that doesn't make any sense, which makes anyone with an ATi card a troll".

When nVidia stop being sneaky and rethink their ways (constantly massive GPUs and really high prices), I'd consider them again; but conveniently enough for me, ATi have had them beaten on value for money for a good few years now.
 
I agree that the thread title is a little on the edgy side, knowing how defensive people can be about a company :confused: but having read the thread I can see what the OP was going for, given the recent price cuts on the 460s. All those claiming this thread is biased against Nvidia obviously haven't read it very carefully, as it's stated many times that this isn't manufacturer-dependent.

I think this is valid information that might change the choices I make in the future, so thank you to the OP, not only for explaining microstutter so that even I could understand it, but also for pointing out the effect it can have on framerate.
 
Experience is subjective, and dependent on the game as well; some games are smooth at 30 FPS, some aren't.
I quite agree with this. I was playing the early part of the Crysis "Contact" SP level last night, with high settings, no anti-aliasing, at 1920x1080. At 60, 80, even 100 fps my experience was not good at all, and I'm perplexed as to why. With vsync on I get about 24 fps. Maybe I'm not setting up my game properly? But you are quite right.
 
GTX 460 SLI is better than single cards like the 5870/470/5850 even with the micro-stuttering considered. You are not convincing anybody here.

Maybe you're right about that, and it makes no difference to me if anyone is 'convinced' or not lol, but I personally would take a faster single GPU over two slower ones any day of the week in order to enjoy smoother gaming, minus the stutter and the occasional poor scaling.
 
Wow, stop sounding so butthurt. Even if people are "trashing" nVidia, it doesn't make your cards suddenly crap, does it? I seem to recall you being quite happy to do the same about ATi.
Well, you recall wrong. I've said before that the HD5000 series and the GTX400 series both have their own shares of merits and limitations, and that it's probably worth waiting to see how the cards from both camps improve from the 2nd-gen DX11 parts onward. I don't know what happened during the DX10 generation on this forum (I hadn't joined yet at that time), but ever since I joined early this year, I've seen lots of people one-sidedly kicking Nvidia like a dog... I was getting a bit tired of people trolling all the time.

People that own ATI cards complaining about ATI drivers makes sense; what is annoying is people (well, ATI card owners) attacking Nvidia over cards that they don't even own... WTH!? If they like ATI cards, then just get ATI cards... whether Nvidia cards are good or bad has nothing to do with them. It's like if you don't like someone: just walk away or ignore him; there's no need to go up to him and attack him with any weapon you can get your hands on like some sort of bully. Personally, I think this so-called ATI vs Nvidia war is as stupid and meaningless as the Xbox 360 vs PS3 console war.

Oh, and let's not forget the common rumour that Nvidia-optimised games equal crippled ATI performance. Just yesterday I was reading someone posting something like "Since Crysis II is going to be Nvidia optimised, it would probably kill my Crossfire performance"... :eek:

Just curious... which is more likely: Nvidia investing time and resources working with game developers to make the games run better on their hardware, or making the games run poorly on ATI hardware? I'm not familiar with the law, but surely they could get into trouble if it was the latter?
 
Just curious... which is more likely: Nvidia investing time and resources working with game developers to make the games run better on their hardware, or making the games run poorly on ATI hardware? I'm not familiar with the law, but surely they could get into trouble if it was the latter?

Umm, asking that question is going to start a big discussion and derail this thread.
I would suggest maybe starting a different thread if you feel the need, and doing some research beforehand.
 
Threw this together in a few mins (well, a couple of hours), so it's a bit rough and ready; the "microstutter" detection is very rudimentary at this point and I've probably screwed up some timing between local benchmark time and actual time somewhere :D

http://aten-hosted.com/files/msviz.exe

Same deal as the other proggie: run a Fraps frametime benchmark, rename it to input.csv, stick it in the same folder and run the exe. It gives a visual representation of the benchmark - the grey rotating cube and the rectangle that follows the mouse pointer are both governed by the frame updates in the benchmark, so you can see/feel it for yourself - and the proggie attempts to detect actual human-noticeable microstutter*. When I've got my head around the microstutter stuff properly I'll update it so you can run samples from single and multi GPUs side by side for comparison.

Escape quits.

EDIT: Sometimes it flashes up microstutter when you have very obvious stutter - that's because both a proper stutter/interrupt and microstutter occurred in the same sample(s).


* Where the dispersion of updates within a second noticeably skews the felt framerate from the "shown" framerate, i.e. not cases where single GPU and multi GPU would both show, say, 60fps but the multi GPU's "equivalent" is 57fps.
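For anyone who wants to poke at the same idea without the exe, here's a minimal Python sketch of the footnote's detection rule. The Fraps log layout (a header row, then one "frame number, cumulative milliseconds" line per frame) and the 0.85 skew threshold are assumptions on my part, not taken from the proggie:

Code:
import csv
from collections import defaultdict

def load_frame_times(path="input.csv"):
    # Assumed Fraps frametimes layout: header row, then "frame, cumulative ms" per line
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    return [float(r[1]) for r in rows[1:]]

def flag_microstutter(times_ms, skew=0.85):
    # Group frame intervals by the wall-clock second in which each frame landed
    buckets = defaultdict(list)
    for prev, cur in zip(times_ms, times_ms[1:]):
        buckets[int(cur // 1000)].append(cur - prev)
    for sec in sorted(buckets):
        intervals = buckets[sec]
        shown = len(intervals)          # roughly what an FPS counter would display
        felt = 1000.0 / max(intervals)  # pessimistic rate implied by the longest gap
        if felt < skew * shown:         # dispersion skews felt rate well below shown rate
            print(f"second {sec}: shown ~{shown}fps, longest gap implies ~{felt:.0f}fps")

flag_microstutter(load_frame_times())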
 
Last edited:
While I'm glad to see someone else putting effort into this, and a visualisation will be very useful, you need to be careful about how you quantify the microstutter. So based on what I have seen, here are a few pointers:


1. Make sure you are using frameTIME interval variation, rather than frameRATE variation, to quantify the microstutter, otherwise you introduce bias into your computations. Consider, for example, the worst-case limit of simultaneous frame output, i.e. when two frames are output together with a double gap afterwards. Here the apparent framerate is clearly half the shown framerate. BUT if you are taking frameRATE variations, the time between the two simultaneously output frames is zero, so the framerate variation is infinite, and your apparent framerate is computed as zero.

2. You need to compute a local frametime against which to quantify the frametime variation (or dispersion, as you put it). This can't be over a period such as a second, because a) there can be far too much natural variation over the course of a second, which will show microstutter where there is none, and b) that quantity scales with framerate. You need to average over a constant number of frames in either direction. I found that taking 4 to 7 frames on either side gave stable results. You can apply a hat filter (uniform weighting given to each included frametime), or you can apply a weighted function (Gaussian works well). If you apply a weighted function, use a slightly larger local stencil.

3. Your local frametime variation must be non-dimensional. That is, you need to divide the frametime variation by the frametime itself, to obtain a proportional value; a number which does not scale with increasing or decreasing framerate.

4. The apparent framerate at any instant in time is best represented by: A = FPS/(1+delta), where FPS is the smoothed local framerate, and delta is your *proportional* variation from the local mean (changing frame-by-frame). Delta will be a number between zero (no microstutter) and one (simultaneous frame output).

5. Since you are showing the benchmark in real-time, which is useful, you could keep a running average of the microstutter index (the delta above) and the apparent framerate (A above). These would be useful averaged over a second, so we can actually catch the numbers.

6. To nullify the issue of "real" stutter, cull the few largest frametime variations. This is more difficult to do "on the fly" than over a static benchmark, but you could (for example) ignore any consecutive frametimes where one frametime interval is more than twice the previous one. Under normal "hitching-free" scenarios this would never occur.


But anyway, nice work. It's good to have a visual aid to express microstutter, but just make sure you quantify the variation in a consistent, scalable and non-dimensional way, or the data is meaningless.
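To make those pointers concrete, here's a rough Python sketch of points 1-6. The function names, the ±5-frame stencil and the Gaussian sigma are illustrative picks of mine, not anything from the proggie:

Code:
import math

def local_mean(ft_ms, i, k=5, sigma=2.0):
    # Point 2: Gaussian-weighted local mean frametime over frames i-k .. i+k
    lo, hi = max(0, i - k), min(len(ft_ms), i + k + 1)
    weights = [math.exp(-((j - i) ** 2) / (2 * sigma ** 2)) for j in range(lo, hi)]
    return sum(w * ft_ms[j] for w, j in zip(weights, range(lo, hi))) / sum(weights)

def deltas(ft_ms):
    # Points 1, 3 and 6: proportional frameTIME variation, with real hitches culled
    out = []
    for i, ft in enumerate(ft_ms):
        if i > 0 and ft > 2.0 * ft_ms[i - 1]:
            continue  # interval more than doubled -> a real stutter, not microstutter
        m = local_mean(ft_ms, i)
        out.append(abs(ft - m) / m)  # non-dimensional: doesn't scale with framerate
    return out

def apparent_fps(ft_ms, i):
    # Point 4: A = FPS/(1+delta), with FPS the smoothed local framerate
    m = local_mean(ft_ms, i)
    delta = abs(ft_ms[i] - m) / m
    return (1000.0 / m) / (1.0 + delta)

As a quick sanity check of point 4: at a smoothed 60fps with delta = 0.5, A = 60/1.5 = 40fps, i.e. the counter says 60 but it feels like 40; and at delta = 1 (simultaneous frame output) the apparent framerate halves, exactly as point 1 says it should.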
 