
Inside the second: A new look at game benchmarking

New methods uncover problems with some GPU configs
by Scott Wasson — 6:08 PM on September 8, 2011

I suppose it all started with a brief conversation. Last fall, I was having dinner with Ramsom Koay, the PR rep from Thermaltake. He's an inquisitive guy, and he wanted to know the answer to what seemed like a simple question: why does anyone need a faster video card, so long as a relatively cheap one will produce 30 frames per second? And what's the deal with more FPS, anyway? Who needs it?

I'm ostensibly the expert in such things, but honestly, I wasn't prepared for such a question right at that moment. Caught off guard, I took a second to think it through and gave my best answer. I think it was a good one, as these things go, with some talk about avoiding slowdowns and maintaining a consistent illusion of motion. But I realized something jarring as I was giving it—that the results we provide our readers in our video card reviews don't really address the issues I'd just identified very well.

That thought stuck with me and began, slowly, to grow. I was too busy to do much about it as the review season cranked up, but I did make one simple adjustment to my testing procedures: ticking the checkbox in Fraps—the utility we use to record in-game frame rates—that tells it to log individual frame times to disk. In every video card review that followed, I quietly collected data on how long each frame took to render.

Finally, last week, at the end of a quiet summer, I was able to take some time to slice and dice all of the data I'd collected. What the data showed proved to be really quite enlightening—and perhaps a bit scary, since it threatens to upend some of our conclusions in past reviews. Still, I think the results are very much worth sharing. In fact, they may change the way you think about video game benchmarking.
http://techreport.com/articles.x/21516

It's also got micro-stuttering tests.
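For anyone curious what "logging individual frame times" actually gets you, here's a rough sketch of the kind of summary the article builds. This is purely illustrative, not the article's actual code: a Fraps frametimes log is roughly one cumulative millisecond timestamp per rendered frame, so per-frame times are the differences between consecutive entries, and the "time spent beyond a threshold" idea counts only the frames that actually hurt.

```python
import math

def frame_times(timestamps_ms):
    """Turn Fraps-style cumulative timestamps (ms) into per-frame times."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def percentile(values, pct):
    """Nearest-rank percentile; no external libraries needed."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100.0 * len(ordered)) - 1
    return ordered[rank]

def summarize(timestamps_ms, threshold_ms=50.0):
    times = frame_times(timestamps_ms)
    avg = sum(times) / len(times)
    return {
        "avg_fps": 1000.0 / avg,
        "99th_percentile_ms": percentile(times, 99),
        # Total time spent past the threshold on slow frames: one long
        # hitch shows up here even when lots of quick frames surround it.
        "ms_beyond_threshold": sum(t - threshold_ms for t in times if t > threshold_ms),
    }

# Toy log: sixty frames at a steady ~16.7 ms, with one 80 ms hitch.
log = [0.0]
for t in [16.7] * 30 + [80.0] + [16.7] * 29:
    log.append(log[-1] + t)

stats = summarize(log)
```

The average FPS for that toy run still looks healthy (around 56), which is exactly the article's point: averages bury the hitch, while the 99th-percentile frame time and the time-beyond-threshold numbers expose it.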
 
Interesting read. It's nice to see some feedback from AMD/Nvidia admitting they do know about the issue and aren't completely ignoring it. It also backs up the case for going with one stronger GPU rather than two if you can help it, although it'd be interesting to see a follow-up with Tri-SLI/TriFire to see whether they show similar stutter/lag reductions compared to Tom's review a month or two back.

That said, this report also makes SLI look worse than people would suggest, even though it's often recommended over Crossfire.
 
Nice to see someone finally putting time into investigating this in more detail; it's something very relevant, IMO, especially to multi-GPU.

    That said, this report also makes SLI look worse than people would suggest, even though it's often recommended over Crossfire.

I don't think either is _that_ bad for the most part. A lot of the time it's just that the perceived framerate is lower than the metered framerate; it's only really at extremes that jitter becomes noticeable, except in the odd application/game. AMD has a bias towards higher performance, which generally means higher perceived framerate/benchmark results but also means you're more likely to run into noticeable jitter if it's going to occur; Nvidia has a bias towards smoother performance at a small performance penalty, hence:

"third, in our test data, multi-GPU configs based on Radeons appear to exhibit somewhat more jitter than those based on GeForces"

and hence my posts a while back on the subject, which I took a lot of flak for *surprise*.
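A crude way to put a number on the perceived-vs-metered point: two runs can have identical average FPS while one of them alternates short/long frames the way AFR multi-GPU micro-stutter does. This is just my own illustrative sketch, not anything from the article, and the sample frame times are made up.

```python
def mean_frame_to_frame_delta(frame_times_ms):
    """Average absolute change between consecutive frame times (ms).

    Zero means perfectly even pacing; a large value means the frame
    times swing around, which is what AFR micro-stutter looks like.
    """
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

# Two runs with the SAME average frame time (20 ms, i.e. 50 FPS)...
steady   = [20.0] * 10        # single GPU, even pacing
stuttery = [10.0, 30.0] * 5   # AFR-style short/long alternation

print(mean_frame_to_frame_delta(steady))    # 0.0 - no jitter at all
print(mean_frame_to_frame_delta(stuttery))  # 20.0 - every frame swings 20 ms
```

Both runs benchmark at 50 FPS, but the second one is the kind that feels noticeably rougher than the metered number suggests.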
 
    I don't think either is _that_ bad for the most part. A lot of the time it's just that the perceived framerate is lower than the metered framerate; it's only really at extremes that jitter becomes noticeable, except in the odd application/game. AMD has a bias towards higher performance, which generally means higher perceived framerate/benchmark results but also means you're more likely to run into noticeable jitter if it's going to occur; Nvidia has a bias towards smoother performance at a small performance penalty, hence:

    "third, in our test data, multi-GPU configs based on Radeons appear to exhibit somewhat more jitter than those based on GeForces"

    and hence my posts a while back on the subject, which I took a lot of flak for *surprise*.

Oh, I'm not trying to say AMD are perfect at all here. I'm actually running a 5870M CrossFire setup (and luckily, going off Tom's report, the 5*** series aren't affected as badly as some), so I'm aware of what it does, and it is noticeable at times: usually in the sense that the frame rate just doesn't seem as smooth as the figures would suggest, or there's a slight judder you wouldn't expect :)

My comment was actually based off the 560 Ti results; I figure this is the sort of SLI setup a lot of mid-range enthusiasts would opt for, and it seems to be the worst hit in the tests, although it's a shame the article couldn't test with three cards, or alternatively a wider range of cards from different generations. The issue, as you state, is that the jitter becomes more apparent the lower the framerate, and unfortunately it's these mid-range enthusiast setups, both CrossFire and SLI, that are likely to be the most common due to relative cost differences, and that seem to bear the brunt of the issues :(
The report also suggests Nvidia's SLI is far from the golden child, compared to Crossfire, that some would suggest.

I know I probably come off as occasionally AMD-biased from my feedback around this forum when I join a conversation, but I'm actually fairly flexible. My last machine before this had Nvidia SLI, and I'm running Nvidia as a preference in several HTPCs, as I think both parties are as bad as each other in some ways! :)
 
Ah yeah, the closer you get to 30fps the more this is an issue. I use my 470 SLI setup to get very high framerates for my 120Hz TFT, so it's rare that it's not silky smooth :D

If you're aiming for a minimum fps above 45 or so, then multi-GPU does do the job pretty well; but if you're having to use multi-GPU just to get framerates above 30, or you're stuck in the 20-30s even with a high-end multi-GPU setup, then you're not really getting a huge real-world advantage over a single card. If you're getting ~30-40fps or so with a single card and using multi-GPU to get back to a fairly consistent 60fps, then it does mostly work pretty well. Higher mid-range cards usually work fine; lower mid-range can potentially be more of an issue.
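Those framerate thresholds are really frame-time budgets in disguise: frame time in ms is just 1000 / fps, so the headroom vanishes fast as you drop towards 30fps. A trivial sketch of the arithmetic (numbers are just the standard conversion, nothing from the article):

```python
def frame_budget_ms(fps):
    """Milliseconds available per frame at a given framerate."""
    return 1000.0 / fps

# At 30 fps every frame already takes ~33.3 ms, so any uneven pacing
# pushes individual frames well into visibly juddery territory; at
# 60 fps and above there's slack to absorb the variation.
for fps in (30, 45, 60, 120):
    print(f"{fps:3d} fps -> {frame_budget_ms(fps):5.1f} ms per frame")
```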
 