Why framerates don't matter: Frame Rating Dissected

Bit late to the party; lots of other reviewers have been doing frametime comparisons for quite a while now.

Anyway, it's all wrong because it shows 680 SLI performing better than 7970 Xfire, which everyone on here says is not possible :D

Or in other words, I like this review because it confirms my perception that AMD multi-card drivers are pretty broken... still :D

Edit: actually I take it back, he does appear to be using a method that others don't, which is new and interesting; not sure if it's more accurate though.
 
I stopped reading after "If you have a steady frame rate of 25 FPS you can still have an enjoyable experience (as evident by the 24 FPS movies we all watch at the theater)"

Though from a quick glance over the rest there's some interesting technical data, I can't take anyone seriously who makes such a claim about PC gaming, for these reasons alone, before even getting into the more complex stuff:

- You don't have input control in a movie, and the impact of a low framerate on the experience is much more noticeable when you're in control than when you're a passive observer.

- Movies smooth out motion due to blurring, and it doesn't work that way in games.


EDIT: Also, while I didn't read the data in depth, I'd have thought that capturing on the hardware side with a device that's ticking over at a fairly regulated 60Hz isn't going to be very good for low-level examination of frametimes.

EDIT2: While I don't want to put my foot in it, and don't have time to examine it in detail at the moment, there seems to be a huge number of flaws in that article from casual observation.
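To illustrate the granularity concern: here's a toy model (my own sketch, not from the article) of how a fixed 60Hz capture clock quantises frametimes. It deliberately ignores FCAT-style scanline analysis, which recovers some sub-frame detail, so treat it as a worst case:

```python
# Toy model (my own, not from the article): a capture card sampling at a
# fixed 60 Hz only "sees" a new game frame at the next capture tick, so
# frametime variation smaller than one tick can disappear entirely.
import math

CAPTURE_HZ = 60
TICK_MS = 1000.0 / CAPTURE_HZ  # ~16.67 ms between captured frames

def captured_frametimes(frametimes_ms):
    """Quantise a stream of frametimes to the ticks a 60 Hz capture sees."""
    t = 0.0
    tick_times = []
    for ft in frametimes_ms:
        t += ft
        # the frame first appears in the capture at the tick at/after it finishes
        tick_times.append(math.ceil(t / TICK_MS) * TICK_MS)
    # back to per-frame deltas, as a frametime tool would report them
    return [tick_times[0]] + [b - a for a, b in zip(tick_times, tick_times[1:])]

smooth = [16.0] * 6           # steady cadence
juddery = [10.0, 22.0] * 3    # alternating fast/slow, same average
# both collapse to an identical one-tick cadence after capture
print(captured_frametimes(smooth))
print(captured_frametimes(juddery))
```

Under this naive model a steady 16ms cadence and a 10/22ms judder are indistinguishable after capture, which is the poster's point about granularity.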
 
I've recently become a 120Hz vsync snob, so the thought of trying for an enjoyable gaming experience at 25fps just makes me cringe.
 
I stopped reading after "If you have a steady frame rate of 25 FPS you can still have an enjoyable experience (as evident by the 24 FPS movies we all watch at the theater)"
I normally stop reading at that same point in articles/claims.
Another popular one is that the human eye sees motion in frames/second - 30fps max to be precise.
 
Bit late to the party; lots of other reviewers have been doing frametime comparisons for quite a while now.

Yeah, I realise this, but the article seems very well written and explained to me, so it's worth a read.

I stopped reading after "If you have a steady frame rate of 25 FPS you can still have an enjoyable experience (as evident by the 24 FPS movies we all watch at the theater)"

You've never met a console gamer before? Lots of people have different tolerances for this, so dismissing the article because it doesn't fit your opinion is just stupid.
 
You've never met a console gamer before? Lots of people have different tolerances for this, so dismissing the article because it doesn't fit your opinion is just stupid.

Well, I think he meant the direct comparison of gaming FPS to TV/movie FPS, and while console games do dip to these levels, they are generally not at 24 FPS the entire time (wouldn't surprise me if some were, though).

Unfortunate really; I enjoy gaming on consoles and PCs, but the framerates in some games are just bad. But I guess if you've never seen better you'll never know :)
 
You've never met a console gamer before? Lots of people have different tolerances for this, so dismissing the article because it doesn't fit your opinion is just stupid.

Granted, a console game at mostly 30fps on a controller is going to be a different story, and less demanding on framerate for a good experience, than a PC game on keyboard and mouse, but the review is looking at high-end PC graphics cards so I don't think that's really relevant.

Point still stands though; IMO, from my experience, you need around double that movie framerate when you're interacting with the scene to get both the look and feel of fluidity that you get from passive observation.

EDIT: Taking some time to read through in a bit more detail, there's some interesting technical data in there, and some stuff that backs up what I've been claiming for a while on the subject, but there's also a few issues where the hardware capture just doesn't have the granularity to tell the whole story.
 
On page 11 they show that CrossFire with vsync enabled fixes the issues, as the runts are not present as long as you're not CPU limited.

There was a caveat though: "Turning on Vsync does help AMD's CrossFire performance but it isn't the final answer just yet." It helps, but it shifts some of the problems elsewhere. Looks like nVidia's frame metering, which is rarely ever talked about, is actually a fairly good solution.

As I mentioned, there's a fair few flaws in their testing, but this won't make good reading:

"Where AMD has definite issues is with HD 7970s in CrossFire, and our Frame Rating testing is bringing that to light in a startling fashion. In half of our tested games, the pair of Radeon HD 7970s in CrossFire showed no appreciable measured or observed increase in performance compared to a single HD 7970. I cannot overstate that point more precisely: our results showed that in Battlefield 3, Crysis 3 and Sleeping Dogs, adding in another $400+ Radeon HD 7970 did nothing to improve your gaming experience, and in some cases made it worse by introducing frame time variances that lead to stutter"
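For what it's worth, the idea behind frame metering can be sketched like this (my own simplification, not NVIDIA's actual implementation): a frame that finishes much earlier than the recent average interval gets held back before being presented, trading a little latency for an even cadence.

```python
# My illustration of the *idea* behind frame metering in alternate-frame
# rendering (not NVIDIA's actual algorithm): pace each frame's presentation
# to the average of the recent finish intervals, but never present a frame
# before it actually exists.

def meter(finish_times_ms, window=4):
    """Return presentation times paced to the average recent finish interval."""
    presented = [finish_times_ms[0]]
    for i in range(1, len(finish_times_ms)):
        lo = max(0, i - window)
        avg = (finish_times_ms[i] - finish_times_ms[lo]) / (i - lo)
        # hold back early frames; late frames go out as soon as they finish
        presented.append(max(finish_times_ms[i], presented[-1] + avg))
    return presented

# AFR runt cadence: the second GPU's frames land ~2 ms after the first GPU's
finish = [0.0, 2.0, 33.0, 35.0, 66.0, 68.0, 99.0, 101.0, 132.0, 134.0]
metered = meter(finish)
intervals = [round(b - a, 2) for a, b in zip(metered, metered[1:])]
print(intervals)  # settles to an even ~16.5 ms after a short warm-up
```

The 2ms/31ms alternation smooths out to a steady ~16.5ms presentation interval once the averaging window fills, which is roughly the effect the article attributes to NVIDIA's metering.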
 

I am sure they could have it running perfectly though. A few guys here would give them the needed tips to solve those issues.
 
try strafe jumping in many games with under 125 fps :p gg

I'll take the Pepsi test in any game from 25-60 fps. It's easy to notice.
 
This could explain why AMD are working on drivers that give you the choice of smooth gameplay or more fps.

Remember that the Observed FPS takes out runts and drops and is the result and benefit of measuring performance with our new Frame Rating system. Clearly the advantages that CrossFire appeared to have over SLI in the first result are nearly completely negated, and in many cases the observed performance of a two-card HD 7970 configuration is no better than that of a single card.

The same cannot be said for NVIDIA’s SLI though – the frames are presented to the gamer in a consistent pattern that indicates good scaling and that looks nearly identical to that of the first FRAPS-based graph.

Even though the GTX 680 is at the bottom of the pack in our percentile chart of minimum frame rates, the HD 7970 and HD 7970 CrossFire again show the lack of scalability of the CrossFire configuration. And it should give GeForce owners some pride that their single GTX 680 is barely slower than a pair of Radeon HD 7970s in real-world performance.

As I mentioned before, there are some cases where the 5760x1080 results for AMD CrossFire + Eyefinity were so bad, with so many dropped frames, that the Perl code in FCAT couldn’t produce a solid result. With Battlefield 3, this is one of those cases so you will not get any HD 7970s in CrossFire in our graphs here. What I can tell you though is that our video footage of the 57x10 EF+CF testing did show a dropped frame on every other color, telling me that the work of one of the GPUs was 100% thrown away and never shown to the gamer.

http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Test-4

[Image: run_stats__3.png]


Here again is another one of our RUN files to show you the effects of dropped frames on Eyefinity testing. FRAPS-based frame rates skyrocket, though the observed frame rate is much lower, in line with a single HD 7970 GHz Edition card. There are some runts involved in this, but the biggest factor is obviously the dropped frames (missing colors in our pattern).

[Image: run_stats__5.png]


Again, for comparison, here is the RUN graph for the GTX 680s running in SLI at 5760x1080. Notice that the frame rate is consistent with no drops or runts.
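The "observed FPS" idea quoted above can be sketched roughly like this (my simplification, not pcper's actual FCAT scripts): each rendered frame shows up as a band of scanlines in the captured 60Hz video; bands below a height threshold are runts, bands that never appear are drops, and both are excluded from the observed rate.

```python
# Rough sketch of observed-FPS counting (my simplification, not pcper's
# actual FCAT Perl scripts): frames that occupy too few scanlines in the
# captured video ("runts") or none at all ("drops") don't count, because
# the gamer never meaningfully saw them.

RUNT_THRESHOLD = 21   # scanlines; the real threshold is configurable

def observed_fps(scanline_heights, capture_seconds):
    """Count only fully-presented frames toward the observed frame rate."""
    drops = sum(1 for h in scanline_heights if h == 0)
    runts = sum(1 for h in scanline_heights if 0 < h < RUNT_THRESHOLD)
    shown = len(scanline_heights) - drops - runts
    return shown / capture_seconds, runts, drops

# A CrossFire-like pattern: every other frame is a sliver a few scanlines tall
heights = [540, 5, 540, 5, 540, 0, 540, 5] * 10   # 80 "frames" over 1 second
fps, runts, drops = observed_fps(heights, capture_seconds=1.0)
print(fps, runts, drops)  # FRAPS-style counting reports 80 fps; observed is 40
```

That gap between the 80 frames a FRAPS-style counter sees and the 40 the display actually shows is exactly the CrossFire mismatch the article is describing.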
 
nVidia seem to be wading in on this: http://blogs.nvidia.com/2013/03/with-our-new-tool-what-you-see-is-what-you-get/ Be interesting to see where this leads; also wondering if it's by accident that the timing coincides with the (proper) 7990.

Funny that Nvidia think people need to analyse frame rates differently, when I still (occasionally) get the stutter bug with the latest drivers in BF3.

I have turned off adaptive vsync and so far so good. Also, to my surprise, I have not seen any tearing in any game I have played yet... when my monitor is very sensitive to tears.

So that's a nice surprise.
 
Why framerates don't matter? More frames per second in the games I have played makes for a better experience, and they often run smoother. Mind you, this is with a single GPU, so yeah, framerates do matter; the more the merrier, I think. Maybe this is just SLI/CFX issues?
 
Why framerates don't matter? More frames per second in the games I have played makes for a better experience, and they often run smoother. Mind you, this is with a single GPU, so yeah, framerates do matter; the more the merrier, I think. Maybe this is just SLI/CFX issues?

Basically yes. What this article shows is that there is sometimes a mismatch between the framerates being reported by the graphics card and what gets displayed on the actual monitor... the biggest mismatch seems to be with CrossFire, where pretty much every other frame has a display issue and never gets fully displayed, effectively halving the actual frame rate on the display.

SLI suffers from this effect far, far less than CrossFire, according to this test method.
 
Funny that Nvidia think people need to analyse frame rates differently, when I still (occasionally) get the stutter bug with the latest drivers in BF3.

Both sides have had problems with vsync for quite some time.

I must admit that I'm quite lazy and tend to leave it on, but for people that have issues, on either side of the fence, knocking off render-ahead and vsync and setting an FPS limit instead seems to be a better solution.
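The FPS-limit alternative mentioned above boils down to something like this (a generic sketch; real limiters are more careful about timer resolution and usually busy-wait the last millisecond):

```python
# Minimal sketch of an FPS limiter (generic illustration, not any specific
# tool): each loop iteration sleeps until its time slot, so frames can't be
# submitted faster than the target rate and pacing stays even without
# vsync's render-ahead back-pressure.
import time

def run_capped(render_frame, target_fps=60, frames=120):
    interval = 1.0 / target_fps
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        deadline += interval
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)            # wait out the rest of the slot
        else:
            deadline = time.perf_counter()   # missed the slot; resync
```

Something like `run_capped(draw, target_fps=60)` in a game loop gives even frame submission without the input-lag cost of a deep render-ahead queue.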
 