
*** Stunning COD Advanced Warfare Benchmarks ***

You're using two separate display methods, neither of which you really know to be instantaneous or averaged.
The display in the bottom left even says min, max and average; none of those is a current/instant reading.

You can convert a frametime to an instantaneous fps reading; you know this, since from the above 120fps is equivalent to a constant 8.3ms frametime.

By definition, the minimum instantaneous fps recorded anywhere in a run HAS to correspond to the largest frametime: min fps = 1000 / max frametime (ms).
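As a minimal sketch of that conversion (the frametime values here are made up for illustration), instantaneous fps is just 1000 divided by the frametime in milliseconds:

```python
# Convert per-frame frametimes (ms) to instantaneous fps readings.
# The frametime values below are invented sample data.
frametimes_ms = [8.3, 8.4, 8.2, 16.7, 8.3, 33.3, 8.3]

instant_fps = [1000.0 / t for t in frametimes_ms]

# The minimum instantaneous fps corresponds to the largest frametime.
min_fps = 1000.0 / max(frametimes_ms)
print(round(min_fps, 1))  # 30.0 (from the 33.3 ms frame)
```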


When you convert a frame time to a frame rate you are assuming that frames will continue to be delivered at the same rate, but they are not. To get the frame rate from the frame times you need to know the exact number of frames that were rendered in that second and the time between each one.

It's not a repetitive signal, so it's not as simple as 1/t. What you are actually calculating is what the frame rate would be if frames continued to be delivered at the same ms interval.
 
Found the command I think you were talking about, Andy: perfoverlay.drawframegraph

So I gave this a try and my results were still different:

201FPS OSD - 1000/4.73 = 211FPS Render Time

[Screenshot: Frame times.png]

Another try!

221FPS OSD - 1000/4.83 = 207FPS Render Time
[Screenshot: Frame times1.png]
 

To create an average, yes, but by definition the MINIMUM frame rate experienced during the run should equal the fps equivalent of the largest frametime. If you are averaging over any given whole second, then that isn't the minimum, it is the lowest whole-second average... When you see frame rate graphs over time, are they averaged out per whole second or are they data points for every frame rendered? It is the latter.

Or, I'll remember your tip next time I'm pulled over by the police: "No officer, I wasn't doing 100 miles per hour, because it was only for a few minutes, not a whole hour."

As for hardware pal, I now know that he uses two different methods to record his frametimes and his min/max/avg. His min/max/avg is taken directly from FRAPS, which records one value per second, so it wouldn't matter if the game spent 75% of its time at 30fps; if it just happened to spike to 90fps at the top of every second, then the min and average would both be around 90fps.

No other review site uses FRAPS in this way, because it is so massively inaccurate.
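The per-second sampling problem can be sketched like this; the numbers are invented to mirror the 30fps/90fps example above:

```python
# Hypothetical per-frame fps over several seconds of a run: most frames
# crawl at ~30 fps, but a 90 fps spike happens to land exactly when a
# once-per-second counter takes its sample.
per_frame_fps = [30, 30, 30, 90] * 3   # three seconds of frames

# A once-per-second counter only sees the value at each sample point:
per_second_samples = per_frame_fps[3::4]
print(min(per_second_samples), max(per_second_samples))  # 90 90

# Per-frame measurement sees what the run actually did:
print(min(per_frame_fps), max(per_frame_fps))            # 30 90
```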
 

Then the BF4 fps counter is either averaged or suffers some kind of delay; either way that makes it unreliable. It isn't an instant current fps counter.
 

Hey andybird123, hey all you stunning guys and gals, etc. :)

I was a bit busy yesterday with Nvidia's new 344.60 driver, which is lacking to say the least. I tried to do my own little tweaks with a few custom profiles and AFR 1/AFR 2, but with no luck.

I will be waiting this out (hoping Nvidia release a new driver); hopefully by then AMD will release a CFX driver too.

The frame rate vs frame time discussion is something that's been going on for years (there are a few people far more knowledgeable on the subject than me, that's for sure), but I did put together a small formula that "kind of" describes playability/smoothness, and got a few emails from people not understanding what I meant:

"To be able to pass our playability test, the 99th percentile of frametimes must average at or lower than 16.7ms, with the worst 0.1 percent of frames not spiking above 50ms, and the game not exhausting all VRAM resources. In our own experience, these standards ensure no bottlenecks, visual lag or stuttering at 60Hz. For some users, higher or lower values may be tolerated depending on hardware setup and monitor refresh rate."
This was of course with Shadow of Mordor, so each game will feel different depending on your framerate, optimisation and drivers.

I will put emphasis on *higher or lower values may be tolerated depending on hardware setups and monitor refresh rates*: using an older monitor, when getting 50ms frametime spikes (0.1 percentile) I did feel even that split-second delay, while with my ASUS PB287Q I didn't. The 50ms of course depends on the framerate you're running: a drop from 150fps to, say, 50fps will be noticeable, while a drop from 40 to 27 might not.

There are too many variables when trying to calculate smoothness, I know: monitor refresh rate, Vsync on/off, monitor response time, etc.
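The quoted playability criteria could be sketched roughly like this; the nearest-rank percentile convention and the sample frametimes are my assumptions, not the reviewer's exact method:

```python
def percentile(values, pct):
    """Nearest-rank percentile of a list of frametimes (ms)."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(pct / 100.0 * len(ordered)) - 1))
    return ordered[k]

def passes_playability(frametimes_ms):
    # 99% of frames must land at or under 16.7 ms (60 fps), and even
    # the worst 0.1% of frames must stay at or under 50 ms.
    return (percentile(frametimes_ms, 99) <= 16.7
            and percentile(frametimes_ms, 99.9) <= 50.0)

# Made-up run: mostly 16 ms frames with a handful of spikes.
run = [16.0] * 995 + [20.0] * 4 + [45.0]
print(passes_playability(run))  # True
```

(The VRAM condition from the quote is left out, since it can't be checked from frametimes alone.)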

When mentioning minimum framerate on a percentile basis, how would you interpret one graphics card with 30fps as a minimum? Let's call it "frametime.csv converted framerate":

GPU X
4999 frames rendered at or below 16.7ms (60fps average converted)
1 frame rendered at 33.3ms (30fps converted)

Minimum Fps = 30fps


While GPU Y

4500 frames rendered at or below 16.7ms (60fps converted)
300 frames rendered at or below 33.3ms (30fps converted)

Minimum Fps = 30fps

Which GPU is better to get ?

I think you get the idea of why using a minimum taken from "frametime converted framerate" is, imo, somewhat flawed.
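As a quick sketch using the made-up GPU X / GPU Y numbers above, the shared "minimum" hides how often the slow frames actually occur:

```python
# Two hypothetical frametime distributions (ms) that share the same
# converted minimum (a 33.3 ms frame = 30 fps) but feel very different.
gpu_x = [16.7] * 4999 + [33.3]          # one slow frame
gpu_y = [16.7] * 4500 + [33.3] * 300    # 300 slow frames

for name, frames in (("GPU X", gpu_x), ("GPU Y", gpu_y)):
    min_fps = 1000.0 / max(frames)              # identical for both
    share_slow = sum(t > 16.7 for t in frames) / len(frames)
    print(name, round(min_fps, 1), f"{share_slow:.2%} slow frames")
```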

There's no one-size-fits-all analysis; if you don't sit and play the game yourself, it's all subjective.
 
Yes, I understand your reasoning; that is why a few sites produce full frame-time-over-time graphs, to let users see exactly how and where the spikes occur and how important they are.

The fact is, you are presenting your data and saying "here is the min, max and average", but by definition your min and max are not min and max, they are "99% averaged min and max, removing bits of data I don't like". Your min/max/average chart is really just an average deviation chart that ignores the actual min and max measurements.

There are lots of articles on why the industry finds FRAPS to be unreliable, with the method you are using at the top of that list.
 

That was the point of measuring frame times: because the framerate in FRAPS is per second, it doesn't give a clean reading of whether the delivery of each frame is smooth. Reviewers used to use FRAPS before frametimes became more relevant.

Your car example does not apply to anything I have said and is a bit daft.

I'm glad you got the answer you wanted.
 

You were saying that fps had to be a measure of a whole second because it includes the word "second"; I was giving you a real-world example of how a "something per something" unit of measure is used to give an instantaneous readout.

Fps can be used as an instant readout, and in the case of "minimum" and "maximum", by the definition of those words, they should be the slowest and fastest frame, not the slowest and fastest frame ignoring the ones I don't like.

If the reviewer wants to give a better idea of frame rate over time, then they should use a line graph instead of simple bar graphs that have been doctored to present a subset of the data. It is just a different button in Excel; it isn't exactly more work.

He is also using a single run, so there's no way of telling how repeatable the data is.
 
OK, 99% of the time Nvidia is smoother, as there's less variance.
It may hitch worse at times, but 99% of the time it's smoother.

Although the Vortez article also says that you're unlikely to notice anything below 30ms frametime anyway.


EDIT:
Doesn't Helios1234 'work' for Vortez? Pity he doesn't see this as he might be able to explain it.


[Image: BjOI1lV.jpg]


[Image: TI8hXFI.jpg]

Just for reference on the discussion: the above is the frametime.csv from Fraps run through FRAFS Bench Viewer, while the results below are the framerates from the Fraps min-max-avg csv.

R9-290
Min: 81
Avg: 90.53
Max: 93

GTX970
Min: 88
Avg: 90.65
Max: 93

The R9-290 has very few frametime spikes over 18ms, while the GTX970 has a lot more in comparison, despite rendering 7 frames more in the 59-second run (5430 vs 5437).

Spike count:
GTX970: 98 spikes above 16.7ms (highest 28.7ms)
R9-290: 29 spikes above 16.7ms (highest 18.7ms)
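A spike count like the one above can be pulled from a frametimes CSV in a few lines. This is a sketch, not the exact tooling used; it assumes a Fraps-style layout of two columns (frame number, cumulative time in ms), so per-frame times are the differences between consecutive rows:

```python
import csv
import io

def count_spikes(csv_text, threshold_ms=16.7):
    """Count frametime spikes above threshold_ms in a frametimes CSV.

    Assumes two columns: frame number and cumulative time (ms).
    Returns (number of spikes, largest spike in ms).
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    times = [float(r[1]) for r in rows[1:]]             # skip header
    deltas = [b - a for a, b in zip(times, times[1:])]  # per-frame ms
    spikes = [d for d in deltas if d > threshold_ms]
    return len(spikes), (max(spikes) if spikes else 0.0)

sample = "Frame,Time (ms)\n1,0.0\n2,10.0\n3,30.0\n4,40.0\n"
print(count_spikes(sample))  # (1, 20.0)
```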
 

I didn't say anything had to be anything. I was saying that if one method measures the number of frames delivered every second, and another method measures the time between frames, the results will be different. You were trying to convert one into the other and get them to match; a lot happens in a second.

If, over 12 hours, you measured how many metres you travelled at 1-hour intervals and then calculated your min, max and avg speed, you would obviously get different results than if you measured your speed after every metre.

That's the difference between measuring the frame rate in fps (which depends on the sampling rate) and measuring the ms between frames and calculating an fps reading from that.

I don't disagree with what you were saying; it just depends on what value you are measuring.
 

fair enough

Most respected reviewers have moved over to recording frame times and then calculating min, max and average from that; FRAPS is very misleading in that it calls values min and max when they are not.

Others just use the average and completely ignore mins and maxes.

FRAPS lets you record frame times, so it makes sense to use one data set when presenting results (there are several benchmarking best-practice guides that tell you how to do this); mixing and matching two different data sets within one article is misleading unless you state your method clearly in the article.
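Deriving all three figures from the one frametime dataset is straightforward; a minimal sketch with made-up values:

```python
# Min / max / avg fps from a single set of frametimes (ms), instead of
# mixing a per-second counter with frametime data. Sample values are
# invented for illustration.
frametimes_ms = [12.5, 11.0, 14.2, 25.0, 10.8, 13.3]

total_s = sum(frametimes_ms) / 1000.0
avg_fps = len(frametimes_ms) / total_s      # frames / elapsed time
min_fps = 1000.0 / max(frametimes_ms)       # slowest frame
max_fps = 1000.0 / min(frametimes_ms)       # fastest frame
print(round(min_fps, 1), round(avg_fps, 1), round(max_fps, 1))
# 40.0 69.1 92.6
```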
 

Yeah, the delay is per second vs milliseconds. Ignore the fps, that isn't my real performance; the counter at the bottom is.

So, like I've been saying, they are two different things. An fps counter doesn't show how well a game is performing, it just shows a raw frame count every second. To find out how well and how smoothly a game's frame rate behaves, you need to look at the frame latency, and BF4 so far is one game that offers very detailed information, because it comes straight from the engine's renderer.
 
A set of cards released a year after the other set are faster? OMG MUCH WOW

Seriously guys, stop spouting carp

Well, if you look at it that way, the Titan is faster than the 290X, so you have a point in a way. I don't think it is crap talk, and I find frame times, frame variance and smoothness interesting.
 

Sorry Gregster, I didn't see past the first few posts. The rest is very informative indeed.
 

Not all fps counters are averaged; it is important to know what you are looking at. But equally, expressing a frame time as an fps measurement is not invalid, it is just another way of expressing it.

The problem with the hardware pal review/graphs is that showing frame times makes you think he is doing it the proper way, when really his min/max/avg graph is doctored.

It is a glaring error, and not one I've seen a "reviewer" make before; as I said above, that is not how most reviewers do it anymore.

Average fps and instant fps are two different things; it is disingenuous to assume that all fps has to be an average. Technically, speed and velocity are two different things, but we use speed measurements to refer to current velocity every day.

More importantly, the words minimum and maximum shouldn't be used to describe an average.
 

It's not that glaring, because he offers both sets of information. After all, the max, min and avg fps that he is showing is what most users will see if they enable the in-game overlay to display fps. If he only displayed the Fraps fps he might be misleading, but again, he shows both, so it's not that big of an issue.
 