To create an average, yes. But by definition the minimum frame rate experienced during the run should correspond to the largest single frametime. If you are creating an average over any given whole second, then that isn't the minimum, it is the lowest average whole second.
As for hardware pal, I now know that he uses two different methods to record his frametimes and his min/max/avg. His min/max/avg is taken directly from FRAPS, which records one value per second, so it wouldn't matter if the game spent 75% of its time at 30fps; if it just happened to spike to 90fps at the top of every second, then the min and average would both read around 90fps.
No other review site uses FRAPS in this way, because it is so massively inaccurate.
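To make the sampling problem concrete, here's a small sketch (the trace is invented for illustration, and this is a simplified model of one-sample-per-second logging, not FRAPS's actual internals): a game that spends 75% of each second at 30fps but happens to be at 90fps whenever the per-second counter is read.

```python
# Assumed trace: each second is ~0.75 s of 30 fps frames (33.3 ms)
# followed by ~0.25 s of 90 fps frames (11.1 ms).
frametimes_ms = []
for _ in range(10):  # 10 seconds of "gameplay"
    frametimes_ms += [1000 / 30] * int(0.75 * 30)  # slow frames
    frametimes_ms += [1000 / 90] * int(0.25 * 90)  # fast frames

# True average fps: total frames rendered / total elapsed time
total_s = sum(frametimes_ms) / 1000
true_avg = len(frametimes_ms) / total_s

# One sample per second, read at the top of each second while the
# trace happens to be in its fast (90 fps) phase
sampled = [90] * 10
sampled_avg = sum(sampled) / len(sampled)
sampled_min = min(sampled)

print(f"true avg: {true_avg:.0f} fps")                          # ~45 fps
print(f"sampled avg/min: {sampled_avg:.0f}/{sampled_min} fps")  # 90/90
```

The per-second counter never sees the slow frames at all, so both its minimum and its average are roughly double the real average framerate.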
Hey andybird123 hey all you stunning guys and gals etc
I was a bit busy yesterday with Nvidia's new 344.60 driver, which is lacking to say the least. I tried my own little tweaks with a few custom profiles and AFR 1/AFR 2, but with no luck.
I will be waiting this out (if Nvidia release a new driver); hopefully by then AMD will release a CFX driver too.
The frame rate vs frametime discussion is something that's been going on for years (there are a few people more knowledgeable on the subject than me, that's for sure), but I did put together a small formula that "kind of" describes playability/smoothness, and got a few emails from people not understanding what I meant:
"To pass our playability test, the 99th percentile of frametimes must be at or below 16.7ms and the 0.1 percentile must not spike above 50ms, with the game not utilizing all VRAM resources. In our own experience these standards ensure no bottlenecks, visual lag or stuttering at 60Hz. For some users, higher or lower values may be tolerated depending on hardware setup and monitor refresh rate."
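For anyone who emailed asking what this means in practice, here is a rough sketch of the check, assuming a frametime log in milliseconds (one value per rendered frame). I'm interpreting the "0.1 percentile" as the worst 0.1% of frames, i.e. the 99.9th percentile of frametimes; the nearest-rank percentile function and the example log are my own illustration, not the site's actual tooling.

```python
def percentile(values, p):
    """Nearest-rank percentile of a list of numbers (p in 0..100)."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

def passes_playability(frametimes_ms):
    """99th percentile <= 16.7 ms AND worst 0.1% <= 50 ms."""
    p99 = percentile(frametimes_ms, 99)
    p999 = percentile(frametimes_ms, 99.9)
    return p99 <= 16.7 and p999 <= 50.0

# 4990 smooth frames plus ten 60 ms hitches: the 99th percentile
# still looks fine, but the worst 0.1% of frames spikes past 50 ms
log = [15.0] * 4990 + [60.0] * 10
print(passes_playability(log))  # False
```

(The VRAM condition is left out here since it comes from a separate counter, not the frametime log.)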
This was of course with Shadow of Mordor, so each game will feel different depending on your framerate, optimisation and drivers.
I will put some emphasis on *higher or lower values may be tolerated depending on hardware setups and monitor refresh rates*: using an older monitor, when getting 50ms frametime spikes (0.1 percentile) I did feel even that split-second delay, while with my ASUS PB287Q I didn't. The 50ms of course depends on the framerate you're running: a drop from 150fps to, let's say, 50fps will be noticeable, while a drop from 40 to 27 might not be.
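For what it's worth, those two drops can be put in frametime terms (a quick sketch; the framing of absolute vs relative change is my own take, not a claim from the test methodology):

```python
def drop_ms(fps_before, fps_after):
    """Extra milliseconds per frame after a framerate drop."""
    return 1000 / fps_after - 1000 / fps_before

print(f"150 -> 50 fps: +{drop_ms(150, 50):.1f} ms per frame")  # +13.3 ms
print(f"40 -> 27 fps:  +{drop_ms(40, 27):.1f} ms per frame")   # +12.0 ms
```

The absolute frametime deltas are nearly the same; what differs is the relative change (frametime triples in the first case, rises by about half in the second), which fits the idea that how a spike feels depends on the framerate you were running.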
There are too many variables when trying to calculate smoothness, I know: monitor refresh rate, Vsync on/off, monitor response time, etc.
When talking about minimum framerate on a percentile basis, how would you interpret a graphics card with 30fps as a minimum? Let's call it "frametime.csv converted framerate":
GPU X
4999 frames rendered at or below 16.7ms (60fps average converted)
1 frame rendered at 33.3ms (30fps converted)
Minimum Fps = 30fps
While GPU Y
4500 frames rendered at or below 16.7ms (60fps converted)
300 frames rendered at or below 33.3ms (30fps converted)
Minimum Fps = 30fps
Which GPU is better to get?
I think you get the idea of how using a minimum of "frametime converted framerate" is, imo, somewhat flawed.
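The GPU X / GPU Y comparison above can be sketched like this (frame counts are taken from the example; the actual frametime values in the logs are my assumption):

```python
# Both logs share the same worst single frame (33.3 ms -> "30 fps
# minimum"), but very different shares of slow frames.
gpu_x = [16.7] * 4999 + [33.3] * 1
gpu_y = [16.7] * 4500 + [33.3] * 300

def summary(frametimes_ms):
    worst = max(frametimes_ms)
    slow = sum(1 for t in frametimes_ms if t > 16.7)
    return {
        "min fps": round(1000 / worst),  # identical for both cards
        "frames over 16.7 ms": slow,     # 1 vs 300
        "% of frames over 16.7 ms": round(100 * slow / len(frametimes_ms), 2),
    }

print("GPU X:", summary(gpu_x))
print("GPU Y:", summary(gpu_y))
```

A single minimum-fps number can't separate the two cards, while counting how many frames fall past the 16.7ms budget makes the difference obvious.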
There's no one-size-fits-all analysis; if you don't sit and play the game yourself, it's all subjective.