Why bother with anything other than minimums?

Am I missing something, or are all the other metrics apart from minimums basically irrelevant? And if so, why are minimums the metric we see least? Granted, it's nice to be given some idea of the general smoothness of a game, so I can see merit in including average framerates. But surely the thing that affects gameplay by far the most is those framerate drops below the 30fps mark, where the game becomes visibly choppy.

Just as I don't care about the electronics behind my TV screen so long as the picture is nice, who cares whether a graphics card can peak at 100fps or 1000fps so long as it never drops too low? I'd much rather have a card that maintained 50fps constantly than one that averaged 100, peaked at 500, but dropped to 10 on a regular basis.
 
Really not sure what you're on about.

I prefer all my games to run at a minimum framerate of 60fps (I don't care what anyone says, I can "see" & "feel" when the framerate has dropped below that threshold).

I want that framerate to be visually appealing, which means at 1080p, running a medium-high amount of AA and AF, and with as many texture and lighting features turned on/up as possible.

Now, average that out across the games currently on the market, and you'll find that only the higher end cards (or multiple mid range cards) can actually manage this.

Sure, this all comes down to your own personal definition of "minimum", but I'd imagine most people here are similar to me (or even more stringent).
 
Again, that would depend on what YOUR requirements from a card are. A review that only focuses on minimums, but for a setup/requirement/software that you have no interest in, would be of little benefit.

Nine times out of ten, peak framerate and average framerate will correlate directly with the minimum anyway, which is why most benchmark-based reviews focus on those metrics: they are easier to measure and easier to compare graphically (graphically as in with a graph, not with pretty animations).

Whilst minimums might be the most important thing when actually gaming, they are subjective and, above all, fluctuate based on a number of external parameters, which is why they'll never really be the focal point of comparison discussions, benchmarking, or reviews.

And as for missing your point, maybe you should actually make a point rather than have people second-guess you; your OP has no context at all, just a rant about framerates ;)
 
[H] reviews all state minimum frame-rates and comment on general choppiness, as well as showing a graph of actual fps changes. I believe several other review sites do the same.

You're right, though, that minimum fps is very important. For example, 5770 CrossFire matches a 5870 in average frame-rates, but tends to be a lot choppier, with more fps drops.
 
"I prefer all my games to run at a minimum framerate of 60fps (I don't care what anyone says, I can "see" & "feel" when the framerate has dropped below that threshold)."

I'm the same. I want to see a min of 60. I can definitely feel it.

Was a bit disappointed to see my current GPU can't manage this all the time. Perhaps I was expecting too much.
 
Because the average does tell you very useful information. If a benchmark or a real game starts off at 2fps as the level opens, but runs at 90fps for the rest of the time and is above 70fps for 59 out of every 60 seconds, does listing 2fps represent how the game will feel to you?

Minimums have their place, but without knowing how often you hit those minimums, and when, it's as useless as any other metric.

For instance, if a 6950 in, say, Skyrim later this year very occasionally dips to 25fps, like for 10 frames a minute, but is mostly around 60fps and perfectly smooth, while a GTX 580's minimum is 30fps but occurs for half the frames every minute because it's less capable of producing some effect in the game, then the 580 would have the higher minimum but be less smooth and less playable.

Or the GTX 580 has a 30fps minimum, but it's also only for 10 frames a minute and the rest are almost identical to, say, a 6970.

Min, average, max AND an fps/time graph over a benchmark should really be the standard information presented in reviews.
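To put rough numbers on why the graph matters, here's a quick sketch (Python, with invented frame-time traces rather than measurements from any real card) showing how the "worse" card can post the higher average and max:

```python
# Hypothetical frame-time traces in milliseconds per frame -- invented
# numbers for illustration, not captured from any real hardware.
smooth = [16.7] * 3600                                   # steady ~60fps
choppy = [10.0] * 3300 + [100.0] * 100 + [10.0] * 200    # fast, with stutter bursts

def summarise(frametimes_ms, label):
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s
    min_fps = 1000.0 / max(frametimes_ms)   # slowest single frame
    max_fps = 1000.0 / min(frametimes_ms)   # fastest single frame
    print(f"{label}: avg {avg_fps:.0f}fps, min {min_fps:.0f}fps, max {max_fps:.0f}fps")

summarise(smooth, "smooth")   # avg ~60, min ~60, max ~60
summarise(choppy, "choppy")   # avg ~80, min 10, max 100
```

The choppy trace wins on average and max, yet it's the one you'd complain about while playing; only the minimum and the fps/time graph give that away.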

I'd take an [H] review with the 4-5 toughest games over a TechPowerUp review with a bunch of five-year-old RTSs, or 3DMark 2003 and the 15 other games they run, of which only 4-5 are actually "tough" games for any kind of new card and bear any real relevance to anything.
 
A lot of games won't go over 60fps anyway, so what's the point in having a card that can do 200 when you can't get any higher than the 60fps cap?
 
"I prefer all my games to run at a minimum framerate of 60fps (I don't care what anyone says, I can "see" & "feel" when the framerate has dropped below that threshold."

This. Paradigm isn't the only one on this point, a game under 60 FPS isn't the same as a game above it, there are clearly visual differences other then that I fail to see your point.
 
Different people have different perceptions, so while one person can 'feel' when the fps drops below 60, others may not. Min fps is, however, a very useful indication of a card's performance, which is why I include it in my reviews ;). There are times when fps will drop massively for a very short time, though, so min fps has to be taken with a pinch of salt; it isn't meant to say the card is dropping to that level all of the time, just on the odd occasion. Take, for example, Metro 2033. There are points where this will crucify even the fastest card, with fps dropping below 10fps, while for the majority of the run it's in the 30s.
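As an aside on the pinch of salt: one way to stop a single momentary spike dominating the figure is to report a percentile low rather than the absolute worst sample. A quick sketch of the idea (Python, with invented numbers loosely modelled on the Metro 2033 case; this isn't how any particular site calculates it):

```python
# Report the 1st-percentile fps rather than the single worst sample, so a
# brief sub-10fps spike doesn't define the whole run. Numbers are invented.
def percentile_low(fps_samples, pct=1.0):
    ordered = sorted(fps_samples)
    return ordered[int(len(ordered) * pct / 100.0)]

samples = [35] * 990 + [8] * 10      # mostly in the 30s, one brief dip below 10
print(min(samples))                  # 8  -> the scary absolute minimum
print(percentile_low(samples))       # 35 -> closer to what you actually play
```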
 
"For instance, if a 6950 in, say, Skyrim later this year very occasionally dips to 25fps, like for 10 frames a minute, but is mostly around 60fps and perfectly smooth, while a GTX 580's minimum is 30fps but occurs for half the frames every minute because it's less capable of producing some effect in the game, then the 580 would have the higher minimum but be less smooth and less playable."
You are missing the point.

FPS = frames per second.

By definition, it's the number of frames recorded over that second, so unless those 10 stuttering frames all fall within a single second, the minimum figure shouldn't be affected.

And if those 10 slow frames are within one second, then you WILL notice the stutter and you WILL be affected during gameplay.
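To make that concrete, here's a quick sketch (Python; the frame timestamps are invented, not from a real capture) that just bins frames by wall-clock second:

```python
from collections import Counter

def fps_per_second(timestamps_s):
    # "FPS" as defined above: frames counted in each wall-clock second.
    counts = Counter(int(t) for t in timestamps_s)
    return [counts[s] for s in range(int(max(timestamps_s)) + 1)]

def trace(packed):
    # One minute at ~60fps containing ten 100ms stutter frames, either packed
    # into the first second or spread across the run. Invented data.
    t, stamps, slow_left = 0.0, [], 10
    while t < 60.0:
        stamps.append(t)
        if slow_left and (t < 1.0 if packed else len(stamps) % 360 == 1):
            t += 0.100          # a 100ms stutter frame
            slow_left -= 1
        else:
            t += 1.0 / 60.0     # a normal ~16.7ms frame
    return stamps

print(min(fps_per_second(trace(packed=False))))  # spread out: min ~55fps
print(min(fps_per_second(trace(packed=True))))   # packed together: min 10fps
```

Ten slow frames spread across a minute barely dent any one second's count, but the same ten packed together give exactly the one-second stutter you'd feel.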

I support the OP's way of looking at things, and that's why I usually only read Bit-tech.net reviews: they rank cards by their minimum FPS, averaged over multiple benchmark runs. Best way of benchmarking.

BTW, they are by no means Nvidia-biased; look at how they slated the 480. It's just that, ranked by minimums, Fermi currently has the edge across the board, there's no denying it.
 
I agree with you. I'd sooner play a game at a steady 30fps than a game that bounces between 30 and 60+. It's one of the reasons people say a quad-core CPU isn't much better than a dual-core: benchmarks only show a few FPS more, even though in the real world it can help stop the drops in frame rate. On paper, it just doesn't show.
 
For some reason people seem to be confusing the issue of "what average framerate is acceptable to me"/"what framerate can I perceive choppiness at" with the completely different issue of "should benchmarks focus on minimums or averages/maximums". These are totally different issues, and my question is about the second one.

To the guy that said I should make a point/make it clearer so he doesn't miss it, I'm afraid I can't be bothered to put together crayon drawings to illustrate what I think is a pretty straightforward distinction.

Somebody mentioned that the frames at startup might skew the minimums. Surely it isn't beyond human ingenuity to begin measuring a couple of seconds after the benchmark has loaded, so that we can focus on minimums. It's more important to me that a game never drops below 30fps (or 40, or 60, depending on preference) than that it averages 80 or 100fps but gets choppy sometimes, despite really high peak performance.
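And for the record, trimming that warm-up is trivial; a sketch (Python, assuming the benchmark tool logs (timestamp, fps) pairs; the data format and numbers here are invented):

```python
# Skip the first couple of seconds (level load, shader compilation) before
# taking the minimum. 'samples' is assumed to be (timestamp_s, fps) pairs
# from whatever logging tool you use -- the data below is invented.
def min_fps_after_warmup(samples, warmup_s=2.0):
    return min(fps for t, fps in samples if t >= warmup_s)

samples = [(0.0, 2), (0.5, 5), (1.0, 30), (2.0, 88), (3.0, 91), (4.0, 72)]
print(min(fps for _, fps in samples))   # 2  -> skewed by the level load
print(min_fps_after_warmup(samples))    # 72 -> the floor once settled
```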
 
Although we do see minimums and averages presented in some reviews, I would like to see them a bit more often. When I do pay attention to a review, those are what I look at, more closely than the maximums.
 
I ran the DiRT 2 benchmark and got 34fps average and 57fps max (Vsync on), so you'd think from that the game would run decently. But no: it drops to a minimum of 12fps, and quite often. Not for long periods, but enough to make the game stutter. So I also think the minimum should be taken into account a lot more.
 
To the guy that said I should make a point/make it clearer so he doesn't miss it, I'm afraid I can't be bothered to put together crayon drawings to illustrate what I think is a pretty straightforward distinction.

Great, but try asking an actual question rather than going off on some childish rant about frame rate.

Hell, your original post could have simply been about there being no point in high-end cards, or the spending habits thereof.

If you want intelligent debate, try being intelligent.

I'm afraid I can't be bothered

If you can't be bothered asking the question properly, why should anyone be bothered to reply?
 
This is why I want to see more reviews with useful data like the % of time spent at minimum fps, the % of time spent below 30fps, and a metric for how consistent the frame update rate actually is.

I did produce a tool for making these kinds of graphs; there's an example readout here:

http://aten-hosted.com/images/fpsexample.jpg

but there are some issues with the underlying maths in the program that are beyond my ability to fix without a lot of research, as I'm not familiar with the concepts used to calculate certain aspects.

This one annoys me quite a bit, as I see people hyping up a GPU or multi-GPU setup as better than an alternative based on the average and max fps numbers in a benchmark - but I know from my testing when producing this program that the alternative is actually better, as it has (a) less time spent at minimum fps and (b) smoother, more consistent frame update rates, even though it doesn't hit the same max fps.
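For anyone wondering what those metrics boil down to, the core of it is something like this (a simplified Python sketch with invented frame times; my actual tool's maths is more involved than this):

```python
import statistics

def report(frametimes_ms):
    # Summarise a frame-time log: average fps, % of wall-clock time spent
    # below 30fps, and a crude consistency figure (std-dev of frame times,
    # lower = smoother). Thresholds and field names are illustrative.
    total_ms = sum(frametimes_ms)
    slow_ms = sum(ft for ft in frametimes_ms if ft > 1000.0 / 30.0)  # > 33.3ms
    return {
        "avg_fps": round(len(frametimes_ms) / (total_ms / 1000.0), 1),
        "pct_time_below_30fps": round(100.0 * slow_ms / total_ms, 1),
        "frametime_stddev_ms": round(statistics.pstdev(frametimes_ms), 1),
    }

# Two invented setups with near-identical averages but different smoothness:
print(report([18.0] * 1000))                # steady: 0% of time below 30fps
print(report([12.0] * 900 + [70.0] * 100))  # spiky: similar avg, ~39% below 30fps
```

Same ballpark average, completely different experience, which is exactly the point.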
 
You can get average hardware that will run anything on high these days; consoles have held progress back when it comes to graphics, so I'm wondering when we will start to see elements of ray tracing to really push things forward.
 