"The human eye can't see over 24fps" - Let this myth die, whilst new science reveals some people can identify high framerates

  • Thread starter: mrk
This is not possible with modern games. If it were a very basic engine outputting low-bit graphics then this would be fine, yes, but in a modern graphics-heavy title, the lower your framerate is, the higher your average system latency is, and the higher your latency is between rendered frames.

This is the thing - some people say you don't need more than X or Y, and they might be right on paper, but the imperfect world of hardware, OS software, game software, etc. means you need far more headroom to replicate the results that would be possible in a perfect environment.
 
I agree if you simplify the variables back to how things once were back in the day. Nowadays everything is a contributing factor, and the thought process of only needing 48fps or even 60fps just doesn't apply any more because the variables are orders of magnitude more numerous than they once were. Even going back 10 years or so would be enough to see that difference tbh.

The biggest takeaway here is that average system latency is a blanket figure covering all the hardware in the system that Nvidia FrameView detects and registers a value for, while the frame-to-frame latency is displayed by RTSS as the frametime figure. If either of these increases even by 5ms, you end up with a noticeable impact on motion and input performance, and both of them only increase as you decrease the framerate. That's where tech like Frame Generation comes in: by inserting new frames into the frame-to-frame render pipeline it brings the presented frametime back down, whilst Reflex tackles the average system latency side of things. But for this to work the base framerate has to be at least 60fps to avoid a negative impact on the presented experience (as in when using more demanding rendering like path tracing, or if a GPU isn't powerful enough for high quality tracing and can't hit, say, 90fps).
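To put rough numbers on that, here's a minimal sketch of the frametime arithmetic in Python (hypothetical figures, assuming an idealised 2x frame generation with zero overhead; real FrameView/RTSS readings will differ):

```python
# Minimal sketch of the frametime arithmetic discussed above.
# Assumes idealised 2x frame generation with zero overhead;
# real FrameView/RTSS figures will differ.

def frametime_ms(fps: float) -> float:
    """Frame-to-frame interval in milliseconds at a given framerate."""
    return 1000.0 / fps

for fps in (30, 48, 60, 90, 120):
    rendered = frametime_ms(fps)
    presented = rendered / 2  # 2x frame gen halves the presented interval
    # Input is still sampled at the render rate, so system latency
    # doesn't drop with the presented interval - that's Reflex's job.
    print(f"{fps:>3} fps render: {rendered:5.1f} ms/frame, "
          f"~{presented:4.1f} ms presented with 2x frame gen")
```

Dropping from 60fps to 48fps alone adds about 4ms per frame (16.7ms to 20.8ms), which is right around the ballpark where the impact starts to become noticeable.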

By understanding how these all work together, you can effectively eliminate any latency woes with either motion or input in any game. It does mean leaving behind defunct thoughts and practices that simply don't apply in modern times any more, but we all know how hard it is for people to let things go :p
 
Keep in mind, all these frametime and even latency figures are in fact given as averages.

That’s just the nature of quantification of natural occurrences. Notice I said nature twice above; this was meant entirely in context.

This is because for quantification of time you need space, and for quantification of space you need time.

It’s all relative. ;)
 
Even as an average it still bears relevance though. If it were an exact figure in real time, the observation and experience would still be the same; it would just be hard to pin down an exact number because the variances would be moving too quickly, whereas an average of those variances offers a better look at what's happening.

That's why benchmarks always show min, max and average; the average tells a more meaningful story than just seeing what the min or max was :p
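As a quick illustration with made-up frametime samples (hypothetical numbers, not taken from any real benchmark):

```python
# Hypothetical frametime samples in ms - made-up numbers purely to
# illustrate why the average tells a fuller story than the extremes.
samples = [16.2, 16.8, 15.9, 33.5, 16.4, 16.1, 17.0, 16.3]

average = sum(samples) / len(samples)
print(f"min: {min(samples):.1f} ms, max: {max(samples):.1f} ms, "
      f"avg: {average:.1f} ms")
# A single 33.5 ms spike dominates the max, but the ~18.5 ms average
# shows the run was otherwise smooth - closer to what you actually feel.
```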
 
There’s an assumption here that there’s a perfect relationship.

This is never the case.

However, I still entirely agree; it's always best to keep it relevant.
 
If the outcome is still the same, then the assumption is well grounded. In this context the assumption works, because the outcome is the same whether exact or averaged metrics are used for demonstration. It's all contextual, really.
 
You can see it using FrameView, or just look at the average figures via the overlay; the outcome is the same regardless, so one or the other makes no difference. The figures just add support to what's being observed.
 
It's why I am very cynical about anything I read; "so-called experts or scientists" is a meaningless label.

You'll probably find the "scientists" that came up with it originally were being funded by the BBC or similar, and 24fps just happened to be the same as the TV broadcasts.

Experts and Science have nothing to do with the 24fps myth. It was people misunderstanding the science and the experts.
 
Gemini Man was 120fps at the cinema. My impression of HFR movies is that they make it easier to see it's just actors on a set. It's more obvious it is a set, and the acting isn't as believable. The high frame rate is closer to real life, and makes it much easier to see the imperfections.
 
Never heard that myth to be fair. I was always under the impression it was around the 200 mark for most people; highly skilled fighter pilots etc. were around the 300 mark.
 