"The human eye can't see over 24fps" - Let this myth die, whilst new science reveals some people can identify high framerates

  • Thread starter: mrk

mrk
Man of Honour · Joined: 18 Oct 2002 · Posts: 103,123 · Location: South Coast

The actual study article is deep linked but for ease:


During the study, the researchers asked a group of 88 volunteers to observe an LED light through a pair of goggles, which they manipulated to flash at different speeds. This test, known as the "critical flicker fusion threshold," allowed the scientists to keep track of the number of flashes per second, or frequency, at which a person was no longer able to discern the flickering, and instead saw a continuous source of light.

It was discovered that the flicker threshold varied significantly amongst different volunteers, allowing some to see a frequency of up to 60 flashes per second, while others were unable to perceive breaks in a light flashing at just 35 times per second. Furthermore, it was found that each individual’s critical flicker threshold changed relatively little over multiple sessions conducted at the same time on subsequent days.

“We don’t yet know how this variation in visual temporal resolution might affect our day-to-day lives,” said study co-author and PhD candidate Clinton Haarlem, also of Trinity College Dublin. “But we believe that individual differences in perception speed might become apparent in high-speed situations where one might need to locate or track fast-moving objects, such as in ball sports, or in situations where visual scenes change rapidly, such as in competitive gaming.”

The variation in images per second detected by the human volunteers is somewhat similar to that seen in the eyes of closely related members of the animal kingdom, where one species has developed separately to hunt faster-moving prey than the other.

“This suggests that some people may have an advantage over others before they have even picked up a racquet and hit a tennis ball, or grabbed a controller and jumped into some fantasy world online," concluded Haarlem.
Actual hard science, brilliant. I'm in the higher framerate boat: 30fps on consoles just feels slow and sluggish, and if it's not 100fps on PC or a constant 60fps on console then it's not even worth playing in modern times :p

Just like there is a night and day difference to my eyes between 30 and 60fps, the same applies from 60 to 100fps. Beyond 100fps I find it's just an additional bonus depending on the game; in HZFW, quick reactions with the mouse feel faster/smoother at 139fps than they do at a locked 100fps. Technically that makes sense since the frame times are lower at the higher fps, but after 120fps it's mostly diminishing returns in this specific area.
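
To put rough numbers on why the returns diminish: frame time is just 1000ms divided by the framerate, so each step up buys less and less. A quick back-of-the-envelope sketch (plain Python, using the framerates mentioned above):
Code:
# Frame time (ms per frame) is just 1000 divided by the framerate.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 100, 120, 139):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.2f} ms per frame")

# 30 -> 33.33 ms, 60 -> 16.67 ms, 100 -> 10.00 ms, 120 -> 8.33 ms, 139 -> 7.19 ms
# The 30 -> 60 jump saves ~16.7 ms per frame; 120 -> 139 saves barely over 1 ms,
# which is where the diminishing returns kick in.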
 
It would be interesting to know more about the participants in this test. (Could be more details in the article but I haven’t clicked on it yet)

What were the ages? What activities do they participate in? Is there a difference between people who participate in fast-moving activities, such as ball sports or gaming, and those who don't?
It's even simpler than that; the study was conducted this way:

One way of measuring this trait is to identify the point at which someone stops perceiving a flickering light to flicker, and sees it as a constant or still light instead. Clinton Haarlem, a PhD candidate at Trinity College Dublin, and his colleagues tested this in 80 men and women between the ages of 18 and 35, and found wide variability in the threshold at which this happened.

The research, published in Plos One, found that some people reported a light source as constant when it was in fact flashing about 35 times a second, while others could still detect flashes at rates of greater than 60 times a second.

This is still some way off the temporal resolution of peregrine falcons, which are able to process roughly 100 visual frames a second.

Haarlem said: “We think that people who see flicker at higher rates basically have access to a little bit more visual information per timeframe than people on the lower end of the spectrum.”
 
It's obvious to me, but then I rarely understand maths :p - I do tend to visualise everything though; for any concept or method I can visualise how it works, so if I can visualise the maths then I can understand it fairly easily.
 
Kind of, yeah. I want a high baseline fps because it delivers the lowest latency between rendered frames, and as such the internal PC latency is lower too. You can observe this with both the RTSS and Nvidia overlays enabled showing frametime latency, PC latency and render latency; throw in the Reflex latency stats and you get a very clear picture of how that perception translates into objective reality.

In an ideal world every PC gamer would have a near-0ms pixel response display with at least 120Hz and VRR, at least a 1000Hz polling rate mouse (DPI is irrelevant), and a GPU that can keep up with that combo. This would offer the best possible smooth, fast frame-to-input experience.

The exception is that things like Frame Gen will increase the latency on everything, but especially mouse input, which is where Reflex comes in to try to mitigate that compromise. It typically does a great job, but the pre-frame-gen framerate must be over 60fps for it to be perceived as good enough, otherwise you get that rubber-bandy mouse movement experience even if the post-frame-gen framerate is showing 100fps. This can be demonstrated quite easily in games like Cyberpunk when path tracing at 4K on a 4090: it will show 100fps or more, but the mouse will feel like you're playing at sub-60fps because the baseline fps is exactly that.
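
A rough way to see why the pre-frame-gen baseline matters. This is a simplified sketch, not how FG/Reflex actually schedule frames; the assumptions are that FG roughly doubles the displayed framerate and that new mouse input only shows up on the rendered baseline frames:
Code:
# Simplified model (assumptions: FG doubles the displayed framerate, and new
# mouse input is only reflected on the rendered baseline frames, not the
# generated ones). Numbers are illustrative.
def interval_ms(fps):
    return 1000.0 / fps

MOUSE_POLL_MS = 1.0  # a 1000 Hz mouse adds at most ~1 ms of sampling granularity

for base_fps in (45, 60, 90):
    displayed_fps = base_fps * 2            # what the fps counter shows with FG on
    input_cadence = interval_ms(base_fps)   # responsiveness still tracks the baseline
    feel = "fine" if base_fps >= 60 else "rubber-bandy"  # the rule of thumb from the post above
    print(f"base {base_fps:>2} fps -> shows ~{displayed_fps} fps, "
          f"input moves in ~{input_cadence:.1f} ms steps (+~{MOUSE_POLL_MS:.0f} ms polling) -> {feel}")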
 
The funny thing about 360Hz and the like is that unless you're running a game at 1080p, no modern game is going to run anywhere near that framerate to actually leverage the refresh rate, assuming it was of value for general gaming in the first place. And I'm on about DLSS-enabled framerates here; pure raster will be even lower.

A 4090 can't even break 200fps without DLSS Quality in the latest games like Forbidden West, a well-optimised game as a whole. Even at 3440x1440 I need to be looking at literally just the sky in order to break 200fps, otherwise it sits at around the 165fps mark if I disable G-Sync.

Oh, but what about the next-gen cards that are supposed to be 60% faster than a 4090?! Let's assume a 4090 gets 165fps in a modern rastered game using upscaling; a 5090 is then only going to get up to around 264fps, still roughly 100fps shy of that refresh rate optimum. Disable upscaling and go "native" and you're looking at even lower numbers still :p
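
The arithmetic behind that, for anyone who wants to sanity-check it (the 60% uplift is just the rumoured figure used above, not a confirmed spec):
Code:
# Sanity check of the uplift maths above.
current_fps = 165        # 4090 at 3440x1440 with upscaling, per the post above
rumoured_uplift = 0.60   # assumed "60% faster" figure, not a confirmed spec
panel_hz = 360

next_gen_fps = current_fps * (1 + rumoured_uplift)
print(f"~{next_gen_fps:.0f} fps, still ~{panel_hz - next_gen_fps:.0f} fps short of {panel_hz} Hz")
# ~264 fps, still ~96 fps short of 360 Hz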

My view is that display makers should have stopped at 120-144Hz and put all the R&D budget into making the panels themselves absolutely bulletproof. We might have had burn-in-proof OLED long ago if they had done this instead of chasing big numbers. Imagine having an end-game OLED monitor...

Semi-related, but every time I go round to someone's house or a hotel and they have motion smoothing, or whatever that particular manufacturer calls it, enabled on the TV... :mad::mad::mad: Like, how do people watch a movie where random scenes look "smooth"?
I call them smegheads in my mind, then try to advise them properly, only to return another day and find the setting is back on :cry:
 
Wasn't it one of the AVATAR movies where some parts were standard cinematic framerate, and others were 60fps? This resulted in a mind**** for those who are attuned to this sort of thing as per the above conversation about motion smoothness on TVs.
 
I don't have a flashing light to hand, let alone the same light the scientists used to control the flash rate. I just know what my eyes can discern on screen, especially on OLED where the pixel response time is basically 0ms.
 
EDIT: If a game was running a perfect 48fps with very low latency from all inputs and outputs, etc., including display update rate, I think people would be surprised at the experience - though it still wouldn't be a perfect one for many people; in reality you need a far higher frame rate to achieve something similar to that result.

This is not possible with modern games. If it was a very basic engine outputting low-bit graphics then this would be fine, yes, but in a modern graphics-heavy title the lower your framerate is, the higher your average system latency is and the higher your latency is between rendered frames.

I actually put this to the test just now in Cyberpunk, a game where I can view metrics for all of these latency areas and control the framerate however I please. For this test FG was disabled and path tracing was disabled, but ray tracing was left enabled.

You'll have to download the video and watch it locally as I recorded it at 4K 120fps and no online platform will play it back smoothly enough. I narrate exactly what is being experienced and observed in each test, and I go from a locked 120fps to 90, 60, then 48. At the lower locks, the motion of the frames rendered on-screen is simply not acceptable for an enjoyable experience on a display that has near-0ms pixel response times (OLED). This was tested with a 1000Hz mouse for reference, to remove any input device variable that might contribute to latency observations, and because the video is recorded at 120fps you can visibly see the immediate results in motion smoothness and response to mouse movement input.


Be sure to download the video file rather than play it inside the Google Drive player, which won't play it at max resolution or framerate. It is a 2.01GB video.

The gist of this test is to demonstrate that even at 90fps, compared to 120fps, the visible motion is laggier and there is noticeable input latency introduced, since both average system latency and frame-to-frame latency have gone up. Both of these factors just keep climbing the lower the framerate I lock it to.

Going above 120fps, the difference is considerably less obvious and you enter the realm of diminishing returns, as per the graph posted many posts up. A 120fps framerate provides a perceived real-time experience on all fronts in modern gaming, and the actual metrics back this up.
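
For reference, the frame-to-frame intervals for the locks used in the video are just 1000ms divided by the framerate; the system latency figures in the overlays sit on top of this:
Code:
# Frame-to-frame interval for each framerate lock used in the test above,
# plus how much extra per-frame delay each adds relative to the 120 fps lock.
BASELINE_FPS = 120

def interval_ms(fps):
    return 1000.0 / fps

for fps in (120, 90, 60, 48):
    added = interval_ms(fps) - interval_ms(BASELINE_FPS)
    print(f"{fps:>3} fps lock: {interval_ms(fps):5.2f} ms/frame (+{added:.2f} ms vs 120fps)")

# 120 -> 8.33 ms, 90 -> 11.11 ms (+2.78), 60 -> 16.67 ms (+8.33), 48 -> 20.83 ms (+12.50)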
 
I agree if you simplify the variables back to how things once were back in the day; nowadays everything has a contributing factor, and the thought process of only needing 48fps or even 60fps just doesn't apply any more because the variables are orders of magnitude greater than they once were. Even going back 10 years or so would be enough to see that difference tbh.

The biggest takeaway here is that average system latency is a blanket figure covering all the hardware in the system that Nvidia FrameView detects and assigns a value to, while the frame-to-frame latency is displayed by RTSS as the frametime figure. If both of these increase by even 5ms, you end up with a noticeable impact on motion and input performance, and both of them only increase when you decrease the framerate. That's where tech like Frame Generation comes in: the way it inserts new frames into the frame-to-frame render pipeline offsets the increased latency there and brings it back down, whilst Reflex then tackles the average system latency side of things. But for this to work the minimum framerate has to be at least 60fps in order not to have a negative impact on the presented experience (as in when using more demanding rendering like path tracing, or if a GPU isn't powerful enough for high-quality tracing and cannot hit, say, 90fps).

By understanding how these all work together, you can effectively eliminate any latency woes with either motion or input in any game. It does mean leaving behind defunct thoughts and practices that simply do not apply in modern times any more, but we all know how hard it is for people to let things go :p
 
Even as an average it still bears relevance though. If it were an exact figure in real time, the observation and experience would still be the same; it would just be hard to pin down an exact number as the values would be moving too quickly, whereas an average of those variations offers a better look at what's happening.

That's why benchmarks always show min, max and average; the average tells a more meaningful story than just seeing what the min or max was :p
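
As a trivial illustration of why the average is the headline number (the frametime samples here are made up, purely to show the maths):
Code:
# Made-up frametime samples (ms) for one benchmark run, purely illustrative.
frametimes_ms = [8.1, 8.4, 8.3, 15.9, 8.2, 8.3, 8.5, 8.2]

avg_ms = sum(frametimes_ms) / len(frametimes_ms)
print(f"min: {min(frametimes_ms):.1f} ms  (best single frame)")
print(f"max: {max(frametimes_ms):.1f} ms  (one stutter spike)")
print(f"avg: {avg_ms:.1f} ms  (~{1000/avg_ms:.0f} fps average)")
# A single 15.9 ms spike barely moves the average, which is why the average
# describes the run better than a one-off min or max reading.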
 
If the outcome is still the same then the assumption is well grounded; in this context the assumption works, as the outcome is the same whether exact or averaged metrics are used for demonstration. It's all contextual really.
 
You can see it using FrameView, or just look at the average figures via the overlay; the outcome is the same regardless, so one or the other makes no difference. The figures just add support to what's being observed.
 