"The human eye can't see over 24fps" - Let this myth die, whilst new science reveals some people can identify high framerates


mrk
Man of Honour
Joined: 18 Oct 2002
Posts: 103,467
Location: South Coast

The actual study article is deep-linked, but for ease here's the relevant excerpt:


During the study, the researchers asked a group of 88 volunteers to observe an LED light through a pair of goggles, which they manipulated to flash at different speeds. This test, known as the "critical flicker fusion threshold", allowed the scientists to keep track of the number of flashes per second, or frequency, at which a person was no longer able to discern the flickering, and instead saw a continuous source of light.

It was discovered that the flicker threshold varied significantly amongst different volunteers, allowing some to see a frequency of up to 60 flashes per second, while others were unable to perceive breaks in a light flashing at just 35 times per second. Furthermore, it was found that each individual’s critical flicker threshold changed relatively little over multiple sessions conducted at the same time on subsequent days.

“We don’t yet know how this variation in visual temporal resolution might affect our day-to-day lives,” said study co-author and PhD candidate Clinton Haarlem, also of Trinity College Dublin. “But we believe that individual differences in perception speed might become apparent in high-speed situations where one might need to locate or track fast-moving objects, such as in ball sports, or in situations where visual scenes change rapidly, such as in competitive gaming.”

The variation in images per second detected by the human volunteers is somewhat similar to that seen in the eyes of closely related members of the animal kingdom, where one species has developed separately to hunt faster-moving prey than the other.

“This suggests that some people may have an advantage over others before they have even picked up a racquet and hit a tennis ball, or grabbed a controller and jumped into some fantasy world online," concluded Haarlem.
Actual hard science, brilliant. I'm on the higher framerate boat; 30fps on consoles just feels slow and sluggish. If it's not 100fps on PC or a constant 60fps on console then it's not even worth playing in modern times :p

Just like there's a night and day difference to my eyes between 30 and 60fps, the same applies from 60 to 100fps. Beyond 100fps I find it's just an additional bonus depending on the game - in HZFW, quick reactions with the mouse feel faster/smoother at 139fps than they do at a locked 100fps. Technically that makes sense since the frame times are lower at the higher fps, but after 120fps it's mostly diminishing returns in this specific area.
 
'30 is fine' regurgitated by console owners huffing copium since, well, forever.

I struggle to see much difference from 120 onwards, but 30 vs 60 is night and day - 30 feels like playing in treacle.
 
It would be interesting to know more about the participants in this test (there could be more details in the article, but I haven't clicked on it yet).

What were their ages? What activities do they take part in? Is there a difference between people who participate in fast-moving activities such as ball sports or gaming and those who don't?
 
This study is less about framerate and more about backlight flicker sensitivity. If the flashing LED was moving and they measured the rate at which it just looked like a solid light moving, I bet it would be in the hundreds.
 
It would be interesting to know more about the participants in this test (there could be more details in the article, but I haven't clicked on it yet).

What were their ages? What activities do they take part in? Is there a difference between people who participate in fast-moving activities such as ball sports or gaming and those who don't?
It's even simpler than that; the study was conducted this way:

One way of measuring this trait is to identify the point at which someone stops perceiving a flickering light to flicker, and sees it as a constant or still light instead. Clinton Haarlem, a PhD candidate at Trinity College Dublin, and his colleagues tested this in 80 men and women between the ages of 18 and 35, and found wide variability in the threshold at which this happened.

The research, published in Plos One, found that some people reported a light source as constant when it was in fact flashing about 35 times a second, while others could still detect flashes at rates of greater than 60 times a second.

This is still some way off the temporal resolution of peregrine falcons, which are able to process roughly 100 visual frames a second.

Haarlem said: “We think that people who see flicker at higher rates basically have access to a little bit more visual information per timeframe than people on the lower end of the spectrum.”
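For anyone who hasn't seen how you'd actually measure that: a common way is a staircase - nudge the flicker rate up while the person still sees flicker, down once it looks steady, and the value it settles around is their threshold. A toy Python simulation of the idea (made-up observer with a ~47Hz threshold; not necessarily the paper's actual protocol):

Code:
import random

def observer_sees_flicker(freq_hz, threshold_hz=47.0, noise_hz=2.0):
    """Toy observer: reports flicker when the rate is below their (noisy) fusion threshold."""
    return freq_hz < threshold_hz + random.gauss(0, noise_hz)

def staircase(start_hz=30.0, step_hz=1.0, trials=200):
    freq = start_hz
    history = []
    for _ in range(trials):
        if observer_sees_flicker(freq):
            freq += step_hz   # still flickering -> push the rate up
        else:
            freq -= step_hz   # looks steady -> bring it back down
        history.append(freq)
    tail = history[len(history) // 2:]   # average the settled half
    return sum(tail) / len(tail)

print(f"Estimated flicker fusion threshold: {staircase():.1f} Hz")

Run it a few times and the estimate bounces around the made-up 47Hz threshold, which is all a critical flicker fusion measurement is really doing.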
 
This isn't even a debate IMO. Those who don't think we can perceive higher haven't tried it, and don't understand the crucial element that is motion blur.

:edit: the "it's diminishing returns" point is obvious to anyone who understands maths; here's a graph for those who don't :p

[attached graph illustrating the diminishing returns]
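The shape is easy to reproduce yourself - the usual way to draw it is frame time against framerate (1000/fps), so each jump up the ladder saves less and less. A quick sketch:

Code:
# Frame time in ms at each framerate, and how much each doubling actually saves.
# Pure arithmetic (1000 / fps) - this is the whole "diminishing returns" curve.
prev_ms = None
for fps in (30, 60, 120, 240, 480):
    ms = 1000.0 / fps
    saved = f" (saves {prev_ms - ms:.1f} ms)" if prev_ms is not None else ""
    print(f"{fps:>3} fps -> {ms:5.2f} ms per frame{saved}")
    prev_ms = ms

30 to 60 buys you ~16.7ms per frame, 60 to 120 only ~8.3ms, and 240 to 480 a mere ~2.1ms.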
 
It's obvious to me, even though I rarely understand maths :p - I do tend to visualise everything though: any concept or method, I can visualise how it works, so if I can visualise the maths then I can understand it fairly easily.
 
Actual hard science, brilliant. I'm on the higher framerate boat; 30fps on consoles just feels slow and sluggish.

There's also ergonomics: they're testing short term response but people find high refresh rates easier on the eye over longer periods, as anyone who had to labour with 60 Hz CRTs will recall.
 
Read this earlier. It's surprising that some people can't perceive even fairly slow flashing, but what I think that actually looks like and what it actually is could be quite different.

I don't think this gives us any results we can draw solid conclusions from, yet the FPS circle jerk will no doubt lap it up to justify expensive purchases, or use it as an excuse for losing a game.

A more interesting study would be whether age is a deciding factor compared to natural ability and/or practice.
 
It would be interesting to know more about the participants in this test (there could be more details in the article, but I haven't clicked on it yet).

What were their ages? What activities do they take part in? Is there a difference between people who participate in fast-moving activities such as ball sports or gaming and those who don't?

Been involved in ball sports since I was a youngster - they told me it would make me blind, not make me a pro gamer :D
 
It's a lot more to do with the input and how this translates to motion as opposed to solely what fps the eye is seeing.

You can test this yourself: a controller will feel smoother than a high-sensitivity mouse in the same game at the same fps.

People want high fps for the perceived fluidity of motion, as the higher numbers allow for smoother movement.
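As a rough illustration of the fluidity side (numbers made up - a constant 180°/s camera turn, shown as the size of the visual jump between frames at each framerate):

Code:
# Made-up example: a steady 180 degree/second camera turn, expressed as the
# per-frame jump your eyes actually see at different framerates.
turn_speed_deg_per_s = 180.0   # assumed turn speed, purely for illustration
for fps in (30, 60, 120, 240):
    step = turn_speed_deg_per_s / fps
    print(f"{fps:>3} fps -> {step:5.2f} degrees of rotation per frame")

Same motion, but at 30fps every frame is a 6° jump while at 120fps it's 1.5°, which is the smoothness people are paying for.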
 
Kind of, yeah. I want a high baseline fps because it delivers the lowest latency between rendered frames, and as such the internal PC latency is lower too. You can observe this with both RTSS and the Nvidia overlay enabled, showing frametime, PC latency and render latency; throw in the Reflex latency stats and you get a very clear picture of how that perception translates into objective reality.

In an ideal world every PC gamer would have a near-0ms pixel response display with at least 120Hz and VRR, at least a 1000Hz polling rate mouse (DPI is irrelevant), and a GPU that can keep up with that combo. That would offer the best possible smooth, fast input-to-frame experience.
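On the polling rate point, the intervals speak for themselves (this is just the gap between mouse reports, not the whole input chain):

Code:
# Worst-case wait for the next mouse report at common polling rates.
for hz in (125, 500, 1000, 8000):
    print(f"{hz:>4} Hz polling -> up to {1000.0 / hz:.3f} ms between reports")

At 125Hz your input can be up to 8ms stale before the frame even starts; at 1000Hz that's 1ms, which is why polling rate matters and DPI doesn't.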

The exception is that things like Frame Gen will increase latency across the board, but especially on mouse input, which is where Reflex comes in to try to mitigate that compromise. It typically does a great job, but the pre-frame-gen framerate needs to be over 60fps for it to feel good enough; otherwise you get that rubber-bandy mouse movement even if the post-frame-gen framerate is showing 100fps. This can be demonstrated quite easily in games like Cyberpunk when path tracing at 4K on a 4090: it will show 100fps or more, but the mouse will feel like you're playing at sub-60fps because the baseline fps is exactly that.
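A toy model of why the counter and the feel diverge (not Nvidia's actual pipeline - just the arithmetic, assuming 2x frame generation and input sampled once per rendered frame):

Code:
# Toy model (not the real Frame Gen/Reflex pipeline): the fps counter shows the
# generated framerate, but new mouse input only lands on *rendered* frames,
# so the felt latency tracks the base fps, not the displayed one.
def frame_gen_numbers(base_fps, fg_factor=2):
    displayed_fps = base_fps * fg_factor
    input_interval_ms = 1000.0 / base_fps   # input sampled per rendered frame (assumption)
    return displayed_fps, input_interval_ms

for base in (40, 50, 60, 80):
    shown, input_ms = frame_gen_numbers(base)
    print(f"base {base:>2} fps -> counter shows {shown:>3} fps, "
          f"input updates every {input_ms:.1f} ms (feels like {base} fps)")

So a 50fps base path-traced scene shows 100fps on the counter but still hands you a new mouse sample only every 20ms, which is exactly that sub-60 rubber-bandy feel.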
 
This isn't even a debate IMO. Those who don't think we can perceive higher haven't tried it, and don't understand the crucial element that is motion blur.

We did some testing with 165Hz and 240Hz monitors (esports studio) and found that almost everybody could tell the difference between 60fps and 120fps - the difference is night and day. However, once you get above 165Hz to 240Hz it becomes a lot less apparent; I don't remember anyone reliably being able to tell the difference between 165 and 240 any better than chance.
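For anyone curious what "better than chance" boils down to in that kind of blind A/B setup, it's just a binomial tail - a quick sketch with made-up trial counts:

Code:
from math import comb

def p_value_at_least(correct, trials, p_chance=0.5):
    """Probability of getting at least `correct` right out of `trials` by pure guessing."""
    return sum(comb(trials, k) * p_chance**k * (1 - p_chance)**(trials - k)
               for k in range(correct, trials + 1))

# e.g. someone calls 14 out of 20 blind 165Hz-vs-240Hz trials correctly
print(f"p = {p_value_at_least(14, 20):.3f}")   # ~0.058 - not convincingly better than guessing

14/20 sounds impressive until you see it's the sort of run a coin flip produces about one time in seventeen.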

Back in the day, late nineties / early 2000s, I used to play a lot of competitive QW and later Q3. A number of us (from what I can remember) bought Iiyama Vision Master CRTs, which would do a 200Hz refresh at 800x600 (from memory), and the difference was literally mind-blowing. Going from 60Hz to 200Hz on a CRT was like stepping into a new world.
 
A number of us (from what I can remember) bought Iiyama Vision Master CRTs, which would do a 200Hz refresh at 800x600 (from memory), and the difference was literally mind-blowing. Going from 60Hz to 200Hz on a CRT was like stepping into a new world.

I had one of those monitors; it was awesome.
 