"The human eye can't see over 24fps" - Let this myth die, whilst new science reveals some people can identify high framerates

  • Thread starter: mrk
It's hard to compare films / cinematic video shot at 24 fps with video games at high fps. 24 fps films / cinematic video are capturing the motion of real world objects or have animation / special effects tailored to 24 fps. But a video game running at a high frame rate could be impacted by animation quality, art style, camera behaviour, latency, micro stutter and any number of graphics settings.

Which is why there are games that barely reach 50 fps yet actually appear smoother and easier on the eye than games that reach 200 fps on the same hardware.

75 fps seems to be the sweet spot for my eyes, with 90 fps seeming to be the point beyond which I can't detect any significant change. So, even though I have a 165 Hz monitor, I tend to cap games at somewhere between 75 and 90 fps even if a game can hit higher. Saves thrashing the GPU too.
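For anyone wondering what a cap actually does under the hood, here's a minimal sketch of a frame-limited loop. It's purely illustrative: the function names and the 80 fps target are assumptions, and real limiters (driver-level caps, RTSS and so on) pace frames far more precisely than a plain sleep can.

```python
import time

TARGET_FPS = 80                    # illustrative cap between 75 and 90
FRAME_BUDGET = 1.0 / TARGET_FPS    # seconds allowed per frame

def run_capped(update, render, run_seconds=5.0):
    """Simplified game loop: after each frame, idle away any leftover
    time so the GPU isn't asked to render more frames than the cap."""
    end = time.perf_counter() + run_seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        update()                   # hypothetical game-logic step
        render()                   # hypothetical draw call
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # Sleeping here is what saves GPU/CPU work versus running uncapped.
            time.sleep(FRAME_BUDGET - elapsed)
```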
 
I do the same. For me, the critical "upgrade" was OLED, not frame-rate. OLED is just so much smoother. As for frame-rate, I usually cap it at 90.
 
The whole "the human brain can't tell over 24 fps" thing was always BS.

Forgive the pun but I knew it wasn't true as I had seen it with my own eyes.

For me, I can tell up to about 100 fps, and I can tell a big difference between 60 and 100.

I'd like to say I can "feel" a difference between 100 and 144, but honestly I wouldn't be confident of passing a blind test.

The other thing, particularly in fast-paced games, is that FPS ties into everything else (input lag, responsiveness and so on), so a lot of it may be "feel" rather than what you can distinguish on a purely visual basis.

So whilst I'd 100% believe anyone who says they can tell the difference between, say, 150 and 200 fps whilst gaming, I'd be interested to see whether they could still tell the difference if they were just watching a hands-off demo.
 
This. I don't even know where it came from, as it's clearly false and never had any validity.

It's why I am very cynical about anything I read; so-called "experts" or "scientists" is a meaningless label.

You'll probably find the "scientists" who came up with it originally were being funded by the BBC or some similar body, and 24 fps just happened to be the same as the TV broadcasts.
 
On a semi-related note, I use lasers for work and you can send small pulses to them. You can comfortably see a pulse that lasts just 50 microseconds :p. I know it's not the same as frame rate, but I thought it was interesting. With regards to the motion smoothing (I think my old TV calls it TruMotion), I have to turn it off as it just looks wrong. I have been a bit naughty and turned it off at my parents' too :cry:.
 
You'll probably find the "scientists" who came up with it originally were being funded by the BBC or some similar body, and 24 fps just happened to be the same as the TV broadcasts.

So what do you trust, YouTube and TikTok? I do agree on "experts" though; so many news stories say "experts have predicted". Oh right, who are they then?
 
The 24 fps figure is the claimed threshold at which people see moving images instead of a fast slide show.

All this talk has me reminiscing about CRT motion clarity. I'm still chasing it and am intrigued to try out the new 480 Hz 1080p OLED mode coming later this year.

Obviously OLED is better than CRT overall, but needing a much higher fps for the same motion clarity is a real downside. If we could emulate it better whilst keeping brightness, there wouldn't be such a need for 100+ frame rates.
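Rough back-of-the-envelope for why that is: on a sample-and-hold display, perceived blur during eye-tracked motion is roughly the tracking speed multiplied by how long each frame stays lit, whereas a CRT only flashes each frame briefly. The numbers below are illustrative assumptions, not measurements.

```python
# Sample-and-hold motion-blur estimate: blur width ~= eye-tracking speed
# multiplied by frame persistence. All figures are illustrative assumptions.

def blur_px(tracking_speed_px_s: float, persistence_ms: float) -> float:
    return tracking_speed_px_s * (persistence_ms / 1000.0)

SPEED = 1000.0  # assumed eye-tracked motion, pixels per second

cases = [
    ("60 Hz sample-and-hold (~16.7 ms persistence)", 1000 / 60),
    ("120 Hz sample-and-hold (~8.3 ms persistence)", 1000 / 120),
    ("480 Hz sample-and-hold (~2.1 ms persistence)", 1000 / 480),
    ("CRT-like short flash (~1.5 ms, assumed)", 1.5),
]
for label, persistence in cases:
    print(f"{label}: ~{blur_px(SPEED, persistence):.1f} px of blur")
```

On those assumed figures, a 480 Hz sample-and-hold panel gets close to the CRT-like number, which is roughly the appeal of that 480 Hz 1080p mode.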
 
This. I don't even know where it came from, as it's clearly false and never had any validity.
I suspect it was seeded by game makers who couldn't get the frame rate any higher. (I remember playing F1GP on the Amiga at 8 fps and, as I didn't know better, it was playable.)

Even watching TV I notice jitter on some fast panning shots. However, in recent times playing games, especially racing games, I physically struggle with nausea at 30 fps, whereas I am absolutely fine at 60 fps.

Beyond that I think it's a case of diminishing returns, but that's just me, and I would never say anyone is wrong if they find over 100 fps better.
 
The exact origin of the myth that the eye can't see over 24 fps is hard to pin down, but it seems linked to a misunderstanding of why 24 fps was chosen by the early film industry.

The view that 24 fps is the minimum needed for humans to see smooth motion instead of an uncomfortable slide show stems from Edison in the 1920s. But it's largely an assumption based on subjective tests rather than precise scientific fact.

But Edison didn't actually come up with the figure of 24 fps. What Edison actually said was that 46 fps was needed. Again, this was a recommendation, not scientific fact.

The 24 fps figure comes from technical limitations at the time. Films were run through a projector with a two-bladed shutter at 24 fps, but each frame was flashed twice, so the flicker rate seen by the audience doubles to 48 Hz, satisfying Edison's recommendation. This was later superseded by three-bladed shutters which flashed each frame three times, for 72 Hz.

So, as you can see, no-one ever really said that 24 fps was a maximum; that's a myth born of misunderstanding. In fact, 24 fps was seen as a bare minimum.
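To put quick numbers on the shutter maths above (a trivial sanity check, nothing more):

```python
# The film still advances at 24 fps; each frame is simply flashed once per
# shutter blade, so the flicker rate the audience sees is a multiple of 24.
FILM_FPS = 24

for blades in (2, 3):
    print(f"{blades}-blade shutter: {FILM_FPS} fps film -> {FILM_FPS * blades} Hz flicker")

# 2-blade shutter: 24 fps film -> 48 Hz flicker
# 3-blade shutter: 24 fps film -> 72 Hz flicker
```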

 
My eyes have consistently told me that anything above 60fps is wasted on them. So long as it doesn't drop below this I am perfectly happy. My current monitor does 165Hz I think but it really doesn't matter to me.

I see this as an advantage as I will never need as much GPU/CPU horsepower as sensitive eye peeps.

I also don't play fast-paced FPS stuff.
 
I suspect it was seeded by game makers who couldn't get the frame rate any higher. (I remember playing F1GP on the Amiga at 8 fps and, as I didn't know better, it was playable.)


I played some early 3D games (proper 3D, not stuff like Doom) on an ATI Rage II+ 3D at like 9 fps, and at the time it was "playable", but even then I knew it wasn't great. There is no way I could go back to playing like that now. I went out and bought a Voodoo 1 as soon as I could.
 

The actual study article is deep linked but for ease:



Actual hard science, brilliant. I'm on the higher frame rate boat: 30 fps on consoles just feels slow and sluggish. If it's not 100 fps on PC or a constant 60 fps on console, then it's not even worth playing in modern times :p

Just like there is a night and day difference to my eyes between 30 and 60 fps, the same applies from 60 to 100 fps. Beyond 100 fps I find it's just an additional bonus depending on the game. In HZFW, for example, quick reactions with the mouse feel faster/smoother at 139 fps than they do at a locked 100 fps. Technically speaking that makes sense, since the frame times are lower at the higher fps, but after 120 fps it's mostly diminishing returns in this specific area.
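For reference, the frame-time difference behind that observation is just simple arithmetic; the figures below are 1000 / fps, not measured values:

```python
# Frame time shrinks as frame rate rises; the 100 fps vs 139 fps example
# above works out to roughly a 2.8 ms difference per frame.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 100, 120, 139):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.2f} ms per frame")

#  60 fps -> 16.67 ms
# 100 fps -> 10.00 ms
# 120 fps ->  8.33 ms
# 139 fps ->  7.19 ms
```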

Have you tested yourself with a flashing light? From skimming the study graphs, 53-ish seems to be the drop-off?
 
I don't have a flashing light to hand, let alone the same light used by the scientists to control the flash rate. I just know what my eyes can discern on screen, especially on OLED where the pixel response time is basically 0 ms.
 
Have you tested yourself with a flashing light? From skimming the study graphs, 53-ish seems to be the drop-off?

That graph is pretty interesting. All things considered, Edison wasn't that far off with his recommended figure of 46 fps for comfortable viewing of motion pictures with no eye strain. It's why 60 Hz monitors and phone screens are 'good enough' for the vast majority of people.
 
EDIT: If a game was running at a perfect 48 fps with very low latency from all inputs and outputs, including display update rate, I think people would be surprised at the experience. It still wouldn't be a perfect one for many people, though; in reality you need a far higher frame rate to achieve something similar.

This is not possible with modern games. If it were a very basic engine outputting low-bit graphics then this would be fine, yes, but in a modern graphics-heavy title, the lower your frame rate, the higher your average system latency and the higher your latency between rendered frames.

I actually put this to the test just now in Cyberpunk, a game where I can view metrics for all of these latency areas and control the frame rate however I please. For this test, frame generation (FG) was disabled, path tracing was disabled, but ray tracing was left enabled.

You'll have to download the video and watch it locally, as I recorded it at 4K 120 fps and no online platform will play it back smoothly enough. I narrate exactly what is being experienced and observed in each test, and I go from a locked 120 fps to 90, 60, then 48. At the lower frame rates, the motion rendered on-screen is simply not acceptable for an enjoyable experience on a display that has near-0 ms pixel response times (OLED). This was tested with a 1000 Hz mouse to remove any input-device variable that might contribute to the latency observations, and because the video is recorded at 120 fps, you can visibly see the immediate results in motion smoothness and response to mouse movement input.


Be sure to download the video file rather than play it inside the Google Drive player, which won't play it at max resolution or frame rate. It is a 2.01 GB video.

The gist of this test is to demonstrate that even at 90 fps, compared to 120 fps, the visible motion is laggier and there is noticeable added input latency, since both average system latency and frame-to-frame latency have gone up. Both of these factors just keep climbing the lower I lock the frame rate.

Going above 120 fps, the difference is considerably less obvious and you enter the realm of diminishing returns, as per the graph posted further up. A 120 fps frame rate provides a perceived real-time experience on all fronts in modern gaming, and the actual metrics back this up.
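A quick way to see why the returns diminish is to compare frame-to-frame latency at each lock against the 120 fps reference; again, this is pure arithmetic, not the measured system latency from the video:

```python
# Frame time at each cap relative to a 120 fps reference: dropping below
# 120 fps costs far more milliseconds per frame than going above it saves.
REF_FPS = 120
REF_MS = 1000 / REF_FPS  # ~8.33 ms

for fps in (48, 60, 90, 120, 165, 240):
    ft = 1000 / fps
    print(f"{fps:>3} fps: {ft:5.2f} ms per frame ({ft - REF_MS:+.2f} ms vs 120 fps)")

#  48 fps: 20.83 ms (+12.50 ms vs 120 fps)
#  60 fps: 16.67 ms (+8.33 ms vs 120 fps)
#  90 fps: 11.11 ms (+2.78 ms vs 120 fps)
# 120 fps:  8.33 ms (+0.00 ms vs 120 fps)
# 165 fps:  6.06 ms (-2.27 ms vs 120 fps)
# 240 fps:  4.17 ms (-4.17 ms vs 120 fps)
```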
 