"The human eye can't see over 24fps" - Let this myth die, whilst new science reveals some people can identify high framerates

  • Thread starter: mrk
I had one of those monitors; it was awesome.

Yeah, the only downside I remember was that they all used those Sony Trinitron CRT tubes, which weighed an absolute ton because of the large magnets the tube needed to work. I used to take it to LANs all over the country and it was just massive, but it was worth it!
 
We did some testing with 165Hz and 240Hz monitors (esports studio). We found that almost everybody could tell the difference between 60fps and 120fps - the difference is night and day. However, once you get above 165Hz and into 240Hz it becomes a lot less apparent; I don't remember anyone reliably telling the difference between 165 and 240 any better than chance.
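For anyone curious what "better than chance" would actually take, here's a rough sketch - not our actual test protocol, just a generic 20-trial two-alternative forced-choice check with illustrative numbers:

```python
from math import comb

# Rough illustration only: in a 20-trial two-alternative forced-choice test
# ("which of these two is the 240Hz panel?"), how likely is it that someone
# guessing at random gets at least k answers right?
def p_at_least(k, n, p=0.5):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_trials = 20
for k in (12, 14, 15, 16):
    print(f"{k}/{n_trials} correct by pure guessing: p = {p_at_least(k, n_trials):.3f}")
```

Somewhere around 15/20 correct is where pure guessing becomes an unlikely explanation; nobody got near that for 165 vs 240.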

Back in the day, late nineties / early 2000s, I used to play a lot of competitive QW, and later Q3. A number of us (from what I can remember) bought Iiyama Vision Master CRTs, which would do a 200Hz refresh at 800x600 (from memory), and the difference was literally mind blowing. Going from 60Hz to 200Hz on a CRT was like stepping into a new world.
That makes sense when you look at it (pun intended): going from 16ms to 8ms frame times is more noticeable than going from 8ms to 4ms. I do find certain games are less of an issue at lower frame rates - RTS games, for instance. The other downside is that certain older ports don't like high refresh rates; I played the original Dead Space on PC and running it at 60fps breaks the game compared to 30.
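On the 16ms -> 8ms -> 4ms point, a quick sketch of the maths (simple 1000/Hz conversion, assuming a new frame is delivered on every refresh):

```python
# Quick check of the frame time maths: 1000 ms divided by the refresh rate,
# assuming one new frame per refresh.
for hz in (60, 120, 165, 240, 360):
    print(f"{hz:3d}Hz -> {1000 / hz:5.2f} ms per frame")

# Each step up buys less absolute time than the last:
# 60 -> 120 saves ~8.3 ms, 120 -> 240 saves ~4.2 ms, 240 -> 480 would save ~2.1 ms.
```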
I suspect the main reason for higher refresh rate monitors (above 120Hz or so) is just to milk gamers :p. I await being flamed and told that 360Hz is miles better even when only running at 60fps :D.
 
I await being flamed and told that 360Hz is miles better even when only running at 60fps :D.

It's like anything really - if there's a number involved, like "165Hz", it's something people can focus on, so you'll always have someone shouting about how you'll get more headshots if you increase that number from 165 to 360 - when in reality, learning to relax and not over-play will likely get you far more headshots than any monitor refresh rate :)
 
Semi related, but every time I go round to someone's house or stay in a hotel and they have motion smoothing (or whatever that particular manufacturer calls it) enabled on the TV... :mad::mad::mad: Like, how do people watch a movie where random scenes look "smooth"?
 
:edit: the "it's diminishing returns" point is obvious to anyone who understands maths, here's a graph for those who don't :p

[Graph: uXBV2kU.jpeg]

Awesome graph - thanks
 
The funny thing about 360Hz and the like is that unless you're running a game at 1080p, no modern game is going to get anywhere near that frame rate, so you can't actually leverage the refresh rate - assuming it was of any value for general gaming in the first place. And I'm talking about DLSS-enabled frame rates here; pure raster will be even lower.

A 4090 can't even break 200fps without DLSS Quality in the latest games like Forbidden West, a well optimised game overall. Even at 3440x1440 I need to be looking at literally nothing but sky to break 200fps; otherwise it sits around the 165fps mark if I disable G-Sync.

Oh, but what about the next-gen cards that are supposed to be 60% faster than a 4090?! Let's assume a 4090 gets 165fps in a modern rasterised game using upscaling: a 5090 is then only going to get up to around 264fps, still nearly 100fps shy of that refresh rate. Disable upscaling and go "native" and you're looking at even lower numbers still :p
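Back-of-envelope version of that maths, for anyone checking it - the 60% uplift is just the rumoured figure and perfectly linear scaling is assumed, neither of which is a given:

```python
# Back-of-envelope only: assumes the rumoured ~60% uplift and perfectly linear
# scaling, neither of which is a given.
current_fps = 165   # rough 4090 figure quoted above (upscaled, 3440x1440)
uplift = 1.60       # rumoured next-gen multiplier
target_hz = 360

projected_fps = current_fps * uplift
print(f"Projected: {projected_fps:.0f} fps, "
      f"shortfall vs {target_hz}Hz: {target_hz - projected_fps:.0f} fps")
```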

My view is that display makers should have stopped at 120-144Hz and put all the R&D budget into making the panels themselves absolutely bulletproof. We might have had burn-in-proof OLED long ago if they'd done that instead of chasing big numbers. Imagine having an end-game OLED monitor...

Semi related, but every time I go round to someone's house or stay in a hotel and they have motion smoothing (or whatever that particular manufacturer calls it) enabled on the TV... :mad::mad::mad: Like, how do people watch a movie where random scenes look "smooth"?
I call them smegheads in my mind, then try to advise them properly, only to return another day and find the setting back on :cry:
 
This isn't even a debate IMO. Those who don't think we can perceive higher frame rates haven't tried it, and don't understand the crucial element that is motion blur.

:edit: the "it's diminishing returns" point is obvious to anyone who understands maths, here's a graph for those who don't :p

[Graph: uXBV2kU.jpeg]

Nice. Makes sense.
 
The funny thing about 360Hz and the like is that unless you're running a game at 1080p, no modern game is going to get anywhere near that frame rate, so you can't actually leverage the refresh rate - assuming it was of any value for general gaming in the first place. And I'm talking about DLSS-enabled frame rates here; pure raster will be even lower.

Some of us run old games. A good game is still a good game even 25 years later (I'm looking at you, Freespace). I got 180+ fps at 4K with everything cranked on the original Far Cry. That was on my 3090 - it won't run on my 4090 - and I'm not sure of the frame rate on my A770, but I'm sure it's over 100fps. If my 4090 were able to run it I'd probably get near 300fps. I must break out Darkstar One (a space combat game) when I get home and see what fps I get with that.
 
Semi related, but every time I go round to someone's house or stay in a hotel and they have motion smoothing (or whatever that particular manufacturer calls it) enabled on the TV... :mad::mad::mad: Like, how do people watch a movie where random scenes look "smooth"?

This. It's seemingly on by default for all TVs but I have no idea how on earth anyone can stand to watch things with it on. It's absolutely horrible watching something where the frame rate changes all the time.

Most people don't even seem to notice it, or the difference it makes, which is alarming.

It's interesting to realise just how differently we all perceive the world in terms of how good our individual senses are.
 
This isn't even a debate IMO. Those who don't think we can perceive higher frame rates haven't tried it, and don't understand the crucial element that is motion blur.

:edit: the "it's diminishing returns" point is obvious to anyone who understands maths, here's a graph for those who don't :p

[Graph: uXBV2kU.jpeg]

This nicely highlights why it gets harder and harder to tell the difference beyond around 120Hz. I start to struggle after that, but below that it's all very obvious.
 
The jury is still out for me on this subject. I have no reason to believe the scientists are wrong, but it makes me wonder whether there are other aspects of faster monitors that the manufacturers are not highlighting. I mean, I remember the old CRT screens, and to my eyes the only thing that comes close for smoothness is OLED.

Anyway, I certainly think that people who strive for hundreds of FPS are fooling themselves.
 
I have no reason to believe the scientists are wrong, but it makes me wonder whether there are other aspects of faster monitors that the manufacturers are not highlighting.

Precisely. The scientists have tested certain specific things. Here's the abstract from the paper:

The critical flicker fusion threshold is a psychophysical measure commonly used to quantify visual temporal resolution; the fastest rate at which a visual system can discriminate visual signals. Critical flicker fusion thresholds vary substantially among species, reflecting different ecological niches and demands. However, it is unclear how much variation exists in flicker fusion thresholds between healthy individuals of the same species, or how stable this attribute is over time within individuals. In this study, we assessed both inter- and intra-individual variation in critical flicker fusion thresholds in a cohort of healthy human participants within a specific age range, using two common psychophysical methods and three different measurements during each session. The resulting thresholds for each method were highly correlated. We found a between-participant maximum difference of roughly 30 Hz in flicker fusion thresholds and we estimated a 95% prediction interval of 21 Hz. We used random-effects models to compare between- and within-participant variance and found that approximately 80% of variance was due to between-individual differences, and about 10% of the variance originated from within-individual differences over three sessions. Within-individual thresholds did not differ significantly between the three sessions in males, but did in females (P<0.001 for two methods and P<0.05 for one method), indicating that critical flicker fusion thresholds may be more variable in females than in males.

They're not, for example, looking at the long term effects of sitting in front of a 30 Hz monitor. There's a lot more to a computer monitor than the critical flicker fusion threshold.
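To make the "roughly 80% between individuals" figure a bit more concrete, here's a toy simulation - not the paper's data or analysis code, just made-up numbers showing how a random-effects style variance split works:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulation, NOT the paper's data: 30 participants, 3 sessions each,
# with most of the spread in flicker fusion thresholds between people
# rather than between sessions. The SDs below are assumptions.
n_participants, n_sessions = 30, 3
between_sd = 5.0   # assumed spread of true thresholds across people (Hz)
within_sd = 1.8    # assumed session-to-session wobble per person (Hz)

true_thresholds = rng.normal(40, between_sd, n_participants)   # per-person mean CFF (Hz)
data = true_thresholds[:, None] + rng.normal(0, within_sd, (n_participants, n_sessions))

# One-way random-effects variance components via the classic ANOVA estimators
grand_mean = data.mean()
participant_means = data.mean(axis=1)
ms_between = n_sessions * np.sum((participant_means - grand_mean) ** 2) / (n_participants - 1)
ms_within = np.sum((data - participant_means[:, None]) ** 2) / (n_participants * (n_sessions - 1))

var_within = ms_within
var_between = max((ms_between - ms_within) / n_sessions, 0.0)
share_between = var_between / (var_between + var_within)

print(f"Between-participant share of variance: {share_between:.0%}")
print(f"Range of per-person mean thresholds: {participant_means.ptp():.1f} Hz")
```

With these made-up spreads, most of the variance ends up between participants rather than within them, which is the pattern the paper describes - but it says nothing about what any of that means for actually using a monitor.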
 
Wasn't it one of the Avatar movies where some parts were at the standard cinematic frame rate and others were 60fps? That was a mind**** for those attuned to this sort of thing, as per the above conversation about motion smoothing on TVs.
 
24fps is the usual movie frame rate.
30fps is television and soap operas.
48/60fps or whatever is what some movies use to reduce motion blur, and also what some people shoot with their cameras.

Of course you can see more.
 
Most studies are nonsense anyway.

The human eye can see an effectively infinite frame rate. It's just a question of quantifying it, which just leads to denial of every other factor.

24fps is just the point where motion appears natural, and that's the part that really matters. That's also why you can see and feel the effect of a higher actual frame rate even at lower refresh rates.

Having a higher refresh rate is nice as well, but you have to accept all of it.
 
When it comes to games there is also the overlooked factor of what works in an ideal situation versus what is needed in reality to get close to it. A 60Hz/60FPS experience on a very low latency panel with wide-range VRR, for example, is very different from 60Hz where you are trading off input latency against tearing and usually need 100+ FPS to reach a reasonable compromise. Same with network latency: in an ideal situation with a high quality link, a fast server tick rate and a very good networking model, people will struggle to notice the difference below a certain point, but under real-world conditions you often need far lower latency to get anything close to the same experience.

A game with a lot of frame time deviation might require a significantly higher frame rate to achieve the same perception of smoothness.
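A rough illustration of that (made-up frame time traces, not measurements from any particular game):

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up traces, not measurements: two runs with roughly the same *average*
# frame rate, one perfectly paced and one with heavy frame time deviation.
steady = np.full(1000, 1000 / 96)                                         # ~10.4 ms every frame
jittery = rng.choice([1000 / 144, 1000 / 48], size=1000, p=[0.75, 0.25])  # mixes 6.9 ms and 20.8 ms

for name, frame_times in (("steady", steady), ("jittery", jittery)):
    avg_fps = 1000 / frame_times.mean()
    one_percent_low = 1000 / np.percentile(frame_times, 99)  # fps at the 99th-percentile frame time
    print(f"{name:8s} avg {avg_fps:5.1f} fps, 1% low {one_percent_low:5.1f} fps")
```

Both traces average roughly the same fps, but the jittery one has 1% lows around 48fps - and that is what you actually feel.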

You also need a game running at around double the lowest FPS that fools the eye into seeing smooth motion before it comes anywhere close to being responsive enough to input for a good experience (depending a bit on the input device - controller vs mouse, etc.).

EDIT: If a game were running at a perfect 48 FPS with very low latency across all inputs and outputs, including the display update rate, I think people would be surprised at the experience - though it still wouldn't be a perfect one for many people. In reality you need a far higher frame rate to achieve a similar result.
 