"The human eye can't see over 24fps" - Let this myth die, whilst new science reveals some people can identify high framerates

  • Thread starter: mrk
That graph is pretty interesting. All things considered, Edison wasn't that far off with his recommended figure of 46 fps for comfortable viewing of motion pictures with no eye strain. It's why 60 Hz monitors and phone screens are 'good enough' for the vast majority of people.
Were they 60Hz for a viewing reason, or because AC is ~60Hz and it helped with some old-school timing or something?
 
24fps is just cinema film fps, isn't it?

Yes, and it actually has a purpose. A large part of why 24fps has been retained for cinema is the dreamy escapist feel it bestows on a film. You can immediately tell it isn't "real", and as such one's mind is better able to escape into the story.

This is why HFR in cinema flopped so hard.
 
Yes, and it actually has a purpose. A large part of why 24fps has been retained for cinema is the dreamy escapist feel it bestows on a film. You can immediately tell it isn't "real", and as such one's mind is better able to escape into the story.

This is why HFR in cinema flopped so hard.
"After various testing and experimentation, 24 FPS emerged as the optimum rate - it was the minimum speed that supported good quality sound playback, while also being economical in terms of film stock usage."
 
Were they 60Hz for a viewing reason, or because AC is ~60Hz and it helped with some old-school timing or something?

I think it has more to do with AC being 60Hz in the USA and the evolution of displays from TV -> CRT -> LED, and it kind of stuck.

Whilst 60Hz has long been a standard, there were a handful of CRTs with high refresh rates. I had a 21" Iiyama CRT for a while which was capable of 100 Hz (or maybe better, I can't recall). But it was ridiculously huge and let out a constant high-pitched squeal, so I got rid of it.

Yes, and it actually has a purpose. A large part of why 24fps has been retained for cinema is the dreamy escapist feel it bestows on a film. You can immediately tell it isn't "real", and as such one's mind is better able to escape into the story.

It's very subjective and depends on the type of film. Some films definitely benefit from 24 fps for that 'dreamy' feel. Others, less so. Personally, I don't like that smeary effect that can often happen in action sequences at 24 fps.
 
"After various testing and experimentation, 24 FPS emerged as the optimum rate - it was the minimum speed that supported good quality sound playback, while also being economical in terms of film stock usage."

Yes, that is why it emerged as the initial standard. It isn't, however, why we still use it. We don't use film stock any longer, for example. I've looked at what multiple directors have said on this subject over the years since HFR has been available to them. The Hobbit was the first major HFR cinema release. The reason other directors haven't embraced it is that, as I said, it removes that artistic sense of escapism.
 
Yes, that is why it emerged as the initial standard. It isn't, however, why we still use it. We don't use film stock any longer, for example. I've looked at what multiple directors have said on this subject over the years since HFR has been available to them. The Hobbit was the first major HFR cinema release. The reason other directors haven't embraced it is that, as I said, it removes that artistic sense of escapism.

I wonder how much of this is conditioning though; we like 24 fps because it's traditional, what we've always seen and what we expect to see in a film. If history had played out differently, might we see 48, or 60, or even 72 fps as 'natural' too?

It's similar to the vinyl vs digital argument. I think most people feel subjectively that vinyl sounds better (unscratched, on a good hi-fi) than digital. It's certainly warmer, and people like that. But is it actually better? That's a subjective thing, and again perhaps there's some degree of conditioning.
 
I wonder how much of this is conditioning though; we like 24 fps because it's traditional, what we've always seen and what we expect to see in a film. If history had played out differently, might we see 48, or 60, or even 72 fps as 'natural' too?

It's similar to the vinyl vs digital argument. I think most people feel subjectively that vinyl sounds better (unscratched, on a good hi-fi) than digital. It's certainly warmer, and people like that. But is it actually better? That's a subjective thing, and again perhaps there's some degree of conditioning.

It isn't about 24fps being traditional, and it isn't about it being "natural". In fact it is the complete opposite: the human eye can see much more than 24fps, and increasing the fps makes the footage seem MORE natural. That is the problem: making it more natural damages the sense of immersion.

This isn't about which is better. It is about artistic vision. HFR video footage is objectively and quantifiably superior. The problem is that directors (pretty much universally) agree that immersion is important. This is why higher-end TVs now have a "FILMMAKER" setting to turn off things like motion smoothing and give people something that is technically inferior but closer to the artistic vision of filmmakers.
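For anyone who wants the arithmetic behind "higher fps looks more natural", a minimal frame-time sketch (plain arithmetic, nothing assumed beyond the rates themselves):

```python
# The gap between successive images shrinks as the rate rises;
# 24 fps leaves a gap five times longer than 120 fps does.

for fps in (24, 48, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
```

Motion smoothing interpolates extra frames to shrink that gap artificially, which is what the FILMMAKER setting switches off.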
 
It isn't about 24fps being traditional, and it isn't about it being "natural". In fact it is the complete opposite: the human eye can see much more than 24fps, and increasing the fps makes the footage seem MORE natural. That is the problem: making it more natural damages the sense of immersion.

This isn't about which is better. It is about artistic vision. HFR video footage is objectively and quantifiably superior. The problem is that directors (pretty much universally) agree that immersion is important. This is why higher-end TVs now have a "FILMMAKER" setting to turn off things like motion smoothing and give people something that is technically inferior but closer to the artistic vision of filmmakers.

Wrong choice of word by me. By 'natural' I didn't mean realistic, but perhaps 'comfortable' or 'easy on the eye'. But yeah, 'immersive' works too.

I don't disagree that 24 fps is more immersive. But I would still suggest that conditioning has a large role to play in this. 24 fps is traditional in the sense that it has become the standard for films. It's all anyone sees when they watch a film (with a few exceptions), so we learn that this is how a film should feel. So when a film like The Hobbit goes to a higher fps, it feels too real and jarring. It might be the realism that pops the viewer out of the immersion, because they can see extra details and perhaps flaws in costumes and special effects. Perhaps it's also the extra motion detail that makes it a tiring watch, especially on a cinema screen? Our eyes can't focus on the whole screen all at once.

24 fps is just a figure that came about from what worked best on a pragmatic quality-vs-cost basis in the 1920s; it's the bare minimum they could get away with without giving everyone headaches. It worked nicely and people stuck with it. Had those early filmmakers been able to do 60 fps easily and cheaply, they might have settled on that instead, and everyone would probably have got used to that as the standard reference point.
 
I think it has more to do with AC being 60Hz in the USA and the evolution of displays from TV -> CRT -> LED, and it kind of stuck.

Whilst 60Hz has long been a standard, there were a handful of CRTs with high refresh rates. I had a 21" Iiyama CRT for a while which was capable of 100 Hz (or maybe better, I can't recall). But it was ridiculously huge and let out a constant high-pitched squeal, so I got rid of it.

It's very subjective and depends on the type of film. Some films definitely benefit from 24 fps for that 'dreamy' feel. Others, less so. Personally, I don't like that smeary effect that can often happen in action sequences at 24 fps.
I know what you mean; goodbye to any desk space if you wanted a decent-sized screen. I'm not sure you'd get more than a 24-inch CRT on most desks, and you'd probably make the desk collapse as well as having no room.
 
Whilst 60Hz has long been a standard, there were a handful of CRTs with high refresh rates. I had a 21" Iiyama CRT for a while which was capable of 100 Hz (or maybe better, I can't recall). But it was ridiculously huge and let out a constant high-pitched squeal, so I got rid of it.
My memory is a bit hazy on this subject, but I recall most CRT monitors being capable of higher than 60Hz refresh rates. 60Hz was considered the bare minimum and not ideal for general usage. All the CRT monitors I had from 1996 ran at 75Hz.
 
My memory is a bit hazy on this subject, but I recall most CRT monitors being capable of higher than 60Hz refresh rates. 60Hz was considered the bare minimum and not ideal for general usage. All the CRT monitors I had from 1996 ran at 75Hz.

Yeah, all but basic CRTs did 72-85Hz, some 100. I hated where I was working back then; they bought a load of cheap CRTs which only did 60Hz, and it was unpleasant to use them for long periods, especially under fluorescent lighting.
 
My memory is a bit hazy on this subject, but I recall most CRT monitors being capable of higher than 60Hz refresh rates. 60Hz was considered the bare minimum and not ideal for general usage. All the CRT monitors I had from 1996 ran at 75Hz.

Yeah, 60Hz on a cheap CRT was painful. 75Hz and 85Hz were typical for quality CRTs. High-end CRTs could do 100 to 120 Hz; a few went higher.

The highest refresh rates weren't usually available at the maximum resolution. If your monitor could do 1600 x 1200, it would run at a lower refresh there than at 1024 x 768. So it was always a trade-off between resolution and comfort.
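That trade-off falls straight out of the horizontal scan limit: the gun draws one line at a time, so more lines per frame means fewer frames per second. A minimal sketch, assuming an illustrative 96 kHz maximum scan rate and ~5% vertical blanking overhead (both representative numbers, not any particular monitor's spec):

```python
# CRT refresh ceiling: max_horizontal_scan / total_lines_per_frame.

H_SCAN_LIMIT_HZ = 96_000   # assumed maximum horizontal scan rate
BLANKING = 1.05            # ~5% extra lines for vertical blanking (assumed)

for width, height in [(1024, 768), (1280, 1024), (1600, 1200)]:
    max_refresh = H_SCAN_LIMIT_HZ / (height * BLANKING)
    print(f"{width}x{height}: ~{max_refresh:.0f} Hz max refresh")
```

That gives roughly 119 Hz at 1024x768 but only ~76 Hz at 1600x1200, which matches the resolution-versus-comfort trade-off described above.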
 
Yeah, 60Hz on a cheap CRT was painful. 75Hz and 85Hz were typical for quality CRTs. High-end CRTs could do 100 to 120 Hz; a few went higher.

The highest refresh rates weren't usually available at the maximum resolution. If your monitor could do 1600 x 1200, it would run at a lower refresh there than at 1024 x 768. So it was always a trade-off between resolution and comfort.

A CRT that did 1600x1200 was pretty high-end. I had a 21" CRT for a while that did 1600x1200, and it was impossible to use for too long; 1280x1024 was much more pleasant on the eyes.
 
Whilst 60Hz has long been a standard, there were a handful of CRTs with high refresh rates. I had a 21" Iiyama CRT for a while which was capable of 100 Hz (or maybe better, I can't recall). But it was ridiculously huge and let out a constant high-pitched squeal, so I got rid of it.
I had a Sony Trinitron in the early or mid 2000s: 1600x1200 AFAIK, and 100Hz (didn't hurt the eyes, but it got hot like a radiator). I think it could do a higher refresh rate at lower res. Technically I still have it, stored in someone else's garage for 20 years or so.

Anything under 100Hz on a CRT always looked weird because of how the screen refreshes; it was easy to see the difference between 85 and 100. 100 seemed like a still image, 85 didn't.

Anyone remember back in the day how fluid Quake Arena etc. used to look? Like super smooth. I don't think we even have that with today's high-end monitors.
 
A CRT that did 1600x1200 was pretty high-end. I had a 21" CRT for a while that did 1600x1200, and it was impossible to use for too long; 1280x1024 was much more pleasant on the eyes.
I'm sure I had one of a similar size, and I'm fairly sure it did 100Hz at 1600x1200, although that was more than 20 years ago. It's just occurred to me that the screen size probably isn't too far off how deep it is. Thinking back, everyone was likely sat closer to CRT screens as you couldn't really push them back, so smaller screens would have been less of an issue.
 
I think the test is basically flawed for one simple reason... It assumes that a single flickering light source represents how the brain interprets data when it comes to frame rate.

The brain is highly interpretive. I would happily say a 50Hz incandescent light bulb was not flickering if just staring at it, yet if I game at 120Hz then drop the frame rate to 60Hz it feels stuttery and awful.

Many studies have been done that show all the mechanisms the brain uses to interpret the raw data, so I don't understand the crass oversimplification in this test.

A good recent example would be our Samsung fridge/freezer's filter status light: just staring at it normally you'd think it was orange, but blink or move your head around and you will suddenly see it's red/green alternating for a couple of iterations, then your brain goes back to interpreting it as solid orange. DLP projectors show how hit and miss people are with the rainbows, and again, blinking/moving your head around will suddenly show the raw information (i.e. 'rainbows'), which then settles back down to what you think is a stable picture.

On top of that you have the different response/processing rates and focus around peripheral vision, and also moving/changing information and how that is processed and how acutely aware of it you are, etc.

However, the conclusion that some people are more able to differentiate higher frame rates is obviously true. But when it comes to how this plays into gaming or sports, it's only one factor; nerve conduction and response times are made up of many parts of a chain.
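The fridge-light effect described a couple of paragraphs up can be sketched as simple temporal averaging. This is a toy model only: the alternation rate and integration window are assumed numbers, and real vision is nothing like a linear average:

```python
# Toy model of temporal colour fusion: red/green frames alternating
# faster than the eye's integration window average towards orange/yellow.
import numpy as np

RED, GREEN = np.array([255, 0, 0]), np.array([0, 255, 0])

flicker_hz = 120        # assumed LED alternation rate
integration_ms = 50     # rough visual integration window (assumed)

n = int(flicker_hz * integration_ms / 1000)
frames = [RED if i % 2 == 0 else GREEN for i in range(n)]
print(np.mean(frames, axis=0).astype(int))  # -> [127 127 0], an orange/yellow mix
```

Blinking or moving your head breaks that averaging by sweeping the image across the retina, which is why the raw red/green momentarily shows through.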
 
I'm sure I had one of a similar size, and I'm fairly sure it did 100Hz at 1600x1200, although that was more than 20 years ago. It's just occurred to me that the screen size probably isn't too far off how deep it is. Thinking back, everyone was likely sat closer to CRT screens as you couldn't really push them back, so smaller screens would have been less of an issue.

Mine was on a monitor stand that used to bow just the tiniest bit with the weight of it. It was enormous.
 
A good recent example would be our Samsung fridge/freezer's filter status light: just staring at it normally you'd think it was orange, but blink or move your head around and you will suddenly see it's red/green alternating for a couple of iterations, then your brain goes back to interpreting it as solid orange. DLP projectors show how hit and miss people are with the rainbows, and again, blinking/moving your head around will suddenly show the raw information (i.e. 'rainbows'), which then settles back down to what you think is a stable picture.

The eyes and brain do very strange things in this respect; for example: https://www.bbc.co.uk/newsround/68129970
 