Is it worth going over 144Hz?

The fundamental problem you face when trying to explain what you see is that the motion is caused by the image being retained on your eye: it's fixed there for roughly 1/24th of a second, then you see the next image, and your brain fills in the gap as motion. You can't shorten that 1/24th of a second, so you can refresh the image as fast as you like, but until 1/24th of a second has passed you can't see another image. If what you are saying were correct, the OP would not be asking the question. Instead, people are saying they can't see any benefit over this frame rate or that frame rate, and it's simply not possible.

It’s literally mass hysteria. You have all convinced yourselves that you can see this smoothness when it’s literally physically impossible.

Persistence of vision lasts for about 20ms, but that doesn't mean the eye/brain can't take in new information during that period. It has been shown that people can pick out information from a single frame lasting only 13ms in a stream of images, detect flicker in a constant light at up to 100-120Hz, and perceive rapid pulsed lighting at the nanosecond scale.
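As a rough sanity check on the numbers above (taking the ~20ms persistence figure at face value, purely for illustration), you can count how many distinct frames a display delivers inside one persistence window:

```python
# Rough arithmetic only: how many new frames arrive within one
# ~20 ms persistence-of-vision window at various refresh rates.
# The 20 ms figure comes from the post above, not a hard constant.
PERSISTENCE_MS = 20.0

for hz in (24, 60, 144, 240):
    frame_ms = 1000.0 / hz                       # time each frame is on screen
    frames_in_window = PERSISTENCE_MS / frame_ms
    print(f"{hz:>3} Hz: {frame_ms:5.2f} ms/frame, "
          f"~{frames_in_window:.1f} frames per 20 ms window")
```

At 144Hz roughly three different frames land within a single 20ms window, so persistence of vision does not by itself cap what the eye receives at 24fps.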
 
But it's a key element of this discussion. The fact that higher FPS increases the information available, and thus the smoothness (because we are able to perceive it), is something you're denying. Do you see no difference between them in this video?
What I can or can’t see in a video specifically designed to make me see something (like your previous motion blur gif) isn’t relevant. Please tell me how you can see things faster than the Mk 1 eyeball can perceive them? You can’t, or you would have. What you are left with is “I saw a miracle”, and you’ve convinced others they see it too. Mass hysteria. Calm down. The truth is out there.
 
Persistence of vision lasts for about 20ms, but that doesn't mean the eye/brain can't take in new information during that period. It has been shown that people can pick out information from a single frame lasting only 13ms in a stream of images, detect flicker in a constant light at up to 100-120Hz, and perceive rapid pulsed lighting at the nanosecond scale.
But if the image persists, then how can you see a refresh? Simple answer: you can’t. Your brain is fooling you, just as it fools you into thinking it sees motion.
 
You can’t shorten that 1/24th of a second, so you can refresh the image as fast as you like, but until 1/24th of a second has passed you can’t see another image.
Again, not refresh rate related.
But it is related - you've stated that we can't see images faster than 1/24th of a second, so if that's the case, why is there an obvious difference in the video when 30 and 60 fps are shown side by side?

Do you see no difference between them in this video?
Until this question is answered, the rest of his argument is irrelevant.

Please tell me how you can see things faster than the Mk 1 eyeball can perceive them? You can’t, or you would have. What you are left with is “I saw a miracle”, and you’ve convinced others they see it too. Mass hysteria. Calm down. The truth is out there.
I'm so glad I've seen a miracle then. The truth is definitely out there, and unfortunately the Earth is no longer flat. I suggest you actually try a >60Hz monitor, or indeed just watch the above video - the differences are painfully obvious, regardless of whether "the science" (or rather your interpretation of it) supports it or not.
 
The point here is that you are the one misunderstanding the fundamental nature of why 24fps content exists. The 24fps standard is the minimum that was considered acceptable for displaying moving pictures. Not the maximum. It isn't an upper limit due to persistence of vision; quite the converse - the human eye/brain having persistence of vision is what makes 24fps acceptable at all. It is the lowest acceptable frame rate.

It's similar enough for analogy (though not identical, of course) to how many kbps you can rip a music track at and still retain enough information for most people to find the audio acceptable. For example, I remember ripping at 64kbps and being happy enough, as I could still hear the song. It wasn't until I realised you could change from the default in WMP that I realised how much data was missing compared with 128+kbps. That's because, for me, 64kbps was just about enough to not make me gross out at the sound.

This is obviously an analogy, and intended to assist in explaining a similar concept outside of the context we've been discussing it within. If you come back with "but this isn't the same thing" then I really don't know where to go with you.

This fundamental misunderstanding is blocking you from being able to even engage in normal discussion around this, even when presented with what I think is pretty reasonable to say is incontrovertible evidence. The only one acting like a hysterical religious zealot is you.
 
But if the image persists, then how can you see a refresh? Simple answer: you can’t. Your brain is fooling you, just as it fools you into thinking it sees motion.

That is persistence without new information; just because your eye, and more pertinently your brain, can retain an image for a certain length of time doesn't mean it can't take in new information within that time.
 
But if the image persists, then how can you see a refresh? Simple answer: you can’t. Your brain is fooling you, just as it fools you into thinking it sees motion.

Seems you are assuming a display is some sort of perfect digital device instead of a flawed mechanical representation of reality.

If I look out of the window at a ball moving across my vision then different rods and cones are excited in my eye as the ball moves in my 'frame' of vision. My eye is stimulated by a constant stream of photons from the object in each 'persistence' period.
My brain does lots of smart magic and I 'see' an object moving. I can also do smart things like move my eyeballs or head to track which allows my brain to provide a better focused 'image' and predict the path of the object.
In actuality our eyeballs are darting about all over the place constantly to provide a composite image... but that's another level of complexity.

Now compare this to a screen.
The backlight is pulsed for a period dependent on the base brightness setting. On a low-end monitor this could be 100 times/sec; on a high-end one, 400+.
Between each pulse of light, the liquid-crystal shutter on each pixel is subjected to a voltage and then does its best to get into the correct position in time, so that the photons sent to my eye at the next flash of light are the 'correct' ones.
This cannot happen instantly for all pixels on the panel, so it takes time to refresh the screen before the backlight strobes. More expensive panels and driver electronics reduce the time needed to make the change.

My eye therefore sees a number of photon snapshots it stitches together, rather than a constant stream of photons, and each snapshot is also corrupted by the errors in the display, depending on how quickly the LCD can respond.
Depending on the alignment of my visual 'persistence', the backlight pulse, and the accuracy of the LCD shutters vs their targets at that moment, I get several doses of photons blasted into my eye during each persistence period, each a little bit different from the last in a moving image.

Looking through a window and looking at a monitor are not the same thing from the perspective of stimulating the eye, the brain perceives motion but it does not have the same subtle stimulation from the constant stream of photons.
A monitor refreshing at 180Hz better replicates the motion of an on-screen object, compared with looking out of a window, than a 30Hz or 60Hz monitor does, as it stimulates the eye in a way more consistent with reality.
I.e. with multiple smaller pulses of photons activating differing groups of rods and cones to provide the composite image rather than a few buckets of them.
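To put some (purely illustrative) numbers on that timing budget - the response-time figure below is an assumption, not a measurement of any particular panel:

```python
# Illustrative numbers only: fraction of each refresh interval an LCD
# pixel spends mid-transition, assuming a fixed grey-to-grey response
# time. RESPONSE_MS is a hypothetical figure, not a real panel spec.
RESPONSE_MS = 4.0

for hz in (60, 144, 180):
    frame_ms = 1000.0 / hz                          # refresh interval
    transitioning = min(RESPONSE_MS / frame_ms, 1.0)
    print(f"{hz:>3} Hz: {frame_ms:5.2f} ms per refresh, "
          f"pixel mid-transition ~{transitioning:4.0%} of the time")
```

Under these assumed figures, at 180Hz the shutter spends most of each refresh still moving, which is exactly the "corrupted snapshot" effect described above and why faster panels matter more at high refresh rates.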

This is the reason VR at 72Hz feels a lot worse and can induce more nausea than VR at 120Hz.
While the 'frames' we can 'see' may be capped quite low, the motion the eye can perceive is much higher, and we naturally feel when something is wrong.

The increase in perceived smoothness between my old 32" 75Hz monitor and its 165/180Hz replacement is easily detectable in racing, close-action flying games etc. Less so in FPS, at least for me.
There is still blur at speed, but things like signage at the track edge and the gradation between kerbs look better defined. I can read signage set back from the track that I can't at lower refresh rates, though closer to the track it is still blurred, as the pixel 'movement' per frame is just too high.
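The "pixel movement per frame" point is easy to quantify. Assuming, purely for illustration, an object crossing a 2560-pixel-wide screen in one second:

```python
# Hypothetical scenario: an object crossing a 2560-pixel-wide screen
# in one second. The per-frame jump shrinks as the refresh rate rises,
# which is why detail further from the motion holds up better at high Hz.
SCREEN_WIDTH_PX = 2560
CROSSING_TIME_S = 1.0
speed_px_per_s = SCREEN_WIDTH_PX / CROSSING_TIME_S

for hz in (60, 75, 165, 180):
    step_px = speed_px_per_s / hz   # distance jumped between consecutive frames
    print(f"{hz:>3} Hz: object jumps {step_px:5.1f} px per frame")
```

At 75Hz that object jumps about 34 pixels between frames; at 180Hz, about 14. Objects closer to the "camera" move more pixels per second, so their per-frame jump stays large even at high refresh rates, matching the observation above.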

I've gamed at 60-75 fps for decades, sometimes < 50fps, like with my initial run through Cyberpunk, and I've never had an issue with it. But I have noticed differences instantly when playing games I know well on a friend's PC with a high-end monitor at a high refresh rate, so I do agree with the posters who say they can feel a difference... because I can too.
 
The elephant in the room here is biological variation. I notice people with no biology background assume we are all the same. If that were the case, I would be able to run a sub-10s 100m and lift 500Kg. I can easily believe some people will see no difference whilst others will. After all, I don't hear coil whine, whilst others have the hearing of a bat and it annoys them intensely! Some people get travel sick, have no fear of heights and can actually hit a ping pong ball :cry: A trained twitch FPS gamer will notice things my 77 year old mother won't ;)

The point of this is that everyone is probably right: whether they can or can't see/feel the difference, it doesn't mean someone else is wrong.
 
To be perfectly honest, I went from a 1440p IPS 60Hz screen to a 1440p IPS 165Hz screen with Freesync, and for whatever reason I can't really say I see the benefit of a higher refresh monitor; games look and feel exactly the same as they did on my old monitor. Maybe I need to go back to 60Hz to experience the difference, but hand on heart, out of the box games don't seem any smoother to me.
 
To be perfectly honest, I went from a 1440p IPS 60Hz screen to a 1440p IPS 165Hz screen with Freesync, and for whatever reason I can't really say I see the benefit of a higher refresh monitor; games look and feel exactly the same as they did on my old monitor. Maybe I need to go back to 60Hz to experience the difference, but hand on heart, out of the box games don't seem any smoother to me.

Are you sure you are actually running the 165Hz screen above 60Hz :p

Depends what sort of games you play and at what level. Personally I find a big difference between 60Hz and ~100Hz, but rapidly diminishing returns above 100Hz. Then again, I spent many years playing games like Quake 3 online; I suspect I'd be far less sensitive to it if I'd just played slower-paced stuff all my life.
 
Are you sure you are actually running the 165Hz screen above 60Hz :p

Depends what sort of games you play and at what level. Personally I find a big difference between 60Hz and ~100Hz, but rapidly diminishing returns above 100Hz. Then again, I spent many years playing games like Quake 3 online; I suspect I'd be far less sensitive to it if I'd just played slower-paced stuff all my life.
I play RTS games like Age of Empires 2, Company of Heroes and Supreme Commander, plus Fallout: New Vegas and Skyrim, and recently I've started playing Rust. Tbh I don't play much in the way of FPS.
 
The elephant in the room here is biological variation. I notice people with no biology background assume we are all the same. If that were the case, I would be able to run a sub-10s 100m and lift 500Kg. I can easily believe some people will see no difference whilst others will. After all, I don't hear coil whine, whilst others have the hearing of a bat and it annoys them intensely! Some people get travel sick, have no fear of heights and can actually hit a ping pong ball :cry: A trained twitch FPS gamer will notice things my 77 year old mother won't ;)

The point of this is that everyone is probably right: whether they can or can't see/feel the difference, it doesn't mean someone else is wrong.

This.

I can tell a monumental difference between 30 and 60 FPS.

Less so, but I can still feel a difference up to 100FPS.

I can personally still tell a little difference between 100 and 144Hz, but only just.

My monitor is 144Hz; I personally wouldn't go higher. Depending on the game, you'd just need better hardware to push the FPS up further, and I don't think it would be worth it.

I'm not saying though you'd feel no difference between 144 and say 250, but I reckon it'd be subtle, but the added investment in hardware may not be!
 
Are you asking if it's worth switching to 144 Hz? = Yes. PC will feel quicker for every task.

Are you asking if it's worth exceeding 144 Hz? = Not really*


*For most people. If you're looking for the ultimate edge in gaming, yes, but otherwise not really. Law of diminishing returns.
 
I'm not saying though you'd feel no difference between 144 and say 250, but I reckon it'd be subtle, but the added investment in hardware may not be!

I have a 144Hz gaming monitor and a laptop with a 240Hz panel, and I don't really notice any difference, though I've not sat down to compare them. However, if I have to run the laptop at 60Hz for some reason (i.e. hooked up to a 4K monitor), the difference is very noticeable, even on the desktop.
 
60Hz comes from the old CRT video standards, where the cameras would capture images at 25 frames per second (because that’s smooth motion) and the screen would further enhance the effect by drawing each image as two interlaced fields, so it had to update twice as fast: 50Hz. The US standard was filmed at roughly 29 frames per second, hence 59.9Hz, or 60Hz if you round up.

It’s all based on the persistence of vision: the whole reason you see motion is that the image sits there in your eye until you finally see the next one, and your brain fills in the gaps. It fools you into seeing motion. I think everyone accepts that’s correct. Why then is it such a big jump that you’ve been further fooled by the monitor manufacturers, in cahoots with the graphics card manufacturers, into believing that higher refresh rates are smoother or more responsive? You literally cannot see the difference; your brain is making it up. If that makes you happy, then you shouldn’t dismiss the audiophools who buy all that Russ Andrews garbage.

Going back to TV. They film it at 30 frames per second, then they transmit it at 30 frames per second, and your TV shows it at 60/120/240/480Hz. You just see the same image on your retina. Sure, the screen is refreshing that identical image, but it’s the same image, and your brain is fooled just the same, except now you have positive reinforcement of your delusion (and it is an illusion: it’s not really moving), so you BELIEVE it’s smoother. People have done far crazier things than believe in smoother video.

The 60Hz/59.94Hz doesn't come from CRT video standards. It was picked to avoid interference with power-line frequencies. Some companies started syncing their live broadcasts to that rate. This worked with black and white, but caused problems with colour unless they reduced the refresh rate to 59.94Hz.
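The 59.94 figure falls straight out of the NTSC colour arithmetic: the original 60Hz field rate was scaled by 1000/1001 so the colour subcarrier would not beat visibly against the sound carrier:

```python
# NTSC colour timing: the original 60 Hz field rate was scaled by
# 1000/1001 so the ~3.58 MHz colour subcarrier would not produce a
# visible beat pattern against the 4.5 MHz sound carrier.
field_rate = 60 * 1000 / 1001        # fields per second
frame_rate = field_rate / 2          # two interlaced fields per frame
print(f"field rate: {field_rate:.4f} Hz")   # ~59.9401
print(f"frame rate: {frame_rate:.4f} Hz")   # ~29.9700
```

Hence the familiar 59.94Hz field rate and 29.97fps frame rate of colour NTSC.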
 
I went from 60 -> 120 & it was a massive difference. Then went 120 -> 165 and I am not sure I can really tell all that much. To me it seems like diminishing returns beyond a point.

I do a bit of cctv work and was having a bit of a debate with a training guy when he said I wouldn’t be able to tell the difference on anything over 25fps :cry:
 