Is it worth going over 144Hz?

[Attached GIF: motionblur_example.gif]

Same fps, different smoothness, because motion blur is present on the left and absent on the right. Motion blur is actually present in the content: it is not something we simply perceive, it is in the data. It is not a result of anything our eyes or brains do; it is caused by shutter speed, inherent to the method of image capture, or added artificially.

Crucially, this is not inherent to image capture in games, where the images can be presented with zero motion blur (although some games do add it, and it assists with smoothness!).

If you clicked the link I kindly posted above, you'd be able to play around with this effect in real time and come to understand the relationship between motion blur, smoothness, speed of movement and fps, but I guess we are where we are and here I am posting a gif about it.
Still not to do with refresh rates. You’re artificially making one blurry. That’s cheating.
 
I can happily game on 60Hz, so no, not worth going over 144Hz for me, but I do see a benefit going from 60 to 144 or 165.
Never tried higher.
 
Still not to do with refresh rates. You’re artificially making one blurry. That’s cheating.
But the point I'm making is against your point about FPS, where you say 24fps is enough for smooth motion. My point is that 24fps only looks OK because motion blur is present in the frames. Once you understand this, everything else follows. Motion blur introduced by processing and motion blur introduced by shutter speed have the same smoothing effect - that's actually a key point in the proof.
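To make that concrete: you can approximate shutter-style blur in software just by averaging consecutive frames. A minimal sketch in Python (the frame array shape and the blend window are illustrative assumptions, nothing taken from the gif above):

Code:
import numpy as np

def simulate_shutter_blur(frames, window=4):
    """Blend each frame with the previous window-1 frames.

    Averaging consecutive frames approximates the blur a camera picks up
    while its shutter is open: fast-moving edges get smeared along their
    direction of travel, which is what makes low frame rates read as
    smooth rather than stuttery.
    """
    frames = np.asarray(frames, dtype=np.float32)  # shape (n, h, w) or (n, h, w, 3)
    blurred = np.empty_like(frames)
    for i in range(len(frames)):
        start = max(0, i - window + 1)
        blurred[i] = frames[start:i + 1].mean(axis=0)
    return blurred.astype(np.uint8)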
 
Now, if your brain cannot separate two images at 24 frames per second because the image persists on the retina, how is your brain going to see separate images at 240 refreshes per second? It can’t. It just can’t.

The average human eye/brain can detect pulsed light down to tens of nanoseconds, and most people can detect flicker in lighting lasting as little as 8-9ms - it is a lot more complex than persistence of vision. But even if the human eye couldn't detect over 24FPS, there is still a problem in that games do not output evenly timed frames. At 24FPS you might have, say, 17 frames in the first half of the second (equivalent to 34FPS), then 7 frames in the last half (equivalent to 14FPS), and even then the frame pacing might be so off at times that you are experiencing more like 10-12FPS during parts of that second - you need a much higher frame rate to offset that uneven distribution of frames.
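To put some made-up numbers on the pacing point (the timestamps below are invented purely for illustration):

Code:
# Invented timestamps (seconds) for one second of nominally 24FPS output:
# 17 frames bunched into the first half, 7 spread over the second half.
timestamps = [i * 0.5 / 17 for i in range(17)] + [0.5 + i * 0.5 / 7 for i in range(7)]

def instantaneous_fps(times):
    """Per-frame rate implied by the gap to the previous frame."""
    return [1.0 / (b - a) for a, b in zip(times, times[1:])]

rates = instantaneous_fps(timestamps)
print(len(timestamps), "frames in the second")        # 24 overall
print(round(max(rates)), "FPS in the fast stretch")   # 34
print(round(min(rates)), "FPS in the slow stretch")   # 14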
 
End of the day it is pretty easy to see: cap a game to 24-25FPS, even using CPU wait to enforce frame times as even as possible, play for a bit, then uncap it - the difference is very noticeable, and you can easily see a benefit all the way up to well past 60FPS/60Hz depending on the person.

I would do a video but most videos alter the effect of it somewhat (and YT only goes to 60FPS) and additionally it doesn't have the same impact as feeling the difference in an interactive scene.
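If anyone wants to try the cap themselves, a crude limiter is easy to bolt onto a render loop. This is only a sketch of the CPU-wait idea - render_frame is a stand-in for whatever your loop actually does:

Code:
import time

TARGET_FPS = 24
FRAME_TIME = 1.0 / TARGET_FPS

def run_capped(render_frame, seconds=10):
    """Call render_frame() at TARGET_FPS with pacing as even as possible.

    Sleeping most of the wait and busy-waiting ("CPU wait") for the last
    couple of milliseconds keeps frame times much more even than a plain
    sleep() would.
    """
    end = time.perf_counter() + seconds
    next_frame = time.perf_counter()
    while time.perf_counter() < end:
        render_frame()
        next_frame += FRAME_TIME
        remaining = next_frame - time.perf_counter()
        if remaining > 0.002:
            time.sleep(remaining - 0.002)   # coarse wait
        while time.perf_counter() < next_frame:
            pass                            # spin for the remainder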
 
Anything over 59.9Hz is flicker-free and 24 frames per second is smooth motion. What you convince yourself of beyond that is up to you.
I don't need to convince myself of anything - I've experienced it, whereas you seem to have convinced yourself that you won't even try anything >60Hz.

I tried 4k screens at work when they first came out and most cards could only run them at 30hz - even non-techy users who came into my office could see the difference and asked what was wrong with it!

My main monitors at work are 60Hz, at home I have a 75Hz, and the kids have 144Hz - I can tell the difference.

Edit: I was also (un)lucky enough to see one of the Hobbit films in high frame rate - again absolutely night and day in terms of motion smoothness, but apparently I shouldn't be able to see a difference?
 
Let him run at 60hz/1080/VA. He seems happy, and at the end of the day, that's what counts.

Probably time to lock the thread, or take it to GD.
 
End of the day it is pretty easy to see: cap a game to 24-25FPS, even using CPU wait to enforce frame times as even as possible, play for a bit, then uncap it - the difference is very noticeable, and you can easily see a benefit all the way up to well past 60FPS/60Hz depending on the person.

I would do a video but most videos alter the effect of it somewhat (and YT only goes to 60FPS) and additionally it doesn't have the same impact as feeling the difference in an interactive scene.
You constantly mix up frame rate and refresh rate. They’re not the same thing. The reason pretty much all religions suppress science is that reality and belief tend not to mix well. Just consider what I wrote about your brain convincing itself it sees motion and then honestly ask yourself if maybe it’s not the Emperor’s New Clothes?
 
Let him run at 60hz/1080/VA. He seems happy, and at the end of the day, that's what counts.

Probably time to lock the thread, or take it to GD.
Why lock it? Because the heretic might actually start getting through to the sheep in the flock? Simple truth will out. Your mind is lying to you. And no-one likes to admit they've been fooled.
 
You constantly mix up frame rate and refresh rate. They’re not the same thing. The reason pretty much all religions suppress science is that reality and belief tend not to mix well. Just consider what I wrote about your brain convincing itself it sees motion and then honestly ask yourself if maybe it’s not the Emperor’s New Clothes?

I'm not mixing up frame rate and refresh rate at all - there might be occasions where my phrasing is a bit clunky. I do video game development/modding as a hobby, so I have a pretty good idea of the difference between the two and what is going on underneath.

I've been doing this for near 30 years :s
 
I don't need to convince myself of anything - I've experienced it, whereas you seem to have convinced yourself that you won't even try anything >60Hz.

I tried 4k screens at work when they first came out and most cards could only run them at 30hz - even non-techy users who came into my office could see the difference and asked what was wrong with it!

My main monitors at work are 60Hz, at home I have a 75Hz, and the kids have 144Hz - I can tell the difference.

Edit: I was also (un)lucky enough to see one of the Hobbit films in high frame rate - again absolutely night and day in terms of motion smoothness, but apparently I shouldn't be able to see a difference?

60Hz comes from the old CRT video standards, where the cameras would capture images at 25 frames per second (because that's smooth motion) and the screen would further enhance the effect by drawing each frame as two interlaced fields - odd lines, then even lines - so it had to update twice as fast, hence 50Hz; the US standard was filmed at roughly 29.97 frames per second, hence 59.9Hz, or 60Hz if you round up.
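Spelling that arithmetic out, with the exact NTSC figure of 30000/1001 frames per second in place of "29-ish":

Code:
# Interlacing splits each frame into two fields (odd lines, then even lines),
# so the display refreshes twice per captured frame.
pal_fps = 25
ntsc_fps = 30000 / 1001            # ~29.97 frames per second
print(pal_fps * 2)                 # 50    -> PAL's 50Hz
print(round(ntsc_fps * 2, 2))      # 59.94 -> the 59.9Hz/60Hz figure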

It’s all based on the persistence of vision - the whole reason you see motion is that the image sits there in your eye and then you finally see the next one and your brain fills in the gaps. It fools you into seeing motion. I think everyone accepts that’s correct. Why then is it such a big jump that you’ve been further fooled by the monitor manufacturers, in cahoots with the graphics card manufacturers, into believing that higher refresh rates are smoother or more responsive? You literally cannot see the difference and your brain is making it up. If that makes you happy then you shouldn’t dismiss the audiophools who buy all that Russ Andrews garbage.

Going back to TV: they film it at 30 frames per second, then they transmit it at 30 frames per second and your TV shows it at 60/120/240/480Hz. You just see the same image on your retina. Sure, the screen is refreshing that identical image, but it’s the same image and your brain is fooled just the same, except now you have positive reinforcement of your delusion (and it is an illusion - it’s not really moving), so you BELIEVE it’s smoother. People have done far crazier things than believe in smoother video.
 
I'm not mixing up frame rate and refresh rate at all - there might be occasions where my phrasing is a bit clunky. I do video game development/modding as a hobby, so I have a pretty good idea of the difference between the two and what is going on underneath.

I've been doing this for near 30 years :s
Then you’ll already know what I’ve typed above about how video standards work. I’m not going to apologise for pointing out that the Emperor has no clothes on.
 
Then you’ll already know what I’ve typed above about how video standards work. I’m not going to apologise for pointing out that the Emperor has no clothes on.

What you are talking about are the minimums to see smooth motion - it doesn't mean that you can't notice any increases in smoothness.

End of the day, even if what you are talking about were correct - and it isn't - it is really easy to go into any game, cap the frame rate to 24-25FPS, set refresh to whatever you want or use adaptive sync, and compare it to higher frame rates. The way games work means that frame rates below somewhere in the 40s just aren't ideal. Even on consoles, where you might have a game, machine architecture and display optimised around a fixed 30FPS, it might look smooth but you'll still feel the latency somewhat, even on a controller and very much so with mouse input, and in games which support 60FPS there is a noticeable difference in smoothness. There is no aspect of the brain being fooled or fooling you there.
 
The fundamental problem you face with trying to explain what you see is that the motion is caused because the image is retained on your eye and it’s fixed there for roughly 1/24th of a second and then you see the next image, your brain fills in the gap as motion. You can’t shorten that 1/24th of a second so you can refresh the image as fast as you like but until 1/24th of a second has passed you can’t see another image. If what you are saying is correct then the OP would not be asking the question. Instead people are saying they can’t see any benefit over this frame rate or that frame rate and it’s simply not possible.

It’s literally mass hysteria. You have all convinced yourselves that you can see this smoothness when it’s literally physically impossible.
 
It’s literally mass hysteria. You have all convinced yourselves that you can see this smoothness when it’s literally physically impossible.
So Soap Opera effect isn't a thing? Because my eyes can't possibly see the extra frames when viewing 60hz broadcast material or indeed the 48fps High Framerate version of the Hobbit I previously mentioned?

3:2 pulldown stutter (or the equivalent when watching 24fps content at any refresh rate that isn't a multiple of 24Hz) shouldn't be a thing either, because my eyes couldn't possibly discern frame rate variation once it's at least 24fps.
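For anyone who hasn't seen the cadence written out, here is roughly what 3:2 pulldown does to 24fps material on a 60Hz display (the frame labels are just illustrative):

Code:
def three_two_pulldown(frames):
    """Map 24fps frames onto 60Hz refreshes.

    Frames are held for 3 refreshes, then 2, alternating; the uneven hold
    times are the judder people notice - which they could only notice if
    they can perceive timing differences well above 24fps.
    """
    refreshes = []
    for i, frame in enumerate(frames):
        refreshes.extend([frame] * (3 if i % 2 == 0 else 2))
    return refreshes

print(three_two_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] -> 4 frames over 10 refreshes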
 
So Soap Opera effect isn't a thing? Because my eyes can't possibly see the extra frames when viewing 60hz broadcast material or indeed the 48fps High Framerate version of the Hobbit I previously mentioned?

3:2 pulldown stutter (or the equivalent when watching 24fps content at any refresh rate that isn't a multiple of 24Hz) shouldn't be a thing either, because my eyes couldn't possibly discern frame rate variation once it's at least 24fps.
Soap opera effect is because of the camera they film it with. Can we keep this on topic please?
 
Again, not refresh rate related.
But it's a key element of this discussion. The fact that higher FPS increases the information available and thus the smoothness (because we are able to perceive it) is something you're denying. Do you see no difference between them in this video?
 