Is it worth going over 144Hz?

Dude, no offence but that is an insane opinion, so it's difficult to take you seriously.

Playing Forza Horizon 5, the visuals are lovely and smooth at 1440p 144Hz or 165Hz on my monitor (a Dell S2721DGF, for info).

It's even fine at 120Hz, I've tested it several times... presumably because I average about 100-120fps with in-game graphics set to very high or above.

Same when playing on my LG TV, though that can only do 120Hz max.


If I force it down to 60Hz, it's, well, not good: fast-moving images look juddery, like it's dropping frames or something. It's especially obvious in a racing game at high speed.

Don't get me wrong: like others have said, going from 144Hz to 165Hz, for example, I can't really see any difference.

But the difference between 60Hz and 144Hz is very, very obvious, assuming you can run your chosen game at, say, an average of 100+ fps.
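To put some rough numbers on that point, here's a tiny Python sketch (purely illustrative, not from any post in this thread) of how long each frame stays on screen at common refresh rates:

```python
# Purely illustrative: how long each displayed frame persists on screen
# at common refresh rates, assuming the GPU can keep the monitor fed.
for hz in (60, 120, 144, 165):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per frame")
```

So 60Hz holds each frame for roughly 16.7ms versus roughly 6.9ms at 144Hz, which is the gap described above, while 144Hz to 165Hz only shaves off another ~0.8ms, which lines up with the "can't really see any difference" observation.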

If your PC is only powerful enough to run at sub-80fps averages, then I can see why you would think it makes no difference, but that's because your PC is a potatoe.

It’s difficult to take advice from someone who can’t spell potato but moving on, I’m not saying you don’t believe it. People believe all sorts of things. But they’re not backed up by science. Why can you not see any benefit between 144Hz and 165Hz? Because maybe your reality filter is finally working properly and the placebo effect has worn off?

Like I said, once you have a graphics card with a frame rate faster than your refresh rate it’s pointless paying more for the graphics card because even people who can’t spell potato know the refresh rate has to be as high as the frame rate. So they had to sell these high refresh rate monitors or people wouldn’t continue to buy stupidly expensive graphics cards. QED.
 

I'm not trying to be a D**k, so I'll answer, in line... honestly, and without picking you up on your pedantry (I spell-checked that one! ;))
Why can you not see any benefit between 144Hz and 165Hz?
It might be my eyes, I'm 44 years old. But to be fair, the old myth was that the human eye cannot see more than 60fps/60Hz... well, I can promise you that is not true, to put it politely.
As with all hardware, or anything for that matter, there are diminishing returns, but those diminishing returns probably start at 120Hz rather than 60Hz, and I would suggest 144Hz as a happy medium.
once you have a graphics card with a frame rate faster than your refresh rate

That doesn't make much sense: your frame rate is entirely dependent on your resolution and refresh rate, vis-à-vis your PC system as a whole, with a huge co-dependency on what game you are playing and what settings you want to run at.

I'm not trying to argue that running Minecraft at 800fps will look any better than running it at 60.

 
If the question is purely "will you notice a difference" between 60Hz and 144Hz, the answer is yes, you will. For me, swapping between gaming on a 4K 60Hz TV and my 27" 1440p 144Hz monitor, it's very obvious: 60Hz is fine, but 144Hz is definitely better.
FYI, I think 27" 1440p 144Hz is the sweet spot for cost/performance/size etc. Check the Members Market; there's always ones popping up in there (it's where I got mine from).
 


The potatoe spelling is inextricably linked to Dan Quayle, which is why it's so jarring.

So, to tackle your points in order:

Firstly, you are saying that established factual science is wrong and that you can perceive images at speeds that would, quite literally be super-human. The photo-chemical reaction in your eye, coupled to the speed of nervous transmission are both known and thoroughly researched. Bottom line, you’re deep into a false belief there. Maybe you are super-human. I’m going to go on the balance of probability that your belief system has driven a lot of placebo effect here.

Then you start saying that the smoothing effect lessens much over 120Hz. Again, there is no rational reason for such an effect. And why can some people perceive these effects at much higher refresh rates? Is there a gamer super-human spectrum and you’re just middling?

If you’re honest at this point then shouldn’t you be advising the OP to get the highest refresh rate monitor possible because they might be at the extreme high end of the gamer super-human spectrum?

And then you query the point about frame rate, resolution and refresh rate. There are two different parts to this. The graphics card serves the monitor with frames; anything over 24 frames per second and you see smooth motion. Then you have the monitor, which can accept frames as fast as the graphics card can send them. What it can't do is turn the individual page of pixels on and off any faster than its maximum refresh rate. So if you send a 60Hz monitor 120 frames per second, it can't refresh fast enough to show you all those frames. If you have a 120 frames per second graphics card feed, you need a 120Hz refresh monitor to even have a chance of seeing every frame. That's why the monitor and the graphics card have to synchronise, to ensure that the frames are sent neatly in between the refreshes. So if you have a 240 frames per second graphics card, it needs a 240Hz refresh rate or you've wasted your money, and the monitor will sync with the graphics card at the maximum frame rate that matches the refresh rate. Hence my point that unless you can buy a high refresh rate monitor, there is no point buying a high frame rate graphics card. The resolution is irrelevant for the purposes of this discussion, because the monitor is just turning on and off bigger blocks of filters.
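As a toy Python sketch of the frames-versus-refreshes point (my own simplification, not a real display model): a monitor can only show about one frame per refresh, so any frames the card renders beyond the refresh rate are simply never seen.

```python
# Toy sketch (an illustrative simplification, not a real display model):
# without sync, a monitor shows at most one frame per refresh, so frames
# rendered beyond the refresh rate never reach the screen.
def frames_shown(fps, hz, seconds=1):
    rendered = fps * seconds
    displayed = min(rendered, hz * seconds)
    return rendered, displayed

print(frames_shown(120, 60))   # 120 rendered, only 60 can ever hit the screen
print(frames_shown(100, 144))  # 100 rendered, all 100 can be shown
```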

I’m not saying you don’t believe you can see a difference, it’s just your brain justifying the extra cost.
 

Do u even understand motion blur bro?

The reason 24fps is used in cinema is because of motion blur. Without blur, you're gonna have a bad, bad, bad, bad time.

The same doesn't apply to games.

 


I understand persistence of vision, which you are incorrectly calling motion blur, my brother from another mother…
 
The human eye / brain ability to see discrete 'frames' is not the same as the ability to perceive fluidity of motion.

Most people can detect changes in fluidity well above their ability to see 'frames' hence the benefit of higher FPS... to a point.
 

I think you're missing what the refresh actually does (which is the point of this thread). The refresh has nothing to do with motion; it's how fast the monitor can switch the filters over the LEDs on and off so the next frame can be drawn. Frames per second <> refresh rate.

There is no such thing as moving pictures. It all relies on persistence of vision where it seems like the image is moving. There is no magic in this. 24 frames per second and you see smooth motion. That has nothing to do with the refresh rate. That’s why pretty much everything from TV to cinema to surveillance cameras use some very low number of frames per second to give a very decent motion picture experience. Would you like to guess the refresh rate of a very high quality digital cinema projector? 60Hz. Because people who REALLY care about motion think that’s smooth enough.

And where it gets really squirrely is when you understand how the monitor draws the colours for you. Depending on what scientific text you read, the ability of the cones in your eye to respond to colour is in the order of 100ms, which is the same as 0.1s, which I think is only 10Hz, so now you have a monitor that can turn the pixels on and off 6-24 times as fast as your eye can process that colour change. But as you guys are all on the gamer super-human eyesight spectrum, I guess you really can perceive those colour changes 10-17 times faster than physics suggests is possible.

Can I interest you in this Russ Andrews HDMI cable? It’s made of pure copper and it will really make your images buttery smooth. It goes to refresh rates over 5GHz. Only £2000. Plus P&P.
 

This is only applicable to content that *contains* motion blur, and with cinema there's a lot of it. If you pause a movie during any scene apart from what is essentially a still, you'll see horrendous blurring that is key to making the low fps seem alright. Games don't have this, or at least anywhere near the same amount, so benefit more from frame rate hikes. Did you have a look at the link I posted?
 
What’s it got to do with refresh rates? Refresh rates are NOT frames per second.
 
Why are you asking that? Your argument is that 24fps = smooth. My point is this is not true: it only appears smooth if motion blur is present in the frames.
 
While with adaptive sync I find 60Hz acceptable for single-player gaming, without adaptive sync it isn't even remotely so. I find a noticeable difference between that and 100+Hz, and it is far nicer to play even single-player games with 100+Hz and some form of adaptive sync.

Can't say I'd especially bother with higher than 144Hz unless into super competitive gaming.

My current gaming setup is a 43" 4K 60Hz monitor with G-Sync compatible and a 27" 1440p 144Hz monitor with full G-Sync so I do have some idea :s

Also, almost all VA displays are trash for any kind of motion gaming (there are a small number of exceptions). The 4K one I use is a VA QD-LED and almost free of the normal VA issues; not quite totally free, but they're so minor you don't really notice.
 
Can I interest you in this Russ Andrews HDMI cable? It’s made of pure copper and it will really make your images buttery smooth. It goes to refresh rates over 5GHz. Only £2000. Plus P&P.

Any chance you can hold that for me? I still have a couple of payments left on my gold-plated TOSLINK cable.
 
So the graphics card manufacturers and the monitor manufacturers will cheerfully tell you that more is better when truthfully 24fps at 59.9Hz is more than enough for smooth motion otherwise we’d not watch TV or go to the cinema.

Smooth motion maybe; responsive motion definitely not. Go into any game and cap at 24-25FPS and you'll still notice some judder (amongst other things, very few games have completely even frame pacing, so you need much higher frame rates to minimise how noticeable the variation is). It doesn't really go away until you are closer to twice that, even if you have a setup which eliminates stutter from the frame rate and refresh rate not matching (and/or matches them), and it won't feel smooth (if you are interacting with the scene rather than watching a cutscene) until you are over ~48FPS, probably closer to 60. Which is OK for single-player games, but for any kind of fast-paced gaming there is a noticeable improvement in responsiveness up to around 100FPS or so, with fast diminishing returns above that for most people.

EDIT: If you are on a console with a game capped at 30FPS where everything is matching up frame-time wise, then 30FPS will look and feel smooth, if a little laggy; but on a keyboard and mouse you will definitely notice the latency, and any kind of fast motion will make you wish you had more than 30FPS.
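A quick illustrative Python sketch of the frame-pacing point (the numbers are made up for the example): two feeds with the same ballpark average frame rate, one evenly paced and one not. The spikes in the uneven feed are what you feel as judder even though the "average FPS" looks fine.

```python
# Illustrative numbers only: two feeds with a similar average frame rate
# but different pacing. Frame-time spikes are what you feel as judder.
def worst_spike(frame_times_ms):
    avg = sum(frame_times_ms) / len(frame_times_ms)
    return max(t - avg for t in frame_times_ms)

even = [17] * 6                    # steady ~60 FPS pacing
uneven = [10, 25, 12, 28, 11, 14]  # similar average, uneven pacing

print(worst_spike(even))    # 0 -> perfectly even pacing
print(worst_spike(uneven))  # big spike above the average -> visible judder
```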
 
24 frames per second is the point at which your eyes are incapable of separating the images they see. But, and this is critical, the next image they perceive is slightly different to the last one. Which confuses your brain. So your brain fills in the gap and you fool yourself into seeing motion.

Your brain does that all by itself. You fool yourself into seeing motion.

Now, if your brain cannot separate two images at 24 frames per second because the image persists on the retina, how is your brain going to see separate images at 240 refreshes per second? It can’t. It just can’t.

But your brain has an advantage now, because you watched a YouTube video and the nice people there said you would see smoother images if you up the refresh rate. And you believe it and because you believe it you think you see it the same as you think you see motion. So it’s not science, it’s very much faith driven. You believe it’s true, so your brain sees it for you.

You’ve been persuaded by people whose livelihoods are based on testing the next latest and greatest thing that the tech manufacturers put out.

And the thing is, you can see people’s reality filters kicking in. The people who can’t see any difference over 100Hz, or the people who can’t see it over 144Hz. Despite the indoctrination and the fact that people are stating “higher refresh = smoother” is a fact the reality filter is telling you that you can’t see it. Because you can’t.

I genuinely feel like Richard Dawkins at a homeopathy conference…
 
[Attached GIF: MotionBlur_motionblur_example.gif — the same motion with blur (left) and without (right)]

Same fps, different smoothness, because motion blur exists in the left and does not in the right. Motion blur is actually present in the content. This is not something we simply perceive; it is in the data. It is not a result of anything our eyes or our brains do: it is caused by shutter speed, inherent to the method of image capture, or added artificially.

Crucially, this is not something that is inherent to image capture in games, where the images can be presented with zero motion blur (although some games actually do add it, and it assists with smoothness!).
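Here's a toy one-dimensional Python sketch of that difference (purely illustrative, not how any real camera or renderer works): a camera's open shutter averages the object's positions during the exposure into a blur streak, while a game samples one razor-sharp position per frame, leaving visible gaps between frames at low FPS.

```python
# Toy 1-D illustration of film blur vs game sharpness (not a real renderer).
def film_frame(positions_during_exposure):
    # Camera: the open shutter averages all sub-positions -> a smeared sample.
    return sum(positions_during_exposure) / len(positions_during_exposure)

def game_frame(positions_during_exposure):
    # Game engine: one instantaneous position -> a sharp sample, with a
    # visible jump to the next frame's position at low frame rates.
    return positions_during_exposure[-1]

motion = [0, 2, 4, 6, 8]   # object sweeping across during one frame interval
print(film_frame(motion))  # centre of a blur streak covering the whole sweep
print(game_frame(motion))  # one crisp position; the in-between motion is lost
```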

If you clicked the link I kindly posted above, you'd be able to play around with this effect in real time and come to understand the relationship between motion blur, smoothness, speed of movement and fps, but I guess we are where we are, and now I'm actually bothering to post a gif about this.
 