
Any reviews that compare real world differences?

My comments were more to quash the notion that rendering more frames than the screen can display is without use. Some folk will take it to heart ;)

And my nail-on-head comment was in relation to the comment that going from 120 to 150 FPS on a 120Hz display will make a negligible difference, even to response times.

The rest of my post you took umbrage at (despite me stating that response times do improve) was to demonstrate that some people confuse, or don't understand, the difference between FPS and response times.
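The claim above can be put in rough numbers with a toy latency model. This is only a sketch under simple assumptions (one frame-time to render, unsynced frame timing, so an average of half a refresh interval spent waiting for the next tick; real pipelines have more stages):

```python
def avg_input_latency_ms(fps, refresh_hz):
    """Toy model of input-to-photon latency: one frame-time to render,
    plus an average half-refresh wait for the next tick (frame phase
    assumed uniformly random relative to refresh, no vsync queueing)."""
    return 1000.0 / fps + 0.5 * (1000.0 / refresh_hz)

# On a fixed 120Hz display, higher FPS still trims latency:
for fps in (120, 150, 300):
    print(fps, round(avg_input_latency_ms(fps, 120), 2))
# 120 -> 12.5, 150 -> 10.83, 300 -> 7.5
```

Under this model the refresh rate caps what you *see*, but the render rate still shortens how stale each displayed frame is, which is the FPS-vs-response-time distinction the post is making.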
 
This is the kind of thing I'm getting at. It comes down to being told there's a difference, but I wonder if a user could differentiate between the two in a blind test.

Most people feel the difference in response times and conflate that with “seeing” higher FPS.

Sit someone in front of a benchmark on a 60Hz monitor and they would be unable to tell the difference between 60 and 150 FPS other than the screen tearing.
 
And my nail-on-head comment was in relation to the comment that going from 120 to 150 FPS on a 120Hz display will make a negligible difference, even to response times.

The rest of my post you took umbrage at (despite me stating that response times do improve) was to demonstrate that some people confuse, or don't understand, the difference between FPS and response times.

Fair enough; it was unclear to me, I guess.
 
I'd like to see someone repeatedly come back to an ever-upgrading PC and identify exactly when they were experiencing a new GPU.

I reckon it's clear at 4K, but at 1440p on a VRR monitor I imagine people would struggle to tell between, say, a 3080 and a 4090 unless they had a graph to observe.
 
I'd like to see someone repeatedly come back to an ever-upgrading PC and identify exactly when they were experiencing a new GPU.

I reckon it's clear at 4K, but at 1440p on a VRR monitor I imagine people would struggle to tell between, say, a 3080 and a 4090 unless they had a graph to observe.

VRR has been the best tech for gamers in well over a decade, IMHO. Before VRR I needed 60 FPS, but I can now live with low-to-mid 40s.
 
Interesting read, gentlemen. So, as I understand it: response time is how quickly a pixel can change, refresh rate (as in 165Hz or 144Hz) is how many times per second the screen can redraw, and FPS is how many frames per second are actually rendered for the screen to display.
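Those three quantities relate through simple conversions. A minimal sketch (toy numbers for illustration, not any particular monitor's spec):

```python
def frame_time_ms(fps):
    """How long the GPU spends producing each rendered frame."""
    return 1000.0 / fps

def refresh_interval_ms(hz):
    """How often the screen redraws, regardless of GPU output rate."""
    return 1000.0 / hz

# A 144Hz panel redraws roughly every 6.94ms. If the pixel response
# time (how quickly a pixel can change) is slower than that interval,
# pixels are still mid-transition when the next refresh arrives,
# which shows up as smearing even though the refresh rate is high.
print(round(refresh_interval_ms(144), 2))  # 6.94
print(round(frame_time_ms(60), 2))         # 16.67
```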
 
Even that isn't enough. We humans have rather good eyes, capable of perceiving frequencies up to around 2kHz. So, to faithfully reproduce everything we can see, a monitor would need a refresh rate of at least 4kHz (2× the highest frequency we can see, as per the sampling theorem).
The year is 2045. The brand-new Nvidia RTX 22,900Ti has just been released, and I can finally play games at a locked 4000fps at 8K in VR. I've bought the latest Seasonic small modular reactor PSU to power this beast. I couldn't afford an AC though, so I bought waterproof chair covers and a dehumidifier for the room.

Still need to turn off RT as I only get 45fps :p
 
Going above your monitor's/TV's refresh-rate cap is really only useful for esports; for everything else you're better off just running with FreeSync/G-Sync to prevent tearing.
 
It wasn't GPUs, but LTT recently did a video comparing the 13900K to the 7950X in a realistic head-to-head gaming test and came away saying there's no difference: buy the cheapest.
Yeah, that's people not used to playing on one or the other.

Swap out someone's 13900K or 4090 and I bet they'd notice something wrong instantly, especially if it's @sewerino26, a true connoisseur.
 
Most people feel the difference in response times and conflate that with “seeing” higher FPS.

Sit someone in front of a benchmark on a 60Hz monitor and they would be unable to tell the difference between 60 and 150 FPS other than the screen tearing.

Yes, people are looking for/noticing the artifacts of the system.

I think there's more value in screen design and post-processing than mindless FPS chasing.

We can "see" absurdly high FPS if the way of measuring "seeing" is spotting a distortion. We can't resolve the blades of a computer fan spinning at 2000rpm, but the stroboscopic effect means that, in a sense, we still can.

As long as we display motion using a sequence of hard images flashed at us there will be artifacts.
 