Just trying to wrap my head around a random thought.
Am I right in thinking that all monitors used to be 60 Hz, and that we now have 120 Hz monitors?
That would mean a monitor can now refresh at an absolute maximum of 120 times a second.
So if a graphics card can output a game at more than 120 fps, how could anyone perceive the difference, given that the display physically cannot change more than 120 times a second?
This isn't even getting into my suspicion that once you're over around 40 fps you're probably in placebo territory with how good a game looks, since the eye can't tell the difference - but I haven't had the privilege of gaming on a really high-spec machine, so I could just be being bitter.
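For concreteness, here's the arithmetic behind those rates - how long one frame or refresh actually lasts at each of the figures mentioned (a minimal sketch; the specific rates are just examples):

```python
# Frame/refresh interval in milliseconds for a given rate.
def interval_ms(rate_hz: float) -> float:
    """Time between successive frames (or refreshes) at a given rate."""
    return 1000.0 / rate_hz

for rate in (40, 60, 120, 240):
    print(f"{rate:>3} Hz/fps -> one frame every {interval_ms(rate):.2f} ms")
# 40 fps is a 25 ms gap; 120 Hz is about 8.33 ms; 240 fps about 4.17 ms.
```

So the question boils down to whether a few milliseconds' difference per frame is something a person can actually notice.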


Especially in twitchy shooters.