Last time I purchased a monitor was late 2007, a 22" Samsung 226BW with 2ms response time. A cracking monitor at the time.
Obviously technology has moved on quite a bit since then and we now have up to 4K resolution monitors and/or 144 Hz refresh rate monitors. Gaming, however, is still focused around 1080p at 60 FPS and will remain so for quite some time to come, as this is the standard that the recent consoles aim for, and it is also a standard which is pleasing on the eye and doesn't cause migraines, eye strain, or motion sickness.
Up until the last-gen consoles came out, PC gamers generally just had to accept that our machines were simply not going to churn out the latest titles maxed out at a full 60 FPS (or even 30 FPS in some cases). Possibly because the consoles forced devs to lower the bar and/or make more efficient use of the processing power at their disposal, this is thankfully a thing of the past. If a PC gamer has a fairly decent PC, then for all but the most cutting-edge or poorly optimised graphics technology, he can be assured of running the big-hitting AAA titles with maxed details at 1080p and 60 FPS.
This is a reality of modern PC gaming that I am very happy about. I like to just boot up a new game, turn everything right up, and get 100% smooth performance, and if not, then maybe I sacrifice some 'shadows' or MSAA to get the desired smoothness (and if I can't get this smoothness, as with titles like Arma 3 or X-Plane 10, then the game basically goes straight in the bin / never gets played).
For this reason, I have absolutely no interest in higher than 1080p monitors for my PC. This just leaves the whole 120 Hz - 144 Hz to be considered.
I want to get a good 1080p monitor that takes advantage of the latest technology to deliver the highest quality image and response. However, is there any point in getting, say, a 120Hz monitor if no single GPU is going to be able to churn out a solid v-sync'd 120 FPS? If I were to go for a 120Hz monitor, how much of a hassle is it generally to have games run at 60 Hz so that I get my smooth v-sync'd 60 FPS frame rates? Would the quality of a good 120Hz monitor running in '60Hz' mode be just the same as a good 60Hz monitor running at native resolution and refresh rate, or would there be rough edges? If I were to go for a 144Hz monitor, then I would need to run at 72Hz and 72 FPS to get that stutter-free, smooth v-sync'd experience. However, many games are designed specifically for 60 Hz (FIFA, Pro Evo) and have problems running outwith this refresh rate.
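The divisor arithmetic behind those numbers can be sketched quickly (a minimal illustration only; real monitors may also expose other fixed modes, and the function name here is just made up for the example):

```python
# With v-sync on, a steady frame rate means holding each frame for a
# whole number of refreshes, so the achievable rates are the panel's
# refresh rate divided by an integer.

def vsync_rates(refresh_hz, max_divisor=4):
    """Frame rates reachable with v-sync at a given refresh rate."""
    return [refresh_hz / d for d in range(1, max_divisor + 1)]

for hz in (60, 120, 144):
    rates = ", ".join(f"{r:g} fps" for r in vsync_rates(hz))
    print(f"{hz} Hz panel -> {rates}")
# 60 Hz panel -> 60 fps, 30 fps, 20 fps, 15 fps
# 120 Hz panel -> 120 fps, 60 fps, 40 fps, 30 fps
# 144 Hz panel -> 144 fps, 72 fps, 48 fps, 36 fps
```

Which is exactly why 120 Hz drops neatly to a v-sync'd 60 FPS, while a 144Hz panel's next step down is 72, not 60.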
I am sure many of you have experience of gaming with the sort of monitor specs I am talking about above, and I would appreciate any tips, info, or even recommendations on specific models. Just to state in advance: if anyone is the sort of gamer who prefers to play with V-sync OFF (and is therefore of the opinion that more FPS at any given instant is better and screen tearing doesn't matter), then that person probably doesn't need to leave any advice in this thread. (Apparently the majority of gamers play without V-sync and put up with and/or claim not to notice screen tearing, whereas I personally find screen tearing utterly intolerable and can't/won't play video games on consoles because of it... I would rather have 30 FPS than screen tearing.)