Is 1440p the next best thing?

So upon checking out benchmarks of the Titan X, it would seem single cards are starting to get to the point of powering 1440p pretty well. I'm not interested in that card due to the price, but I will likely pick up an R9 390 or 390X when they come out.

With that card upgrade I thought it might be nice to go from 1080p 120Hz to 1440p FreeSync 144Hz. But is the difference in games big enough to be worth it? Or is the main thing with them the extra desktop space that you get?

I also half considered 4K, but since it needs two higher-end cards to get good frame rates I would rather steer clear, mostly due to the micro-stuttering that still exists. If it wasn't a thing (and it most certainly is) I would strongly consider it.

So I would rather have one card that runs things smoothly instead.

What are most people's thoughts on that? Just looking for info.
 
I guess to a degree it depends a bit from person to person.

I've recently tried 1920x1080 @ 120Hz (TN), 2560x1080 @ 75Hz (IPS), 3440x1440 @ 60Hz (IPS), 3840x2160 @ 60Hz (TN) and 2560x1440 @ 60/120+Hz (IPS/TN) for gaming.

All said and done I've settled on 2560x1440 @ 120Hz: current higher-end cards can easily power that resolution at decent framerates and quality settings, and there are very few of the compatibility issues or other scaling considerations you get with 21:9 and 4K panels. I also felt a bit more connected to the game at sub-4K (a bit hard to explain), plus you get the extra screen real estate over 1920x1080.

Some games at 4K are immense, especially if you've got something like a 40"+ panel and play on a controller (not really my thing aside from racing games, etc.), but I do too much FPS gaming with keyboard and mouse, and even in slower games I really miss having 120+Hz (which is also less of an issue with a controller).

21:9 with FreeSync/G-Sync could be pretty good I think. I quite liked it for single-player gaming, but none of the current panels really have the performance/clarity for fast-paced motion tracking; without vsync, tearing is much more apparent than at 120+Hz, and even at 75Hz there is too much input latency for me to leave vsync on.

I've personally not had too much trouble with microstutter and SLI in recent games, BF4 aside, which ran pretty badly with default settings and was never completely smooth even with tweaks. There are a few caveats to that: if you need 2-3 of the latest and greatest cards just to get 30fps with a multi-GPU setup, then you will never get away from microstutter. Multi-GPU works best where you're already getting say 40-50fps with one card and want to use the additional card to top it up to a constant 60fps (or similar, but for 120fps).
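Worth noting that microstutter is about frame-time consistency rather than average fps, which is why it doesn't show up in ordinary benchmark averages. A rough sketch of how you could spot it from a frame-time log (one frame time in milliseconds per line, FRAPS-style; the filename and data are hypothetical):

```python
# Sketch: spot microstutter in a frame-time log. Assumes a FRAPS-style
# file with one frame time in milliseconds per line (filename made up).
import statistics

with open("frametimes.txt") as f:
    frame_ms = [float(line) for line in f if line.strip()]

avg_fps = 1000 / statistics.mean(frame_ms)
p99_ms = sorted(frame_ms)[int(len(frame_ms) * 0.99)]

# Microstutter shows up as big swings between consecutive frames even
# when the average looks fine - the classic alternate-frame-rendering
# signature of multi-GPU setups.
swings = [abs(b - a) for a, b in zip(frame_ms, frame_ms[1:])]

print(f"average: {avg_fps:.1f} fps")
print(f"99th percentile frame time: {p99_ms:.1f} ms")
print(f"mean frame-to-frame swing: {statistics.mean(swings):.1f} ms")
```

A single card averaging 45fps will show small swings; an SLI setup averaging 60fps can show much larger ones, which is the "higher fps that somehow feels worse" effect.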
 
I think the best thing about going to 1440p is that you can go to 27" and it feels great. For 80% of the time I think 27" is the sweet spot for gaming and in general. Another good thing about 1440p is that you get far fewer jaggies than at 1080p, where they are still apparent even with 8x MSAA (it goes without saying that not all games suffer as much from it). Is that a big deal, considering you can just use a custom resolution and achieve the same effect in terms of smoothness? That's up to you. If you don't already have FreeSync/G-Sync, I'd say definitely go for it. If you do, then just flip a coin.
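To put rough numbers on the density difference (a quick back-of-the-envelope sketch; the panel sizes are just the ones being discussed in this thread):

```python
# Quick PPI (pixels per inch) comparison for the panel sizes
# mentioned in this thread.
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixel density = diagonal length in pixels / diagonal in inches.
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h, size in [
    ('27" 1080p', 1920, 1080, 27),
    ('27" 1440p', 2560, 1440, 27),
    ('25" 1440p', 2560, 1440, 25),
    ('27" 4K',    3840, 2160, 27),
]:
    print(f"{name}: {ppi(w, h, size):.0f} PPI")
# 27" 1080p: 82 PPI, 27" 1440p: 109 PPI, 25" 1440p: 117 PPI, 27" 4K: 163 PPI
```

The jump from ~82 to ~109 PPI is why jaggies shrink noticeably at 27" 1440p, while 27" 4K at ~163 PPI is where they mostly disappear without heavy AA.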
 
Some good info here, thanks. I mainly play FPS games, so I suppose a 1440p FreeSync screen would be best then? Is there much noticeable difference in clarity from 1080p, or would the difference going to 4K be much bigger? This is assuming going from a 27" 1080p screen to one of those at the same size.
 
I've been using 1440p for the past two years, and with all the console ports coming out and limiting improvements, it is the best option. Not many cards have come out in the last two years either, so going 144Hz 1440p is probably the best idea; you won't run into VRAM limits as much either versus UHD.
 
What do you think of the increased quality of 1440p? Does it stand out much in games?

My current screen is a few years old, so it hasn't got FreeSync/G-Sync, but when I go for a new one I may as well get one with FreeSync; it seems to work very well from what I have read.
 
1440p to me is the sweet spot for gaming.

Not too hard to run if you have a top-end GPU, and the picture quality over 1080p is lush.

Any higher, i.e. 4K, and most of the time you need serious GPU grunt, although playing older games at 4K on a single GPU with good FPS is also quite nice.

For me, for the foreseeable future, 1440p is the way to go :)
 
I would say that unless you actually run 1440p and 1080p side by side, you won't really remember the difference that much. For newer games no single card really does 1440p at >100fps reliably, so it's kind of awkward. And for older games, going up in panel resolution gives trivial image-quality improvements, no more so than just downsampling. I'd still say go for 1440p though, if you plan on keeping the monitor for 4-5 years. Though with the current rate of advancement in monitors, I don't know if I'd do that either.
 

Have you compared them side by side or seen the difference?

Another option is having a 4K monitor to use as well as the one I have now, but swapping them becomes a pain in the ass. I'm not sure it's worth doing, because it will likely only be good for games that are at least five years old, so it might not get much use anyway.

You are right that at the moment monitors are advancing at a decent pace, but the power of single graphics cards is not keeping up with the advancement, not at 4K at least.
 
IIRC you can run 1080p perfectly on a 4K monitor without the blurring normally associated with non-native resolutions, since 3840x2160 is exactly double 1920x1080 on each axis, so every 1080p pixel maps cleanly onto a 2x2 block of native pixels. I think 4K would therefore be a good upgrade for most people over the next couple of years.
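That only holds where the scale factor is a whole number on both axes (and assuming the monitor's scaler actually duplicates pixels rather than interpolating, which varies by panel). A quick check of which common resolutions divide evenly into 3840x2160:

```python
# Which common resolutions map onto a 3840x2160 panel by a whole-number
# factor (integer scaling = no interpolation blur, scaler permitting).
for name, (w, h) in {"720p": (1280, 720),
                     "1080p": (1920, 1080),
                     "1440p": (2560, 1440)}.items():
    if 3840 % w == 0 and 2160 % h == 0 and 3840 // w == 2160 // h:
        print(f"{name}: clean {3840 // w}x integer scale")
    else:
        print(f"{name}: needs interpolation")
# 720p: clean 3x, 1080p: clean 2x, 1440p: needs interpolation
```

Notably 1440p does not divide evenly into UHD, which is why 1440p content can look softer on a 4K panel than 1080p content does.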
 
What do you think of the increased quality of 1440p? Does it stand out much in games?

My current screen is a few years old, so it hasn't got FreeSync/G-Sync, but when I go for a new one I may as well get one with FreeSync; it seems to work very well from what I have read.

It is a massive difference imo. I hate 1080p and could never use it for a monitor again. I've tried UHD, and the difference isn't as big as going from 1080p to 1440p.
 
1440p is definitely the sweet spot, especially with both Nvidia and AMD having downsampling support. I thought there would be almost no noticeable jaggies back when I got my 1440p display, but you still need around 4x MSAA or equivalent if you want a near-perfect image. If there's no AA support in the game you can just downsample now anyway to get the best of both. The pixel density is also pretty much perfect at 27 inch, though I'd go for a 25 inch now for the extra pixel density if I had the choice.
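For reference, Nvidia's DSR factors are multiples of the total pixel count (AMD's VSR just lists the higher resolutions directly), so the render resolution a factor implies comes from scaling each axis by the square root of the factor. A small sketch, assuming the 2560x1440 panel discussed in this thread:

```python
# Sketch: render resolutions implied by DSR-style factors on a
# 2560x1440 panel. Factors multiply total pixel count, so each axis
# scales by sqrt(factor).
import math

native_w, native_h = 2560, 1440
for factor in (2.25, 4.00):
    s = math.sqrt(factor)
    print(f"{factor}x -> {round(native_w * s)}x{round(native_h * s)}")
# 2.25x -> 3840x2160 (render at 4K, display at 1440p)
# 4.00x -> 5120x2880
```

So the 2.25x factor is effectively "render at 4K, show on the 1440p panel", which is the no-AA-support workaround mentioned above.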
 