I bought a 1440p 144Hz monitor a couple of years ago from OCUK. For the first year or so I loved it, but after the novelty wore off I can't help but wonder if I made the wrong choice. Gaming at 1440p is becoming more and more expensive, especially if you're like me and want to run at high refresh rates, consistently over 100fps.

For the last two months I have been looking at building a new PC as mine is showing its age now, but I keep coming unstuck when I look at what GPU to get. Seeing as SLI is all but dead now, I pretty much have to go with either a 2080/2080 Ti or possibly the new Radeon VII to guarantee high fps at 1440p, and they are all so bloody expensive. I am seriously debating going back to 1080p so I can play at 144Hz without getting my pants pulled down.
Am I mad, or does anyone else feel the same way?
Strange, I have a 1440p screen and had no issues with an RX 480. Eventually I got a Vega when it was cheap, with a bundled game I wanted anyway bringing the price down further.
Because there is this thing you can do: don't turn on monumentally performance-destroying graphics settings that offer little to no IQ improvement. DoF reduces performance and actively reduces IQ. If you're looking at the part of the screen where it's working, it shouldn't be working, because it's not realistic; it's just blurring an image you're trying to focus on. If you're not looking at that part, you're burning excess GPU power for no benefit. Motion blur, same deal. Ultra shadows look all but identical to high but use 20% more power, because once in a blue moon one shadow looks marginally different, and even then you can only notice it if you stop and stare at the ground; you literally can't notice it while running through a level shooting at people... yeah, turn that off.
If people insist on going into a graphics menu and whacking everything to ultra/mega modes, or enabling the "we added this because Nvidia wanted to sell more high-end cards" settings (they are often not called that in game), then they'll have a hard time maintaining FPS. If you instead go through the settings sensibly and turn things down wherever you find little to no IQ change but a significant performance bump, then playing at 1440p is no problem.
Also, you know, set the game to 1080p and let scaling handle the rest.
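To put rough numbers on why that helps so much, here's a quick back-of-the-envelope sketch (plain Python, just arithmetic, no game-specific assumptions) of how many pixels the GPU actually has to shade at each resolution:

```python
# Pixel counts: why rendering at 1080p on a 1440p panel frees up so
# much GPU headroom.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440)}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(f"1080p: {pixels['1080p']:,} pixels")  # 2,073,600
print(f"1440p: {pixels['1440p']:,} pixels")  # 3,686,400
print(f"1440p is {pixels['1440p'] / pixels['1080p']:.2f}x the shading work")  # ~1.78x
```

Shading cost doesn't scale perfectly linearly with pixel count (geometry and CPU work don't care about resolution), but it's the biggest single lever, which is why dropping the render resolution recovers so many frames.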
What I find with gaming is that people paint themselves into these artificial corners where 1440p "isn't working", even though choosing 1080p in the game is an option; you just won't use it. Like I said, I've never had a game where the framerate is peeing me off. Freesync helps in some games, but my framerate is generally high enough that I'm not bothered. This is from someone who has been telling people for years that 120Hz is better than 60Hz even for desktop work, let alone gaming. But it's also someone who will tell you that the difference between 60 and 80 fps is much bigger than the difference between 80 and 100, which is much bigger than the difference between 100 and 120, and so on.
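That diminishing-returns point falls straight out of the frame-time maths: what you actually perceive is the gap between frames, not the fps number. A quick sketch of the arithmetic (again plain Python, nothing assumed beyond the chosen framerate steps):

```python
# Frame-time deltas between common framerate steps: each extra 20fps
# shaves less off the frame time than the step before it.
rates = [60, 80, 100, 120, 144]

frame_ms = {fps: 1000 / fps for fps in rates}  # milliseconds per frame

for lo, hi in zip(rates, rates[1:]):
    saved = frame_ms[lo] - frame_ms[hi]
    print(f"{lo} -> {hi} fps: frames arrive {saved:.1f} ms sooner")
# 60 -> 80: 4.2 ms,   80 -> 100: 2.5 ms,
# 100 -> 120: 1.7 ms, 120 -> 144: 1.4 ms
```

Going from 60 to 80fps shaves more off every frame (about 4.2ms) than going from 100 all the way to 144 (about 3.1ms), which is exactly why chasing the last few hz costs so much for so little.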
It's very much a diminishing-returns situation. Stop getting hung up on ultra settings and realise that in probably 95% of games the top couple of settings offer minor or no real IQ gains for a potential 50% performance penalty.
The rare game is so badly optimised that there is little difference between high and even the lowest settings and you're getting 50fps either way... though at 1080p in those games you'd probably still only end up with 60fps, because the bottlenecks are all over the place: CPU, the game itself, and strange engine limitations like 60fps caps tied to the physics engine (screw you, Bethesda). Some games are going to run poorly no matter what; the rest can almost all be made to run great on even midrange cards at higher resolutions.