I find it impossible to play games these days on ultra because, quite frankly, screens have advanced so quickly while game code and GPUs have stagnated.
Take PUBG as an example - it's horribly coded, yet it's one of the most popular games on Steam. What GPU do you need to run that on ultra at 1440p @ 165Hz?
Then take games which genuinely look beautiful, like RDR2 - what do you need to run that on even high settings at 1440p and 165Hz?
First world problems, I know, but there is such a huge gap between the power needed, what is actually available, and the price of it.
My mate got rid of his 4K screen because, even with dual 1080 Tis, he still couldn't play all games maxed out.
The thing is, a lot of people say they "need" to play at ultra, or 200FPS, or 4K, etc, and then moan about GPU prices, then buy one anyway because they "need" to. The problem is they have boxed themselves into this corner, and they are at the mercy of a company. Companies don't like negative PR like gamers moaning, as they are scared it will reduce sales, but if people moan and then still buy the GPUs they moaned were "too expensive", it tells companies that the set prices are fine. Probably better not to moan then!
There is no need for any of that - run at a lower resolution, lower FPS, lower settings, etc. This is what we all had to do 15 years ago, even back in the day when there was more GPU competition. So many games are either poorly coded or designed to take advantage of future hardware, so in some ways it's a fool's errand to throw money at them beyond a certain point, IMHO OFC. Look at Crysis: I ran it in DX9, made some manual edits to the config file, and ran it at one notch below native resolution, and I was OK. But look at all the people running it in DX10, at higher resolutions, expecting higher FPS, etc, who ended up throwing tons of money at the game at the time.
Another example of a game which looks nice but was poorly optimised is ARK - I bought it years ago and promptly refunded it when it ran like crap. I was shocked at how badly optimised it was, and I was not going to throw money at it. I got it again a few years later, once the devs got off their arse and made it run better. I am not here to fund lazy devs who want to save money by skipping any optimisation of their games, so that we need to spend £100s or even more to compensate.
It's when you look at some of the basic techniques these devs DON'T implement - proper object culling, instanced geometry, etc - that you see why performance tanks: the render thread racks up excessive CPU load and a huge number of draw calls, while the game doesn't even look that hot.
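To put rough numbers on it, here's a toy sketch (my own illustration, not taken from any particular game - the scene size, the mesh counts, and the simple distance check standing in for a proper frustum test are all made up) of how culling plus batching objects by mesh collapses thousands of per-object draw calls into a handful of instanced ones:

```cpp
// Toy scene: count the "draw calls" issued by a naive renderer vs one
// that culls far objects and batches the survivors by mesh (instancing).
#include <cstdio>
#include <map>
#include <vector>

struct Object {
    int   meshId;    // which mesh this object uses
    float distance;  // distance from camera (stand-in for a real frustum test)
};

int main() {
    // Fake scene: 10,000 objects spread across 20 mesh types.
    std::vector<Object> scene;
    scene.reserve(10000);
    for (int i = 0; i < 10000; ++i)
        scene.push_back({i % 20, static_cast<float>(i % 500)});

    // Naive renderer: one draw call per object, no culling at all.
    int naiveCalls = static_cast<int>(scene.size());  // e.g. one glDrawElements each

    // Better renderer: cull anything beyond 250 units, then group what's
    // left by mesh so each mesh becomes a single instanced draw call.
    std::map<int, int> batches;  // meshId -> instance count
    for (const Object& obj : scene)
        if (obj.distance < 250.0f)
            ++batches[obj.meshId];

    int batchedCalls = static_cast<int>(batches.size());  // e.g. one glDrawElementsInstanced each

    std::printf("naive draw calls:   %d\n", naiveCalls);    // prints 10000
    std::printf("batched draw calls: %d\n", batchedCalls);  // prints 20
}
```

Same scene, 10,000 calls down to 20 - on the GPU side the batched path would issue something like one glDrawElementsInstanced per mesh instead of one draw per object, which is exactly the sort of basic work some of these engines skip.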
Unlike in the past, we have so many games - how many people have huge libraries of games that they have barely touched? I could stop buying any new games for the next two years and still probably not finish my backlog. So now I am making sure that, if games want to take the **** with hardware specs, they'd better have a good reason to!