Completely agree with this video. So many games I've played over the last couple of years have barely looked any better on the "ultra" preset than on "high/very high"; the only noticeable difference has been the FPS hit...
Personally I also find that some graphical settings on max can make a game look worse, e.g. post-processing in DICE's games like BF and Star Wars Battlefront.
I dunno if they fixed it, but I found with Deus Ex: MD that several of the ultra settings had issues, and the game actually looked better overall with those settings on high or very high because of that. Some also made a very tiny visual difference but could cause a 10-15% framerate hit or, worse, made the game feel less smooth at framerates that were smooth without them.
I played that recently and yup, that is probably the best game for showcasing what the above video covers. Some of the ultra settings looked awful compared to very high/high and had a huge FPS hit, the worst offender being the contact-hardening shadows or whatever the setting was called; shadows looked awful with that turned on...
I will put that down to the developer though, as a few graphical effects were missing from the PC version when the game first launched :/
Can't remember which setting it was, but I found one that only made about a 3% difference in framerate - yet with it enabled the game needed around a 50% higher framerate before it felt as smooth as with it off :s
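That "less smooth at the same framerate" effect is basically frame pacing: average FPS can hide uneven frame times. A minimal sketch with made-up frame-time numbers (not measurements from any game) showing two captures with near-identical average FPS but very different worst-case frames:

```python
# Two hypothetical frame-time traces (milliseconds per frame), one second each.
frames_smooth = [16.7] * 60                 # steady ~60 FPS
frames_jerky = [12.0] * 50 + [40.0] * 10    # same average FPS, but spiky

def avg_fps(frametimes_ms):
    """Average FPS over the capture."""
    return 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

def p99_ms(frametimes_ms):
    """99th-percentile frame time - the 'hitches' you actually feel."""
    ordered = sorted(frametimes_ms)
    return ordered[int(len(ordered) * 0.99) - 1]

print(avg_fps(frames_smooth), p99_ms(frames_smooth))  # ~60 FPS, 16.7 ms
print(avg_fps(frames_jerky), p99_ms(frames_jerky))    # ~60 FPS, 40.0 ms
```

Both traces average about 60 FPS, but the jerky one spends some frames at 40 ms (equivalent to 25 FPS), which is why a setting with a tiny average-FPS cost can still make a game feel far less smooth.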
Has my vote for worst "My GPU can't keep up recommend me a new one" thread
Seriously though, it's always going to be diminishing returns going from high to ultra settings in most games, as the top-end effects tend to be subtle. The big steps are normally from low to high.
As for your DICE reference, I think their last couple of games (SWBF and BF1) look incredible on ultra settings. That's not the reason you or I play at lower settings though; it's because we want the smoothest experience and don't want the enemy hidden by graphical frippery like excessive bloom or massive amounts of particles obscuring our targets.
Disclaimer : haven't watched the vid yet as I'm at work.
Quite a few games are like this. It is always worth tinkering with settings to see if there is a difference between ultra and the next option down. Many times the visual difference can hardly be seen, apart from on the FPS counter. Lol.
I remember people complaining about Deus Ex: Mankind Divided, and I would tell them to just lower or turn off this or that setting, and they would refuse to do so, because apparently, as they had a Titan X or whatever, it should work maxed out. The fella String comes to mind.
Because people feel they need to game with everything on maximum, they end up playing at a much lower resolution than they could be. I happily game at 4K as I am flexible and happy to play with settings. In the end I get much better IQ than those with much better graphics cards who play at lower resolutions.
In all games, the first thing I do is turn off all the useless junk effects that lower FPS and make the game look worse: things like chromatic aberration, motion blur, depth of field, film grain, lens flare etc. Those alone can easily be the difference between a 1080 and a 1080 Ti in FPS, so that's a free upgrade right there and better visuals (for me).
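For illustration only, here's a sketch of how several small per-effect costs compound into a meaningful gain when you switch them all off. The per-effect percentages below are invented placeholders, not measurements from any game:

```python
# Hypothetical framerate cost of each post-processing effect,
# expressed as a fraction of frame time (invented numbers).
effect_cost = {
    "chromatic_aberration": 0.02,
    "motion_blur": 0.03,
    "depth_of_field": 0.05,
    "film_grain": 0.01,
    "lens_flare": 0.01,
}

base_fps = 60.0  # FPS with every effect enabled

# Disabling each effect claws back its share of frame time,
# so the gains multiply rather than simply add.
fps = base_fps
for effect, cost in effect_cost.items():
    fps /= (1.0 - cost)

print(round(fps, 1))  # ~67.8 FPS after turning everything off
```

With these made-up numbers, five "small" effects together cost roughly 13% of your framerate, which is in the same ballpark as a GPU-tier step for some cards.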
haha, feel free to send me one of your 1070s considering SLI doesn't work these days, especially in your main game, BF 1!!!!
But yes, this has been the case for quite a while now, although this is the first time I've seen a YouTuber talk about it and put up a decent case for it (although he could have shown more games, such as Deus Ex: MD). As he pointed out in the video, back with the likes of Crysis, going from low to high had a massive impact on visuals, whereas these days it's nowhere near as much.
And yeah, SWBF and BF1 look stunning on high/ultra settings; I'm just referring to the post-processing setting, which I find makes the game look worse on high/ultra. IIRC it is the one that contains effects like motion blur (separate from the dedicated motion blur setting), bloom etc. Even BF4 and BF3 on high settings look very good today, although the step down to the low preset is definitely more noticeable there than going from high to low in BF1/SWBF.
Even if what you said is true, what the YouTube video says is very true too. There was such a big difference between graphics settings before, with games like Crysis; now there is very little to no difference. People just want to believe otherwise for e-peen or whatever reason. At least with a higher resolution there is actually a noticeable difference, hence why I have been happily gaming at 4K since 2014.
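For scale, the pixel-count arithmetic behind that settings-vs-resolution trade-off is straightforward:

```python
# Pixels pushed per frame at common resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["4K"] / pixels["1080p"])  # 4.0  - 4K renders 4x the pixels of 1080p
print(pixels["4K"] / pixels["1440p"])  # 2.25 - and 2.25x the pixels of 1440p
```

That 4x pixel count is why dropping a couple of subtle ultra effects can be enough to fund a whole resolution jump.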
My PC may be ancient but it still plays games extremely well with some adjustment
TBH, the only games where I've had to drastically drop settings to achieve a constant 50+ FPS have been early access titles such as Ark, Battlegrounds and other open-world survival games.
I had to drop a ton of settings for The Division, but with DX12 I have been able to whack a lot of stuff up to max (except shadows and AA; the two distance/geometry sliders went from 0%/20% to 70%/50% IIRC) and get a minimum of 50 FPS. With DX11 and much lower settings, my FPS dips were usually in the low 40s and sometimes high 30s.
Oh, and there is Rise of the Tomb Raider, although that game ran like **** for me in every way; by far the worst game to have ever been installed on my PC, crashes galore, and even when I dropped everything to low/off it still barely broke the 50 FPS mark. DX12 helped massively again, but it was still a **** show for me.
Yup, pretty much. I used to have the same thought process ("got to be able to max the game, I should be able to max the damn game!!!!"), but with my ageing rig, and just not caring as much about "graphics" these days (no point having great graphics if the gameplay sucks, as I just end up alt-F4'ing after 10 minutes, e.g. Watch Dogs 2), I'm content with lowering the settings and still getting a good experience in these newer, unoptimised titles. I don't even play many games now other than a handful (and they are all old games that run perfectly anyway).
Personally I am finding myself more spoilt by attention to detail, NPC behaviour etc. in games at the moment. Take AC Unity: whilst gorgeous-looking for its lighting, textures and so on, the amount of detail put into the game world, the NPC behaviour, the sheer size of the crowds etc. is far more immersive than any fancy realistic-looking graphics such as SWBF/BF1. The game world just feels alive, like you can really connect to it. TBH, Unity has somewhat ruined other games for me, as every other game just feels "empty/dead" now.
I like to set textures to maximum, which is why 4GB is not enough these days. I was not as bothered in the past, but as I am on 4K, higher-resolution textures are welcome. 8GB seems to be fine; it deals with every game so far and will likely continue to for a while, I would imagine, until the PS5 comes out anyway. Or maybe games like Star Citizen, which is PC-only, will push the bar.