Ultra settings suck!

Completely agree with this video. So many games I've played over the last couple of years have barely looked any better on the "ultra" preset than on "high/very high"; the only noticeable difference has been the FPS hit....


Personally I also find that some graphical settings on max can make a game look worse, e.g. the post-processing setting in DICE's games (BF and Star Wars Battlefront).
 
I dunno if they fixed it, but with Deus Ex: MD I found that several of the ultra settings had issues, and the game actually looked better overall with those settings on high or very high because of that. Others made very tiny visual differences but caused a 10-15% framerate hit, or worse, made the game feel less smooth at framerates that were smooth with the setting(s) off.
I played that recently and yup, that is probably the best game for showcasing the point of the above video. Some of the ultra settings looked awful compared to very high/high and had a huge FPS hit, the worst offender being the contact-hardening shadows (or whatever the setting was called); shadows looked awful with that turned on...

I will put that down to the developer though, as when the game first launched a few graphical effects were missing from the PC version :/
 
Has my vote for worst "My GPU can't keep up recommend me a new one" thread :p

Seriously though, it's always going to be diminishing returns going from high to ultra settings in most games, as the top end effects tend to be subtle. The big steps are normally low to high.
As for your DICE reference, I think their last couple of games (SWBF and BF1) look incredible on ultra settings. That's not the reason you or I play at lower settings though; it's because we want the smoothest experience and don't want the enemy to be hidden by excess graphical frippery like excessive bloom or massive amounts of particles obscuring our targets.

Disclaimer : haven't watched the vid yet as I'm at work.
haha, feel free to send me one of your 1070s considering SLI doesn't work these days, especially in your main game, BF 1!!!! :p :D

But yes, this has been the case for quite a while now, although this is the first time I've seen a youtuber talk about it and put up a decent case for it (although he could have shown more games, such as Deus Ex: MD). As he pointed out in the video, back with the likes of Crysis, going from low to high had a massive impact on visuals, whereas these days it's nowhere near as much.

And yeah, SWBF + BF 1 look stunning with high/ultra settings. I'm just referring to the post-processing setting; I find it makes the game look worse on high/ultra. IIRC it's the one that contains effects like motion blur (separate from the dedicated motion blur setting), bloom etc. Even BF 4 + 3 on high settings look very good today, although the step down to the low preset is definitely more noticeable there than going from high to low in BF 1/SWBF.

Can't remember what setting it was, but I found one that only made about a 3% difference in framerate - yet with it enabled the game needed around a 50% higher framerate before it felt as smooth as with it off :s
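That "feels less smooth at the same average FPS" effect usually shows up in frametime percentiles rather than averages. A minimal sketch of the idea, using made-up frametime numbers (not measurements from any real game): two traces with near-identical average FPS can have wildly different worst-case frametimes, and it's the spikes you feel.

```python
# Hypothetical frametime traces in milliseconds (illustrative numbers only).
smooth = [16.7] * 95 + [17.0] * 5      # setting off: consistent frame pacing
stuttery = [15.5] * 95 + [40.0] * 5    # setting on: occasional big spikes

def avg_fps(frametimes_ms):
    # Average FPS = frames rendered / total elapsed time
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def p99_ms(frametimes_ms):
    # 99th-percentile frametime: the spikes that make a game "feel" unsmooth
    return sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99) - 1]

print(f"off: {avg_fps(smooth):.1f} fps avg, {p99_ms(smooth):.1f} ms p99")
print(f"on:  {avg_fps(stuttery):.1f} fps avg, {p99_ms(stuttery):.1f} ms p99")
```

Both traces average roughly 60 FPS, but the second one's 99th-percentile frametime is more than double, which is why an FPS counter alone can say a setting is "free" while the game feels worse with it on.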

Texture setting? Could be down to hitting VRAM limit. EDIT: just re-read, "enabled"... hmmmm not sure which setting it was then....

Although textures are another setting where there is so little visual difference between high/very high and ultra, yet a huge increase in VRAM usage :/

http://images.nvidia.com/geforce-co...ractive-comparison-005-very-high-vs-high.html

http://images.nvidia.com/geforce-co...ive-comparison-001-very-high-vs-high-alt.html

http://images.nvidia.com/geforce-co...ive-comparison-002-very-high-vs-high-alt.html

http://images.nvidia.com/geforce-co...ive-comparison-003-very-high-vs-high-alt.html

http://images.nvidia.com/geforce-co...ive-comparison-004-very-high-vs-high-alt.html

No doubt there is a difference but is it worth the 3+GB extra VRAM required?
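The huge VRAM jumps make sense once you do the back-of-the-envelope maths: memory cost scales with the square of texture resolution. A rough sketch below, with illustrative numbers only (real games use compressed formats and stream textures in and out, so actual usage is lower and varies):

```python
# Uncompressed RGBA8 texture: width * height * 4 bytes, plus roughly a
# third extra for the mipmap chain. Illustrative estimate, not a real
# measurement of any game's texture budget.

def texture_mib(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base  # mip chain adds ~33%
    return total / (1024 ** 2)

# Each doubling of texture resolution quadruples the memory cost:
for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mib(size, size):.0f} MiB")
```

So a preset that bumps a lot of textures up one resolution tier can easily add gigabytes of VRAM for a difference that's only visible up close, which is exactly the trade-off in those comparison shots.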
 
My PC may be ancient but it still plays games extremely well with some adjustment ;) :D

TBH, the only games I've had to drastically drop settings in to achieve a constant 50+ FPS have been early-access titles such as ARK, Battlegrounds and other open-world survival type games.

I had to drop a ton of settings for The Division, but with DX12 I've been able to whack a lot of stuff to max (except shadows and AA; the 2 distance/geometry sliders went from 0/20% to 70/50% IIRC) and get a minimum of 50 FPS. With DX11 and much lower settings, my FPS dips were usually in the low 40s and sometimes high 30s.

Oh, and there is Rise of the Tomb Raider, although that game ran like **** for me in every way. By far the worst game to have ever been installed on my PC: crashes galore, and even when I dropped everything to low/off it still barely broke the 50 FPS window. DX12 helped massively again, but it was still a **** show for me.

Quite a few games are like this. It is always worth tinkering with settings to see if there is a difference between ultra and the next option down. Many times the visual difference can hardly be seen apart from on the FPS counter. Lol.

I remember people complaining about Deus Ex: Mankind Divided, and I would tell them to just lower or turn off this or that, and they would refuse to do so, because apparently they have a Titan X or whatever and it should work maxed out. The fella String comes to mind :p

Because people feel they need to game with everything on maximum, they end up playing at a much lower resolution than they could be. I happily game at 4K as I am flexible and happy to play with settings. In the end I end up playing with much better IQ than those with much better graphics cards who play at lower resolutions :D

In all games, the first thing I do is go and turn off all the useless junk effects that lower FPS and make the game look worse. Things like chromatic aberration, motion blur, depth of field, film grain, lens flare etc... Those alone can likely be the difference between a 1080 and a 1080 Ti in FPS. So a free upgrade for me right there, and better visuals (for me) :p
Yup, pretty much. I used to have the same thought process: "got to be able to max the game, I should be able to max the damn game!!!!" But with my ageing rig, and just not caring as much for "graphics" these days (no point having great graphics if the gameplay sucks, as I just end up alt-F4'ing after 10 minutes, i.e. Watch Dogs 2 :p), I don't even play many games now other than a handful (and they are all old games that run perfectly anyway :p). I'm content with lowering the settings and still getting a good experience in these newer unoptimised titles.

Personally I am finding myself more spoilt by attention to detail, NPC behaviour etc. in games atm, e.g. AC Unity. Whilst gorgeous looking for the lighting, textures etc., the amount of detail put into the game world, with the NPC behaviour and the sheer size of the crowds, is far more immersive than any fancy realistic-looking graphics such as SWBF/BF 1. The game world just feels alive, like you can really connect to it. TBH, Unity has somewhat ruined other games for me, as every other game just feels "empty/dead" now :(

I've noticed a trend with Nexus:
It's never black enough,
It's never wide enough,
It's never ultra enough.

Such a picky gamer :D.

:D

All about the deeeeeeeeep blacks and 21:9 ;) :cool: :D

SLI does work in BF1!

Just not very well :(
I'll PM my address for you to send over one of the paper weights then ;) :D :p
 
You can honestly tell the difference between high and ultra at high res.
Can't comment on that myself, but the youtuber is doing this comparison on a 4K display.

From what I've read, this is generally true for the texture settings though, especially if the textures have been designed with 4K in mind.
 
Some games and engines look "cleaner" on low sometimes, BF games especially. Also Unreal 4 engine games can if you turn off AA; depends what type they use.

Some effects are there just to hide things.
AA is one of the first things I turn off now. Most games, for some reason, have decided to drop SSAA, SMAA + MSAA in favour of blurry methods, i.e. TAA + FXAA. That said, I usually don't really notice any aliasing/shimmering issues anyway....

The only time I would ever use the blurry methods of AA is when there's extreme shimmering/aliasing going on, e.g. GTA 5. If a game needs some AA and doesn't have a good AA option like SMAA, then I'll look for a good SweetFX config.

Mafia 3's AA is hilarious, the highest option completely blurs the screen, it really does look like someone has smeared vaseline over your screen :o And the performance hit with it lol....



Here are some screenshot comparisons between games' high and ultra presets:

5SIt4Ju.png

jf1Io4O.png

s8VI1cO.png

ArYTwvZ.png

kVX4QCB.jpg

wzUUtt5.jpg

QUQkdgw.png


19jMzTr.png

This one is with TAA "on", as opposed to the auto-set option of "off" in the "high" preset.

TFNuSQI.png

No need to mention which is ultra and which is high since you can tell from the FPS :p

Looking back, I probably should have done a better scenario for WD 2, maybe later!

As has already been touched upon, if you go into the settings and manually tweak them, you get even better performance and very likely a game that looks better than just sticking it on "ultra".


It's disappointing how the PC gaming sector has gone overall these days IMO, especially when you look at Digital Foundry's PC comparisons (on a top-end rig costing thousands) against a PS4 Pro: games barely look any better and are certainly not worth the extra £500+ IMO. The only very noticeable difference is usually the shadows.

I just wish "ultra" brought more benefit to the visuals, even if it hampered performance far more than current games on ultra do, as at least the likes of £500+ GPUs + £300+ CPUs would then be a lot more justifiable purchases (IMO).
 