Why do graphics settings sometimes make no difference?

It seems that now I can have things on high for a given FPS, and turning certain settings down or off gives only a small improvement, while many do nothing at all. In the past this was different: most settings made some difference when lowered. Now I find I may as well keep things generally high. Why is this?

If I had to guess, it might be because we're mostly playing console ports and the games don't really give more FPS. The other possibility is that GPUs have more dedicated parts for different tasks, so it's a case of using them fully or not at all, and there's nothing to be gained by leaving them idle since they won't work on anything else anyway?
 
Over the years, on top of the expected performance increase with time, graphics cards and drivers have become specifically pipelined: the hardware is designed to meet the demands of the APIs it supports, and developers in turn build their pipelines around that hardware. Large inefficiencies are still present, but they're not easy to get around with such wide-ranging hardware specs.
 
Maybe the settings you're changing don't use many resources at all?


Changing AA from, say, 8x to 2x will show a good improvement though, as it's quite intensive.


Although AA seems to have taken a step back of late: now we have the console-style "just smear the screen with Vaseline and call it AA" instead of the old "give us half your FPS and we'll make it crisp".
 
Although AA seems to have taken a step back of late: now we have the console-style "just smear the screen with Vaseline and call it AA" instead of the old "give us half your FPS and we'll make it crisp".
MSAA is the blurred type of anti-aliasing (arguably better looking than no AA at little performance cost)

SSAA is the clean type (substantially better than no AA, at a sometimes large performance cost).
 
Typically this is because you are bottlenecked in another area. For example, reducing texture size won't help much if you have plenty of VRAM / video bandwidth to spare while the CPU is getting hammered. Similar to your point about "dedicated parts for different tasks".
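
As a rough illustration of that point (the numbers below are made up, not measurements from any real game): a frame is only as fast as its slowest stage, so easing a stage that isn't the current limit changes nothing.

```cpp
// Minimal sketch: frame time is roughly the cost of the slowest stage,
// so lowering a setting that feeds a non-bottleneck stage has no effect.
#include <algorithm>
#include <cstdio>

int main() {
    // Hypothetical per-frame costs in milliseconds.
    double cpu_ms     = 18.0;  // game logic + draw-call submission
    double shading_ms =  9.0;  // vertex/pixel work on the GPU
    double texture_ms =  6.0;  // texture fetch / VRAM bandwidth

    double frame_ms = std::max({cpu_ms, shading_ms, texture_ms});
    std::printf("Frame time: %.1f ms (CPU-bound)\n", frame_ms);   // 18.0 ms

    texture_ms /= 2.0;  // e.g. halving texture resolution
    frame_ms = std::max({cpu_ms, shading_ms, texture_ms});
    std::printf("After lowering textures: %.1f ms\n", frame_ms);  // still 18.0 ms
    return 0;
}
```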

In my many years of PC gaming I have typically found that shadow settings are among the more sensitive ones, i.e. reducing shadows from their highest setting to medium or low will often give gains.
 
In my many years of PC gaming I have typically found that shadow settings are among the more sensitive ones, i.e. reducing shadows from their highest setting to medium or low will often give gains.

This is down to shadows involving rendering the scene one or more times per shadowed light source, saving that in VRAM, and then sampling it, usually 4-8 times per pixel affected by the light. This uses resources across the entire spectrum of the GPU. Turning down the resolution of the shadow map helps both when rendering the shadow map and in terms of the VRAM it uses. Turning off soft shadows drops the sample count to one per pixel within the light's range.
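
A rough sketch of that per-pixel shadow test, written as plain C++ rather than real shader code (the names and the 2x2 kernel are illustrative, not taken from any particular engine): the soft version simply repeats the hard version's depth comparison several times per pixel.

```cpp
#include <cstddef>

// Depth values of the scene as rendered from the light's point of view.
struct ShadowMap {
    const float* depths;
    int width, height;

    float at(int x, int y) const {
        if (x < 0) x = 0; if (x >= width)  x = width  - 1;  // clamp to edge
        if (y < 0) y = 0; if (y >= height) y = height - 1;
        return depths[static_cast<std::size_t>(y) * width + x];
    }
};

// Hard shadows: a single depth comparison per shaded pixel (1 = fully lit).
float hardShadow(const ShadowMap& sm, int x, int y, float pixelDepth) {
    return pixelDepth <= sm.at(x, y) ? 1.0f : 0.0f;
}

// Soft shadows (percentage-closer filtering): average several neighbouring
// comparisons, so the cost scales with the kernel size (here 2x2 = 4 taps).
float softShadow(const ShadowMap& sm, int x, int y, float pixelDepth) {
    float lit = 0.0f;
    for (int dy = 0; dy < 2; ++dy)
        for (int dx = 0; dx < 2; ++dx)
            lit += pixelDepth <= sm.at(x + dx, y + dy) ? 1.0f : 0.0f;
    return lit / 4.0f;  // fraction of taps that passed
}
```

Multiply that by every pixel a light touches, and by every shadow-casting light, and it's easy to see why the shadow slider moves the frame rate more than most.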

And the blurred anti-aliasing is due to more and more rendering engines using deferred rendering, which is what gives you the ability to have hundreds of light sources (only a couple of which will cast shadows, though) and also makes SSAO and other nifty effects easier to perform. The downside of deferred rendering is that traditional anti-aliasing isn't possible, which means post-process AA must be used, i.e. blurring based on edge recognition, which can be done well or hideously badly. There are some nicer AA solutions coming that are better, though, so hopefully more engines will start adopting them and we'll be back to the kind of AA we got in old forward renderers.
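
For what it's worth, here is a very simplified sketch of what "blurring based on edge recognition" means, again as plain C++ rather than a real shader, with a made-up contrast threshold and blend weights; real FXAA/MLAA are considerably smarter about edge direction.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Per-pixel luminance of the finished frame (post-process AA runs after rendering).
struct Frame {
    std::vector<float> luma;
    int width, height;

    float at(int x, int y) const {
        x = std::clamp(x, 0, width - 1);
        y = std::clamp(y, 0, height - 1);
        return luma[static_cast<std::size_t>(y) * width + x];
    }
};

// Returns the anti-aliased luminance for one pixel.
float postProcessAA(const Frame& f, int x, int y) {
    float c = f.at(x, y);
    float n = f.at(x, y - 1), s = f.at(x, y + 1);
    float w = f.at(x - 1, y), e = f.at(x + 1, y);

    // Edge detection: how much the luminance varies in the neighbourhood.
    float contrast = std::max({c, n, s, w, e}) - std::min({c, n, s, w, e});
    if (contrast < 0.1f)
        return c;  // not an edge: leave the pixel alone

    // On an edge: blend with the neighbours. This hides the stair-step but
    // also softens genuine detail, hence the "Vaseline" complaint above.
    return 0.5f * c + 0.125f * (n + s + w + e);
}
```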
 
I have been playing Bethesda games a lot recently, Skyrim and now Fallout New Vegas, and sometimes I get poor performance, but I find that texture packs, increased view distance, AA etc. make no difference to performance in the slightest. As soon as I touch shadows, though, I see a definite improvement if they are lowered, and FPS plummets if they are maxed. In Skyrim especially I had to turn shadows right down, whilst other options did nothing.
 
I found that when I had ATi cards the frame rate always dropped a lot when AA was on, while with Nvidia I tend to notice that some of the filtering options tax the GPU a little more than, say, my last Radeon did.
 
FXAA/Morphological AA is the blurred type of anti-aliasing (arguably better looking than no AA at little performance cost)

MSAA is the clean type localised to specific areas of the screen, although admittedly some algorithms (quincunx comes to mind) ended up rather blurry.

SSAA is the clean type spread across the entire screen (high performance cost)

Fixed... or at least updated to last year :p
 
You can always try comparing screenshots before and after certain settings are applied.

I remember there was someone who posted here that stopped playing Skyrim as soon as they saw it was barely stressing their CPU and GPU, and went back to playing Witcher 2 with Ubersampling and Dragon Age 2 with ultra textures and 8x AA because those games satisfied their fetish for GPU and CPU utilisation graphs a lot better :D.

Each to their own, but most people don't play games just for the graphics.
 
I have been playing Bethesda games a lot recently....

These games have a global lighting model because they have day and night, so all lighting effects, shadows, etc. have a dramatic impact on performance. In contrast, for example, the time of day doesn't change in Source engine games, so they don't have to deal with this nearly as much.
 