Optimised Graphics Settings

Hello all,

When you install a game, it sets the recommended graphics settings based on your computer spec. I know from installing HL2 in the past that it set my resolution higher than 800x600; however, since upgrading my hardware and reinstalling XP, every game recommends 800x600 with fairly basic graphics (not the complete lowest). Bioshock and COD4 have both done the same on my computer.

My computer spec:

Intel C2D E6600
Crucial Ballistix 2GB (2x1GB) DDR2 PC2-5300C3 667MHz
520W Corsair HX PSU
Gigabyte 965P-DS3P Motherboard
ATI X1800XT 256MB

Is this probably a driver problem? Maybe something to do with how the PC was built? Or are games just crap at optimising graphics settings?

Thanks :)
 
That last one right there. I've recently run through HL2 for the first time - it decided my 2900XT was only capable of 800x600, 0xAA, 0xAF, not even HDR! :o

With that card you should be looking at 1280x1024, 4xAA, 16xAF, probably even with HDR as well.
 
I think the trouble is that these 'auto-detect' features are simply making a guess based on basic factors such as CPU clock speed, total system memory and video card memory. They don't have a database of every card in existence to read from, which of course would be impossible given that some video cards come out after the game went gold. So, for example, a 256MB X1900XT might be rated as a worse card than a 512MB 7300GT by the automated settings.
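
Just to illustrate the sort of crude guess I mean, here's a rough Python sketch of that kind of heuristic. The thresholds, scoring and setting names are entirely made up for the example, not taken from any actual game:

# Hypothetical sketch of a naive auto-detect heuristic: it only looks at
# CPU clock, system RAM and video memory, with no idea which GPU it is.
def recommend_settings(cpu_mhz, ram_mb, vram_mb):
    score = 0
    score += 1 if cpu_mhz >= 2000 else 0
    score += 1 if ram_mb >= 1024 else 0
    score += 1 if vram_mb >= 256 else 0   # a fast 256MB card scores the same
    score += 1 if vram_mb >= 512 else 0   # as any other 256MB card here

    if score <= 1:
        return {"resolution": (800, 600), "textures": "low", "aa": 0}
    if score <= 3:
        return {"resolution": (800, 600), "textures": "medium", "aa": 0}
    return {"resolution": (1280, 1024), "textures": "high", "aa": 4}

# A quick 256MB card and a slow 512MB card can end up ranked the wrong way round:
print(recommend_settings(cpu_mhz=2400, ram_mb=2048, vram_mb=256))
print(recommend_settings(cpu_mhz=2400, ram_mb=2048, vram_mb=512))

Because the only GPU signal is memory size, the 512MB card always wins the comparison, which is exactly how a 7300GT could end up "beating" an X1900XT.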

Personally I always tweak my settings in any case, usually to something like this:

1680x1050x32
0xAA
16xAF
Shadows low
Most other settings on high
Audio on max

I then see how the game is performing. If it is running very well, I'll start boosting settings like AA and shadows. If it is running poorly, I'll look at turning down lighting / SFX, dropping textures to medium, and trying again.
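
In pseudo-Python my routine is basically the loop below. The setting names and FPS thresholds are just my own rule of thumb, not anything the games actually expose; measure_average_fps stands in for playing a level and watching the frame counter:

# Rough sketch of the manual tuning loop described above.
baseline = {
    "resolution": (1680, 1050),
    "aa": 0,
    "af": 16,
    "shadows": "low",
    "textures": "high",
    "lighting": "high",
}

def tune(settings, measure_average_fps):
    fps = measure_average_fps(settings)
    if fps > 60:
        # Running very well: spend the headroom on AA and shadows.
        settings["aa"] = 4
        settings["shadows"] = "high"
    elif fps < 30:
        # Struggling: drop the expensive effects first, then textures.
        settings["lighting"] = "medium"
        settings["textures"] = "medium"
    return settings

# e.g. tune(dict(baseline), lambda s: 45) leaves the baseline unchanged.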

Obviously with games more than about three years old I may be a bit more aggressive with my initial settings.

Incidentally, I seem to get a lot of that '4xAA but low res' autodetect nonsense. I reckon it would make far more sense if games defaulted to the desktop resolution, now that many people are using TFT monitors which have a native res. Obviously with a very weak graphics card some adjustment might be needed, but at the end of the day people can always turn down the res manually anyway.
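
For what it's worth, reading the desktop resolution is trivial on Windows; something along these lines (standard Win32 GetSystemMetrics via ctypes, so Windows-only) would give a game a sane default on a TFT running at native res:

import ctypes

# SM_CXSCREEN / SM_CYSCREEN give the primary display's current resolution,
# which on a TFT at native res is exactly the value a game should default to.
user32 = ctypes.windll.user32
width = user32.GetSystemMetrics(0)   # SM_CXSCREEN
height = user32.GetSystemMetrics(1)  # SM_CYSCREEN
print("Desktop resolution: %dx%d" % (width, height))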
 