Anti-aliasing appreciation thread (lol)

Seriously, anti-aliasing has to be the most important graphical feature. It's so obvious when you play games like GTA IV & Borderlands; they are so ugly it's unreal, even at 1080p, without some kind of forced anti-aliasing. The drawbacks of not having any are worse than going without AF. No wonder developers are going all out to advance it: ubersampling in the REDengine, 16x CSAA and 16x MSAA in the works, FXAA or MLAA to provide anti-aliasing on lower-end machines, etc. I've heard a lot of people say it's worth turning it off in order to run a game at 1080p if your hardware isn't up to scratch; I honestly have no idea how they can think that. I'd rather run 1280x720 with AA on than 1920x1080 with it off, any day of the year. Any developer who chooses not to have it in their game is a complete moron, Gearbox being the worst. Cel-shaded without AA? Biggest. Graphical. Oversight. Ever.

In case people don't know about it (I only just found out about it myself), there is something that forces AA into your games: the FXAA found here. Probably a pointless thread, as people seem to be able to game without being bothered by jaggies (my hat's off to them, I am jealous), but I hope there are some people in the same boat as me! :p
 
I game at 1920x1080 on a 21.5" monitor and most games look sharp with no AA enabled.

I sometimes disable AA in a few games as my spec is a bit old hat, and it looks good to me.

Dunno, I'd play Microsoft Excel if the gameplay was good; I'm not bothered about good graphics.

To be serious for a moment, the oldest game I play is UT 99 and I couldn't give a hoot about the graphics. In fact, I think it looks quite good!
 
I never noticed any issues on consoles with AA though, but then again I didn't even know what it was! :p

As for graphics, no, I don't care either. I can play Deus Ex 1, the old Hitman titles and the old Resident Evil titles with no issues and enjoy them as much as anything. But when you have a game with really good, modern graphics and no AA, all the other features (shaders, lighting, textures etc) really make the lack of AA stand out. Have you played GTA IV? Perfect example: the game's still a looker with FXAA, but one of the ugliest titles out without it.
 
I've heard a lot of people say it's worth turning it off in order to run a game at 1080p if your hardware isn't up to scratch; I honestly have no idea how they can think that. I'd rather run 1280x720 with AA on than 1920x1080 with it off, any day of the year

The reason people suggest this is probably down to native resolution; on a screen with a 1080p native res, that will likely look better than 720p with AA because you retain 1:1 pixel mapping without having to shrink the picture size. Personally I would only lower the screen resolution as a last resort, and would drop to a quarter of the resolution (960x540) if possible, so there are 4 screen pixels for every game pixel and the image isn't warped. Obviously for a CRT there are no such problems.
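To put rough numbers on the quarter-resolution point, here's a quick sketch (not from the thread; the 1920x1080 native panel and the resolution list are just assumptions for illustration). A lower resolution only avoids the scaler's interpolation if it divides into the native res by the same whole number on both axes:

# Sketch of the integer-scaling idea above: a lower resolution maps cleanly
# onto the panel only if the native res divides by the same whole number on
# both axes. Resolutions here are illustrative, not from the thread.
NATIVE_W, NATIVE_H = 1920, 1080  # assumed 1080p panel

def scale_factor(w, h):
    """Return the whole-number scale factor if (w, h) maps cleanly, else None."""
    if NATIVE_W % w == 0 and NATIVE_H % h == 0 and NATIVE_W // w == NATIVE_H // h:
        return NATIVE_W // w
    return None

for w, h in [(1920, 1080), (1280, 720), (960, 540)]:
    f = scale_factor(w, h)
    if f:
        print(f"{w}x{h}: clean {f}x scale, {f * f} screen pixel(s) per game pixel")
    else:
        print(f"{w}x{h}: no whole-number scale, the monitor's scaler has to interpolate")

Running that, 960x540 comes out as a clean 2x scale (4 screen pixels per game pixel), while 1280x720 forces the scaler to interpolate, which is the warping mentioned above.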

I've always felt that AA is largely a substitute for an insufficient resolution-to-screen-size ratio (if pixels were small enough we wouldn't be able to see the jaggies). Realistically, though, we aren't going to get consumer-grade screens with a high enough resolution for a given screen size (I'm talking 8000x4500 maybe), so AA is a useful tool. This probably explains why Eames isn't too fussed on a 21.5" screen, as he will have a much higher DPI than, say, a 27" 1080p monitor. Personally, since Nvidia brought out FXAA I've been using that, as it gives a nice compromise between quality and performance hit. After shadows, AA is usually one of the first settings I drop if I'm struggling framerate-wise, unless it's a game where the jaggies are particularly noticeable.
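For anyone who wants the actual numbers behind the DPI point, here's a small sketch (the monitor sizes come from the post above; the calculation is just the standard pixels-per-inch formula, nothing vendor-specific):

import math

# Pixels per inch (what the post above calls DPI): diagonal pixel count
# divided by the diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h, d in [('21.5" 1080p', 1920, 1080, 21.5),
                      ('27" 1080p', 1920, 1080, 27.0)]:
    print(f"{name}: {ppi(w, h, d):.0f} PPI")

That works out to roughly 102 PPI on the 21.5" screen versus about 82 PPI on the 27" one, so the same jaggie covers noticeably more physical space on the bigger panel.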
 
AA's overrated

Don't you game @ 1660x800 or something? Are you blind to the jaggies or what? :p

The reason people suggest this is probably down to native resolution; on a screen with a 1080p native res, that will likely look better than 720p with AA because you retain 1:1 pixel mapping.

I've always felt that AA is largely a substitute for an insufficient resolution-to-screen-size ratio (if pixels were small enough we wouldn't be able to see the jaggies). Realistically, though, we aren't going to get consumer-grade screens with a high enough resolution for a given screen size, so AA is a useful tool. This probably explains why Eames isn't too fussed on a 21.5" screen, as he will have a much higher DPI than, say, a 27" 1080p monitor. Personally, since Nvidia brought out FXAA I've been using that, as it gives a nice compromise between quality and performance hit. After shadows, AA is usually one of the first settings I drop if I'm struggling framerate-wise, unless it's a game where the jaggies are particularly noticeable.

I have a 19 inch 16:10 1440x900 monitor, does that mean that games look better or worse without AA? You've confused me a little, good sir. :p
 
I never noticed any issues on consoles with AA though, but then again I didn't even know what it was! :p

Do they even have it? They all look so ugly round the edges...

And I don't care about the graphics, I actually play spreadsheets! (Aurora :p) and DF too. It's just when a game has nice graphics but is really rough around the edges that it annoys me!

Can't say I agree with that, I won't play any game without some kind of AA.

Same, I can't play Arma II with the AA any lower than "Very High" (which is basically the medium setting)!
 
I always use 2x MSAA over anything if it's there, as I can't tell the difference at all between 2x, 4x, 8x, 16x, ubersampled, supersampled, ML'd, MS'd, FX'd etc. :p


Yeah, 1680x1050; screen tearing and 'jaggies' don't bother me one bit, never have, never will :p

Yet the motion blur in Dead Island bothers you? :p

Still, I'd trade places; I'm so sick of not being able to game properly without AA. Screen tearing isn't an issue for me though.
 
I can stand motion blur, but in Dead Island it's way over the top. Normally I turn it off unless it's done well.
 
Not sure of a game that has done it well tbh, except maybe Skyrim. So I hope Dead Island is cheap in the Steam Autumn sale :rolleyes:.
 
Motion blur in games does my head in; that gets turned off straight away.

Normally I play with 2x or 4x AA, running at 1920x1080 on a 23" monitor.
 
For new games I most often use just 2x of the best AA available; for older or well-optimised games I'll put it up to 4x. 1680x1050 here, upgrading to 1920x1080/1200 later this year.
 
I play Xbox 360 and PC, and I much prefer true 1920x1080 on my 24" PC monitor with no AA compared to the Xbox's upscaled 720p game with 2x AA. It's really noticeable on the same screen. Jaggies don't bother me; I've always preferred 60fps with no AA to 40fps with some AA :D I guess that makes me a framerate whore rather than an AA whore :P
 
I'm happy with 2x - 4x AA. 24" 1080p screen. Running on a 580, though, so unless it's something like Metro 2033 I just put everything as high as it'll go and forget about it.

Motion blur is always one of the things that gets turned off for me, too. Can't stand it!
 
With 1080p I don't feel AA is that important. It is usually the first setting I drop down to improve fps if needed. I also find that with the majority of games 2xAA is all that I need.
 
I don't really think AA does much in most games; it's always the setting I drop down before anything else. Motion blur has to be one of the worst things ever though :p
 