
Does Anti-Aliasing make a difference?

Basically the title: I was wondering if AA makes any difference at all at the max resolution of an HD monitor. For example, with a nice 1920x1080 monitor, playing, say, BC2 at max settings and max resolution, how can AA make anything smoother without splitting pixels in half? Or am I being an idiot? I don't know much about the advanced graphics settings, having lived with a crap laptop for years.

cheers,
Jake

Oops: I only know that anti-aliasing can smooth things out at lower resolutions.
 
Anti-aliasing will smooth things out at any resolution. It doesn't split any pixels in half.

Think of it this way. You have your monitor with 1920x1080 (can't multiply numbers that big in my head) pixels in total. Now you want to apply Anti-Aliasing to the image that appears on your monitor.

So, take one of those pixels on your monitor (just one's all we need). Then ask the question, "what would happen if I rendered the area covered by this pixel at (say) 4x the resolution?"

So, within the area of your pixel, we would have 4 separate colours being displayed. That's a problem! A pixel can only display one colour at a time. We could just 'pick' one of these colours and be done with it, but that wouldn't look very realistic on the monitor and you may end up back where you started, with edges looking jagged.

Instead of that, we take what is quite literally an 'average' of the colours. Blending or 'blurring' them together produces a new colour, which we will tell the monitor to display on the pixel.


So it's not so much splitting pixels in half as taking an average of the colours within the pixel's area.
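If it helps to see the averaging step written out, here's a rough Python sketch of the idea (just an illustration, assuming the '4x' render is already sitting in a NumPy array with two samples in each direction):

Code:
import numpy as np

def downsample_4x(hi_res):
    """Average each 2x2 block of the oversampled image into one output pixel.

    hi_res is assumed to be a (2*H, 2*W, 3) RGB array rendered at twice the
    target width and height, i.e. 4 samples for every final pixel.
    """
    h, w, c = hi_res.shape
    # Group the 4 samples belonging to each final pixel, then average them.
    blocks = hi_res.reshape(h // 2, 2, w // 2, 2, c)
    return blocks.mean(axis=(1, 3))

# Toy example: a hard black/white diagonal edge rendered at 4x4 and
# downsampled to 2x2. The pixels the edge passes through come out grey.
hi = np.zeros((4, 4, 3))
hi[np.triu_indices(4)] = 1.0   # upper-right triangle white
print(downsample_4x(hi)[..., 0])

The values that come out for the edge pixels sit between 0 and 1, which is exactly the 'blended' colour described above.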



EDIT: Maybe that wikipedia link is more useful than this explanation :D
 
Well, I couldn't be bothered to read all the way through Wikipedia to get the answer (lazy, I know :rolleyes:) and your explanation was really good! It definitely helped me understand it, thanks :D
 
It does depend on pixel density, the specific game, and personal preference. Plus, if you're in a fast-paced game, everything on the screen might be constantly moving so the jaggies might be harder to spot.

On my 24", 1920x1200 screen, I'm hard-pressed to notice any difference above 2x AA. That said, I do tend to run at 4x AA, just because I can - even though I don't actually need it.
 
Anything with less than 4x AA looks crap. I don't understand how this question could come up.

That's a bit opinionated, don't you think? I asked this question because I didn't know what happens with anti-aliasing at max resolution, not because I wanted opinions on what looks crap and what doesn't.
 
I think it depends on the game. I've run games at 1280x1024, 1680x1050, and 1920x1200, and every time I've found that some games look quite jagged without it, while in others I don't see a difference at all. Mostly I don't notice it, probably because I'm playing the game rather than sitting and staring at a single image of it.

Crysis, for instance, is one where I really don't think it's necessary at all... either that or the drivers are doing it to a certain degree and I've never realised, lol.
 
AA will make a difference at any resolution, that difference becomes smaller and smaller as resolution increases (provided pixel size remains the same) :)

http://en.wikipedia.org/wiki/Anti-aliasing

Try it out in game and see if the difference you notice is worth the performance hit.

If the pixel size stays the same, then you're just making the screen bigger and bigger.

What you want is smaller pixels: the smaller the pixel, the less anti-aliasing you need.
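Just to put rough numbers on the 'smaller pixels' point, here's a quick back-of-the-envelope calculation in Python (the screen sizes are example figures picked purely for illustration, not anyone's actual setup):

Code:
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal screen size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# The higher the PPI, the smaller each pixel and the less visible the jaggies.
print(f'1920x1080 @ 22": {ppi(1920, 1080, 22):.0f} PPI')   # ~100 PPI
print(f'1920x1200 @ 24": {ppi(1920, 1200, 24):.0f} PPI')   # ~94 PPI
print(f'1920x1080 @ 27": {ppi(1920, 1080, 27):.0f} PPI')   # ~82 PPI (bigger pixels)

Same resolution stretched over a bigger panel means bigger pixels, which is why the physical size of the screen matters as much as the number on the box.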
 
Aliasing is an artefact which stems from the way computer graphics are drawn on the screen. To remove it, you either have to display your graphics differently (vector displays, for example) or find a way to minimise the effect of aliasing via software tricks. AA makes good sense and it does have an effect, especially if we are talking about applying it to transparent textures or doing it via edge detection. But since AA is an optical illusion (most graphics techniques are, to an extent), how much of an effect you spot is purely dependent on your eye and, to a smaller degree, your hardware.
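To make the edge-detection idea a bit more concrete, here's a toy Python sketch (nowhere near a real MLAA/FXAA pass, just the two basic steps of detect-then-blend, and it assumes a grayscale image already loaded into a NumPy array):

Code:
import numpy as np

def toy_edge_smooth(img, threshold=0.25):
    """Toy post-process 'AA': find strong luminance edges and blur only there.

    img is assumed to be a 2D grayscale array with values in [0, 1]. Real
    edge-detection AA is far more sophisticated; this only shows the shape
    of the idea: detect edges, then blend across them.
    """
    # 1. Edge detection: luminance difference with the left/top neighbours.
    dx = np.abs(np.diff(img, axis=1, prepend=img[:, :1]))
    dy = np.abs(np.diff(img, axis=0, prepend=img[:1, :]))
    edges = (dx + dy) > threshold

    # 2. Blending: a simple 3x3 box blur, applied only on detected edge pixels.
    padded = np.pad(img, 1, mode='edge')
    blurred = sum(padded[i:i + img.shape[0], j:j + img.shape[1]]
                  for i in range(3) for j in range(3)) / 9.0
    return np.where(edges, blurred, img)

# A hard vertical black/white edge: the boundary column gets softened,
# flat areas are left completely untouched.
img = np.zeros((6, 6))
img[:, 3:] = 1.0
print(np.round(toy_edge_smooth(img), 2))

Because it only ever touches pixels next to a detected edge, it softens the jaggies without blurring the whole image, which is what makes this family of techniques cheap compared to rendering everything at a higher resolution.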
 
If your kit is struggling, turn off AA. The game will be just as enjoyable if the gameplay is good. I get fed up with reviews that test midrange kit with Crysis at 6x AA and then complain it's unplayable.
 
I always turn AA off if I'm gaming at 1680. My rig just isn't powerful enough for it. I've been gaming for decades now, and a few jaggies aren't going to bother me :p

AA will of course make things look smoother - but whether it's 'essential' at any given resolution is a matter of personal opinion. Me, I'd rather have a cheaper rig and no AA, and worry about frame rates instead, which do matter to me.
 
Unless you're being really pedantic, or playing a very select few games, anything over 4x AA makes a barely noticeable difference in quality.

When Anandtech zooms in on a 1cm square area, blows it up to a 5-inch box and shows the difference, yeah, it's noticeable, just about. In real gaming, in all but a handful of games that have lots and lots of diagonal lines all over the place, 2x AA makes the biggest difference. Going from 2x to 4x AA is a far, FAR smaller quality jump, and 4x to 6x or 8x AA will barely show a difference when you're at 1920x1200 and above.
 