AA/AF Difference

Anti-aliasing (AA) gets rid of jaggies.
Anisotropic filtering (AF) keeps textures sharp the further they go into the distance.

You may as well always use 16x AF as there is virtually no performance hit. For AA I'd use 2x minimum at 1920x1200; the higher the res, the less likely you are to notice the jaggies.

There's a whole load of different AA techniques but I won't go into that here.
 
http://en.wikipedia.org/wiki/Anisotropic_filtering

Look at the picture on the right on that link. That is an extreme difference between no AF and 16x AF. As above, there is virtually no performance hit from using it on modern hardware.

AA makes edges smoother. You also have super-sampling, which makes transparent textures such as grass less jaggy.

There are so many different versions of AA, but the most common is MSAA. Both AMD and Nvidia have their own AA variants, such as CSAA on Nvidia. The best thing to do is look at the types your graphics card supports and choose the one that does the job with the best performance.
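
If it helps to see what the super-sampling mentioned above actually does, here is a tiny toy sketch in Python (nothing vendor-specific, and the "image" is just made-up coverage values): render at double resolution, then average each 2x2 block down to one output pixel, so a hard edge becomes intermediate shades rather than a staircase.

def downsample_2x(image):
    """Average each 2x2 block of the high-res image into one output pixel."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (image[y][x] + image[y][x + 1] +
                     image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4.0)          # 0.0 = background, 1.0 = object
        out.append(row)
    return out

# A hard diagonal edge "rendered" at twice the target resolution:
hi_res = [
    [1, 1, 1, 1, 1, 1, 1, 0],
    [1, 1, 1, 1, 1, 0, 0, 0],
    [1, 1, 1, 1, 0, 0, 0, 0],
    [1, 1, 0, 0, 0, 0, 0, 0],
]
for row in downsample_2x(hi_res):
    print(row)   # in-between values like 0.75 / 0.5 / 0.25 are the softened edge

Real SSAA/MSAA implementations are far more involved (sample patterns, hardware resolve and so on), but this averaging step is the basic reason edges stop looking jagged.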
 
As above, it's probably best to look into AA yourself as there are quite a few different techniques. But AF... well, you'll pretty much want that as high as it'll go (usually 16x) so that textures don't get blurry as they go further into the distance. Look for some AF images on Google if you want some examples (the one on Wikipedia is quite good as well).
 
It's hard to notice a difference between 2x and 4x AA at 2560x1440, so I either use 2x or 8x, depending on how heavy the game is on my GPU.

When I used to play at 1680x1050, it was any of those three plus Edge Detect.

Forcing AF is often recommended, but bear in mind that the difference between 16x and 8x is negligible in most scenarios.
 
Very little difference between 8x and 16x AF, so if it helps with fps use 8x, as you're not losing any noticeable quality imo anyhow.

As said, AF doesn't do much to fps, but if it can squeeze an extra 2fps out of the game then drop it from 16x to 8x. User preference really.
 
So 16x AF isn't as hard-hitting as 4x or 8x AA?

Why is it that AA is a greater hit on graphics performance than AF?

AF is just filtering the textures so they stay sharp a bit further into the distance; with AA you're effectively rendering or sampling the image several times over to remove the jagged edges, or something like that. I've never really bothered to learn the intricacies of each technique, but surely it's obvious?
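
That is roughly the picture for super-sampling, and a quick back-of-the-envelope sketch shows why the costs are so different. Every figure below is an illustrative assumption (the resolution, the 30% of the screen taken as oblique surfaces), not a measurement:

# Why AA tends to cost more than AF (illustrative figures only).
width, height = 1920, 1200
pixels = width * height                      # pixels shaded per frame

# 4x super-sampling really does shade ~4x as many pixels per frame:
print("Pixels shaded with no AA:  ", pixels)
print("Pixels shaded with 4x SSAA:", pixels * 4)

# 4x MSAA is cheaper: the pixel shader still runs once per pixel, but the
# framebuffer stores 4 colour/depth samples, so the cost is mostly memory
# bandwidth rather than extra shading work.

# 16x AF only adds extra texture fetches, and only where a surface is seen
# at a steep angle (floors, roads receding into the distance). Assuming,
# say, 30% of the screen needs the full 16 taps:
oblique_fraction = 0.30                      # assumed, varies per scene
extra_fetches = int(pixels * oblique_fraction) * 16
print("Extra texture taps for 16x AF (rough upper bound):", extra_fetches)

Texture fetches are comparatively cheap and cache-friendly, which is why forcing 16x AF barely moves the frame rate while high AA settings can.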
 

Generally I don't use AA/AF in games even though I game at 1024x768 :p.

However I will hopefully move to 1080p soon, and then I can turn AF up to 8x or 16x, whereas with AA I'll be more careful about how high I turn it, depending on how my 5850 performs ;)
 

The 5850 will do 8x/16x AA mate, I've got one, it's an awesome card.
 
I usually go for 16x AF and 2x/4x AA, depending on how bad the jaggies are in the game. 8x AA usually equates to too much of a performance hit for what it adds over 4x AA.
 
These little pictures should give you an idea of what the two functions do:

AA: [image no longer available]

AF: [image no longer available]

As others have said, these days enabling AF gives very little performance hit, so there is little harm in putting it at 16x in most cases. AA can still cause a fair drop in performance (and increase in VRAM usage) so adjust it game-by-game depending on how well it runs on your system.
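
To put a rough number on the VRAM point, here is a small sketch estimating the size of a multisampled framebuffer. The byte sizes (RGBA8 colour plus a 32-bit depth sample) and the extra resolve target are common assumptions, not figures from any particular game or driver:

def msaa_framebuffer_mb(width, height, samples,
                        bytes_per_colour=4, bytes_per_depth=4):
    """Estimate VRAM for an MSAA colour+depth buffer plus the resolved image."""
    per_sample = bytes_per_colour + bytes_per_depth
    multisampled = width * height * samples * per_sample
    resolved = width * height * bytes_per_colour   # final single-sample image
    return (multisampled + resolved) / (1024 ** 2)

for samples in (1, 2, 4, 8):
    print(f"{samples}x MSAA at 1920x1080: ~{msaa_framebuffer_mb(1920, 1080, samples):.0f} MB")

Going from 2x to 8x roughly quadruples the multisampled part, which is where the extra VRAM use comes from.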
 
I find games look totally crap with no AF or low AF settings...

Just like in the no AF picture Duff-man posted.

I don't mind no AA or low AA settings, but gaming with no AF or low AF settings is not an option for me.
 

Agree, same here. I can "survive" with no or little AA, but AF is essential. I have it forced in the Nvidia control panel so I get its effect in all games, which isn't exactly ideal, but it works nonetheless.
 
Just noticed someone in this thread saying that the higher the resolution used, the less noticeable jaggies are. That's just not true; it's to do with dot pitch/DPI. A 19 inch screen at 1440x900 will look a lot less jagged than a 50 inch plasma TV at 1080p...
 
In general that's sound advice, but remember a plasma display tends to have a bit of bleed on the pixels compared to an LCD, making aliasing less noticeable.
 
Why would there be fewer jaggies on a monitor that you typically sit a foot away from, compared to a 50" screen that the average person sits 8-10ft away from?

That's the wrong way round. If anything there would be fewer jaggies on the 50" panel, a) because it's a higher resolution anyway and b) because the average person sits further away than what is considered optimum for that panel size and resolution (about 5-7ft). This means the monitor would naturally be the 'sharper' of the two displays, and consequently, because of this and the lower resolution, jaggies would appear worse.

All I can tell you is that this is exactly the case when comparing my 50" TV to my 24" monitor. The TV, despite being a lower resolution (1080p versus 1920x1200 on the monitor), looks smoother than my monitor at any given setting. Ask people who play their consoles on monitors; they'll tell you the same thing. How far you sit from the display plays a much bigger role than the DPI of the panel.
 
If you want to be pedantic then it should be pixels per perceived inch. Or, for practical purposes, measure the display sizes in hand widths when you hold your arm out fully in front of your eyes. My 22 inch display about 2 feet away is about 3 hand widths, whereas the 32 inch TV downstairs, which is 8-10 feet away, would barely be over one hand width, if that.

I believe Zefan was talking about being equal distances from the displays, in which case he's correct: the larger the pixels are in real-world size, the larger the jaggies.
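
"Pixels per perceived inch" can be pinned down as pixels per degree of your field of view, and a quick sketch makes the distance argument concrete. The screen sizes and viewing distances below are just the rough figures mentioned in the thread, not anyone's measured setup:

import math

def pixels_per_degree(diagonal_in, horiz_px, vert_px, distance_in):
    """How many horizontal pixels fit into one degree of view at this distance."""
    aspect = horiz_px / vert_px
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    pixels_per_inch = horiz_px / width_in
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return pixels_per_inch * inches_per_degree

print("24in 1920x1200 monitor at 2ft:", round(pixels_per_degree(24, 1920, 1200, 24), 1))
print("50in 1920x1080 TV at 9ft:     ", round(pixels_per_degree(50, 1920, 1080, 9 * 12), 1))

On those assumed distances the TV ends up with roughly twice the pixels per degree despite its lower DPI, which fits the observation that the 50in set looks smoother from the sofa; put both displays at the same distance and the DPI argument wins instead.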
 

I never said that viewing distance wasn't a factor. What I said was that a higher DPI is what makes jaggies less noticeable, not merely a higher resolution.

The reason I said "50 inch plasma TV" was merely to give an example of a bigger yet higher-res display. The example would have worked just as well with a 19 inch monitor at 1440x900 and a 27 inch at 1920x1080, and that's the point I was trying to make.
 