
AA & AF at high resolutions

Not sure where this should be posted, so apologies if it's in the wrong area.

I've read a few comments on whether or not AA & AF are needed when gaming at 1920x1200 or higher. What's the general feeling about this?

My reason for asking is that my 24" Dell runs at 1920x1200, and I'm not sure if I need to get a card that can run it with max AA & AF, or whether I wouldn't notice if it doesn't use them.
 
I say yes, it's worth it, but it is subjective. I have the same monitor as you, and I have very sharp eyesight that tends to pick up on aliased edges. For that reason I find AA does add a touch more realism to the games I play. YMMV, as they say on the internet :)
 
Before I got my 8800GTX (600/1900) I thought I'd be able to play any game (save Crysis) at the highest settings (16xQ AA / 16x AF), but that isn't true. When people say "highest settings" they usually mean in-game settings, not the ones in nVidia's 3D control panel.
My res is 1440x900, and in COD4 the framerate drops substantially with 16xAA/16xAF enabled (everything maxed out in the game settings) on some maps with loads of trees or heavy fighting. Dropping down to 4xAA/16xAF improves FPS hugely.
I would say that at anything above 1920x1200, AA/AF is probably needed much less than at low res.
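
As a very rough back-of-envelope for why that drop helps so much, here's a sketch of how the raw sample count scales with resolution and AA level. This is my own illustration, not anything from the thread, and it's an assumption that cost scales linearly like this: real GPUs compress samples, and 16xQ on the 8800 series is a coverage-sampled mode that's cheaper than true 16x MSAA, but the scaling intuition holds.

    # Hypothetical back-of-envelope: treat MSAA cost as pixels x samples per pixel.
    def samples(width, height, aa):
        return width * height * aa

    base = samples(1440, 900, 1)
    for aa in (1, 2, 4, 8, 16):
        s = samples(1440, 900, aa)
        print(f"1440x900 @ {aa}xAA: {s / 1e6:5.1f}M samples ({s // base}x the no-AA load)")

    # 1440x900 @ 16xAA works out to ~20.7M samples, more than a 30in
    # 2560x1600 screen renders with no AA at all (~4.1M pixels).

By that crude measure, dropping from 16x to 4x cuts the raw sample load by a factor of four, which lines up with the big FPS jump described above.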
 
AA is helpful at all resolutions - just because you have a high res doesn't mean you can't see those pixels and jaggies.

But a lot of newer games don't look too bad without any AA, as they have many layers of shader effects that blur out the jaggies and make hard lines less noticeable (Crysis, for example).
 
No, but it is needed less. When I went from 1680x1050 on a 22" screen to 1920x1200 on a 24" screen, I basically dropped most games from 4xAA to 2xAA and they actually looked better. It was an almost penalty-free jump.
 
If your graphics card can handle it, why not? I'm at 1680x1050 and I have to use at least 2xAA-4xAA. I never mess about with settings in the Nvidia control panel; the in-game settings work fine for me.

At higher resolutions it's up to you really: if you can still see those jaggies and they're peeing you off, then enable some AA.
 
1680x1050 on a 22" has pretty big pixels, though, so it would probably need more AA; ideally, 1680x1050 is much sharper on a 20" screen.

Bottom line: it depends on the DPI of your screen, not so much its resolution.
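
To put rough numbers on that (my own back-of-envelope, not from the thread): pixel density is just the diagonal pixel count divided by the diagonal size in inches, so for the screens mentioned in this thread:

    # Hypothetical illustration: PPI = sqrt(w^2 + h^2) / diagonal inches.
    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch for a screen of the given resolution and size."""
        return math.hypot(width_px, height_px) / diagonal_in

    for name, w, h, d in [
        ("20in 1680x1050", 1680, 1050, 20),
        ("22in 1680x1050", 1680, 1050, 22),
        ("24in 1920x1200", 1920, 1200, 24),
        ("30in 2560x1600", 2560, 1600, 30),
    ]:
        print(f"{name}: {ppi(w, h, d):.0f} PPI")

    # Roughly 99, 90, 94 and 101 PPI: the 22in panel has the coarsest
    # pixels of the lot, which is why it tends to need the most AA.

Notice the 24" 1920x1200 panel is actually denser than the 22" 1680x1050 one, which fits the earlier post about the jump to 24" being almost penalty-free.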
 
What it depends on makes little difference in practice. The point isn't the DPI; it's that you need less AA as you move up through the resolutions, from a 22" right up to a 30" 2560x1600 or whatever it is. Yes, you're right about 20" screens, but then you'd also need less AA on one of those to achieve the same 'quality' as you would on a 22".
 