Anti-aliasing appreciation thread (lol)

If you compare GTA IV without AA to GTA IV with forced FXAA or SMAA, the difference is gigantic: at a casual glance the jaggies are almost completely gone, and it's only when you examine each thing in detail that you notice the small amount still there, which you'd never do when playing normally. AA is important, dudes, and it's less of an FPS hit than a lot of things such as HDR lighting or SSAO.
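For anyone curious why post-process AA like FXAA is so cheap compared to MSAA, here's a heavily simplified sketch of the idea (not the real FXAA algorithm, just the gist): find high-contrast edges from pixel brightness and blend the edge pixels with their neighbours, all in one cheap pass over the finished frame.

```python
# Hypothetical simplification of post-process AA (FXAA-style): detect
# high-contrast pixels and blend them with their neighbours. One pass
# over the final image, which is why the frame-time cost is so small.

def smooth_edges(img, threshold=0.25):
    """img: 2D list of luma values in [0, 1]. Returns a blended copy."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            n = [img[y-1][x], img[y+1][x], img[y][x-1], img[y][x+1]]
            contrast = max(n + [img[y][x]]) - min(n + [img[y][x]])
            if contrast > threshold:  # edge pixel: blend with neighbours
                out[y][x] = (img[y][x] + sum(n) / 4) / 2
    return out

# A hard vertical edge (0 = black, 1 = white): the columns either side
# of the edge get pulled toward each other, softening the "jaggy".
frame = [[0.0, 0.0, 1.0, 1.0] for _ in range(4)]
print(smooth_edges(frame)[1])  # [0.0, 0.125, 0.875, 1.0]
```

The trade-off you see in games follows from this: it only ever blurs what's already in the frame, so it can't recover sub-pixel detail the way MSAA can.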

Depends on the game though. Skyrim is perfectly playable with the vanilla FXAA, which doesn't match 2x or 4x MSAA but still looks decent. It isn't so important that you can't play without it maxed out, but in my opinion it's important enough to warrant a minimum of 2x MSAA or FX/SMAA.
 
I like AA on in games, though it doesn't have to be very high. As long as it's good enough, that's fine. It's the one and possibly only setting I'm willing to compromise on in the long term. I've been using FXAA more recently, though in all fairness I haven't noticed a big frame rate hit with other kinds of AA on my 480 either. The only bugbear I have with AA, regardless of GPU vendor, is that sometimes the in-game AA is rubbish and the GPU control panel alone won't fix it, so you need something like a third-party application. I guess we have consoles to thank for that, though I could be wrong. Bottom line... other settings and resolution over AA for me, but it's nice to have everything enabled. :D

I rarely use motion blur since I normally don't like it, and screen tearing rarely bothers me unless it's very persistent or strongly evident.
 

Well, 2x MSAA or FX/SMAA don't really affect framerate in anything I've played, though the highest resolution I've gamed at on my rig is 1440x900.

There was a good article on PC Gamer a while ago on why resolution is much more important, so we don't need forced AA.

That's opinion, regardless of where the article comes from. In my opinion I can't play without it, but then again I haven't gamed at true 1080p yet, only upscaled on console. So I don't know the true effect of no AA at 1080p. Could somebody post a couple of screenies in 1080p?

Can you allow that, mods? If they take the shot and then downscale the picture to 1024 width, you can't really tell.

Then again you could just post the link to the image. :) Thanks.
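If anyone wants to shrink their screenshots to the 1024-wide forum limit before posting, here's a rough pure-Python sketch of the idea (nearest-neighbour resample on a pixel grid; in practice you'd use an image library like Pillow, which is an assumption on my part):

```python
# Downscale a screenshot's pixel grid so the width is at most 1024,
# keeping the aspect ratio (nearest-neighbour sampling for simplicity).

def downscale_width(pixels, target_w=1024):
    """pixels: 2D list (rows of pixel values). A 1920x1080 shot
    becomes 1024x576 since both axes scale by the same factor."""
    src_h, src_w = len(pixels), len(pixels[0])
    if src_w <= target_w:
        return pixels  # already small enough, post as-is
    scale = target_w / src_w
    target_h = round(src_h * scale)
    return [
        [pixels[int(y / scale)][int(x / scale)] for x in range(target_w)]
        for y in range(target_h)
    ]

# 1920x1080 dummy frame -> 1024x576
frame = [[0] * 1920 for _ in range(1080)]
small = downscale_width(frame)
print(len(small[0]), len(small))  # 1024 576
```

Worth noting that downscaling also hides jaggies (it's basically supersampling), which is exactly why you can't judge AA from a shrunken screenshot.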
 
AA isn't too important unless it's a game with lots of foliage and trees, jaggies tend to really stand out on those.

FXAA is virtually free nowadays; there's no excuse for a game not to have it, even on consoles. We're fast approaching the day when virtually every console release has FXAA, and a lot of PC games use it by default. Skyrim and Diablo 3 are two good examples.
 

To be fair, I haven't used any AMD cards since my ATI 4870X2, but in the past I very often found this was one area in particular where Nvidia GPUs provided better performance, especially when high levels of AA were selected.
 
MSAA is rubbish: it uses far too many GPU resources and doesn't even work on alpha surfaces.

Hardly anyone uses it anymore, and there's a reason AMD and Nvidia are focusing so much on other AA methods.
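The alpha-surface point is worth unpacking, since it's why foliage stays jaggy with MSAA. Roughly: MSAA shades once per pixel and copies that result to every covered sub-sample, so a binary alpha test still gives a hard 0-or-1 pixel; alpha-to-coverage (or full supersampling) resolves per sample instead. A toy illustration of that difference, with a made-up 4x sample pattern:

```python
# Rough illustration of why plain MSAA misses alpha-tested edges such as
# foliage. Not real GPU code: just the per-pixel arithmetic.

SAMPLES = 4  # hypothetical 4x MSAA pattern

def msaa_pixel(alpha, covered_samples):
    # One shader invocation per pixel: the alpha test is all-or-nothing,
    # so the texture's cut-out edge gets no smoothing at all.
    visible = alpha >= 0.5
    return covered_samples / SAMPLES if visible else 0.0

def alpha_to_coverage_pixel(alpha, covered_samples):
    # Alpha modulates the coverage mask, so partially transparent texels
    # produce intermediate pixel values along the edge.
    kept = round(alpha * covered_samples)
    return kept / SAMPLES

# A leaf-edge texel with alpha 0.4, geometry covering all 4 samples:
print(msaa_pixel(0.4, 4))               # 0.0 -> hard jaggy edge
print(alpha_to_coverage_pixel(0.4, 4))  # 0.5 -> softened edge
```

That's also why drivers offer transparency/adaptive AA modes as a separate toggle on top of MSAA.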
 

My 6850 can run Skyrim at 8x MSAA, Metro 2033 at 4x MSAA plus AAA, and ArmA II on 'very high' AA with barely any difference from having them turned off. I want to try The Witcher 2 or TERA to see how it copes with 16x MSAA.

Best GPU low-high end: AMD
GPU very high end (£500+ cards): Nvidia
CPU all around: Intel

:p
 

I'm surprised it copes as well as that. *Tips hat*. :D

I certainly agree with Intel being number one for CPUs, but then I don't think we're alone there. :)
 
Well, I'm running the AMD FX-4100 at 3.6 GHz, so as a user I can comfortably say AMD are quite bad with CPUs at the minute. Who else is there to use? :p
 

I don't think AMD CPUs are bad, they just can't compete with Intel in a majority of situations. The Phenom IIs were fairly decent on the whole. :)
 
BF3 @ 1920x1080
4x MSAA + FXAA vs. Jaggies

#1: AA
#1: No AA

#2: AA
#2: No AA

#3: AA
#3: No AA

Good work mate! The difference is pretty undeniable there; no AA completely cheapens the look of the game and downgrades it as a whole.


AMD are good at the lower end of the spectrum, and that's what they have always been good at: bringing you good hardware on a poor man's budget. :p

At the higher end of things, though, you'll need a strong i5 or i7 (preferably a strong i5) for high-end gaming like The Witcher 2 etc.
 

Agreed.
 
There's no denying better graphics helps to immerse you in the game, I feel. Something like this:

http://i.imgur.com/MTBwr.png


Just completely wrecks your ability to suspend your disbelief.
When no effort is made in the graphics department, it seriously knocks the game as a whole.
No prizes for guessing what game it is.
 

It looks pretty bad, no idea what game it is though! :p

Tbh same for me, but I would put Realism on the left too.

Tbh I really don't mind graphics, as long as they aren't lazy-bad.

Well, graphics can do a lot. To me Far Cry 2 and Crysis are two of the best shooters out there, and a lot of that is down to the Crytek engine, although Metro 2033 is the single best shooter and also the most impressive-looking game to date.
 

Nope. The graphics don't help Crysis at all. That game would be a lot better in a well-optimised engine like Source that ran smoothly. All the graphics did was gain Crysis attention, and that attention didn't take into consideration the depth of the game.

Valve did graphics right in HL2. It ran smoothly on machines then, still does today, and looks pretty good.

Other games that do (fancy) graphics right are BF3 and JC2. It's all well and good having fancy graphics, but it's worthless if the game doesn't run well. Those two both look spectacular and run well.

Games like Crysis and Skyrim are just clunky.
 

Half-Life 2 has never been a game to consider when it comes to impressive visuals. :confused:

Skyrim and Crysis run fine for me at 30-40 fps and look better than a lot of games being released even now. Max Payne 3 comes to mind.
 