Anti-aliasing appreciation thread (lol)

Why not? :confused: All of those games are logical developments of their predecessors and not linked to Crysis. If you completely ignore Crysis's existence, the rest of the industry still progressed at a steady pace up to this point. I'd argue there was no sudden spurt in line with Crysis's release; if anything, the new consoles that were released beforehand would have had far more effect (good or bad).

Why do you assume that without Crysis they would have taken another path? Crysis didn't introduce any new unique revolutionary techniques or technology, did it? It just took what had been done before, ramped it up, and coupled it with ridiculous (for the time) system requirements.

Source has had at least some new things for sure. Wasn't the Lost Coast a tech demo for some new HDR technique?

CryEngine > Unreal 3, and I can't think of any other engines off the top of my head that are used in games not created by the creators of the engines. There's more reason to say Crysis had more of an effect on modern graphics than anything else; Far Cry 2 & 3 are on the CryEngine.

Hypocrite. Half your argument is speculation. :p

;):D

Crysis didn't do ANYTHING new though. Crysis just had more polygons. More shaders. Higher resolution textures. More shaders. Also more shaders. It was a natural step up from existing technology, but it was a step too far. It doesn't have any of the newer technology, and that's why it runs so badly. It did the same old things everything else did.

Other games would look just as good since they are making use of new technology, new hardware, new programming techniques.

Crysis didn't pioneer anything at all, it just did the same as everything else but had more of it.

Honestly, something like Rage with megatextures is more of a pioneer than Crysis is.

Exactly, Crysis showed the true potential of the era of gaming it was released in; it showed that 5 years ago, extreme graphics were possible. Considering it came out around the same time as GTA IV, yet I can run Crysis more smoothly, I wouldn't really say it's some horribly unoptimized P.O.S. Especially when you see the draw distance I use.

Am I right in saying you game at 1440x900, Omaeka? That'll explain why you're getting good performance from your 6850; not that it's a bad card, you just won't necessarily get that kind of performance in games when you increase the resolution and try to keep the same high levels of AA. :)

When you get a higher res monitor, I wonder if you'll still prefer AA over resolution? Let us know your thoughts when you do. :)

That's the thing I've been saying: I'm not sure what AA + 1080p are like together, but from what was posted earlier of Battlefield 3, AA is extremely important in graphically cutting-edge games. Maybe it's not so important when everything else a game does is substandard, but for a game on the level of BF3, AA is a must if you want good visuals.

I won't be running a 6850 when the time comes to upgrade my monitor.

Intel H77 + Intel i5-3450 > HD7850 > 1080p monitor. :p
 
That could be done at any era though. We could do that right now, it would just be a waste of time. It's just more of the same stuff.

At no point did devs think that they needed someone to show them that graphics could become any better.
 
CryEngine > Unreal 3, and I can't think of any other engines off the top of my head that are used in games not created by the creators of the engines...
There's loads. IdTech has had a far bigger influence than most of the rest put together over the years, especially if you go back all the way to its roots.

And while CE may certainly be prettier than UE, being prettier certainly doesn't make its games more fun. That's probably why UE has been a huge commercial success, with hundreds of games using it compared to a fistful.
 
Crysis was the first game with Screen Space Ambient Occlusion, so graphically it was defo a pioneering game. I still think Source/HL2 has aged incredibly well and even though it looks a little limited, the engine offers a lot of crispness and clarity that some modern games don't match.
 
Crysis was the first game with Screen Space Ambient Occlusion...

+1, saves me researching lmao :)
 
Unreal 3, ID Tech, Frostbite 2, Source... they all cut corners, whether it's texture resolution (Doom 3), lighting (Source, UE), shadows (Source), shaders (Frostbite, Unreal, ID), or map size (all of the above). You have to admire Crytek's unwavering resolve to present the most accurate simulation of reality as far as visuals and physics are concerned.

The only "cheap trick" (I say that very lightly, considering the huge performance hit) used in Crysis is POM; it's not real geometry, it's an illusion. Everything else is as close to a simulation, suited to a video game, as they could get in 2007; as demonstrated in the plethora of "mass physics" youtube videos.

Really, the game runs very well, all things considered. It wasn't a game designed to be played smoothly on maximum settings in 2007 and Cevat Yerli even said this in the many interviews leading up to the game's release. The idea was to make a game which showed what was possible on current hardware, should a developer put some work into it. Whether or not this spurred other companies into concentrating their efforts on graphical enhancements is unclear - but Crytek is now synonymous with gorgeous visuals.

Crysis was the first game with Screen Space Ambient Occlusion...

FarCry (PC) was also the first game to use HDR lighting and normal maps, the former of which Valve attempted to imitate, poorly, in HL2: Lost Coast, using a software-based solution rather than a hardware pixel shader. As I recall, this was due to being partnered with ATI at the time, and their cards being incapable of rendering hardware-based HDR correctly (Radeon 9000 series).

And while CE may certainly be prettier than UE...

UE3 is only successful because of Gears of War; the engine was practically made for the 360. It runs well, it's resource-friendly and it supports enough post-processing techniques to cover the fact that both consoles can barely run most games at 1280x720.
 
+ Unreal 3 is fading a little now, Unreal 4 isn't going to be around for a while yet.

Go look at the amount of games in the next few years using the CryEngine 3, in a few years it will be the premier gaming engine for small and big companies alike. It already has several modified versions used in other games, such as FC2 + 3.
 
Actually, FC2 uses an uber-modified version of CryEngine 1.

Only 2 or 3 percent of the original CryEngine code is re-used, according to Michiel Verheijdt, Senior Product Manager for Ubisoft Netherlands.
Source

And FC3 is using a modified version of that engine.

I do agree though, we'll be seeing a lot more of CE3, for the same reason UE3 took off: it runs on consoles.
 
The CryEngine is by far my favorite, not just because of the visuals but because of the handling; it's very, very distinct. You can tell the CryEngine a mile off (which is why I thought FC2 + 3 were lightly modified, because they feel and look a hell of a lot like it).

Have you seen Oblivion modified to run on the CryEngine? God if Bethesda actually use it for their next TES game, I'll just die of happiness. :p

[screenshots: oblivion_to_crysis4.jpg, oblivion_to_crysis5.jpg, 2dgnw5x.jpg]
 
Cryengine 2 still isn't a smooth running engine. Cryengine 3 though is great, runs really well.

Wasn't aware about that with regards to Crysis 1, quite interesting really. I didn't realize they had actually come up with SSAO. I suppose it's somewhat revolutionary in that regard, although perhaps not vital to current technology.

I do hope we see more of CE3 since it is a really good engine that looks nice without pushing the boundaries too far. Runs smoothly too, which I personally find to be the most important thing.

Interesting to see that little mod of Oblivion in CE2. Bethesda wouldn't use it for a TES game because, well, it's been replaced by the better CE3 at this point. Also I'm pretty certain that Bethesda's in-house engine is a little better than CE2 graphically. It looks pretty nice, but has the general PC port optimization issues.

My favorite engines at the moment though are definitely the unreal engine, source engine and CE3. They look great whilst running well and providing a generally smooth experience.
 
I have a 19 inch 16:10 1440x900 monitor; does that mean that games look better or worse without AA? You've confused me a little, good sir. :p

Games will look worse without AA on (nearly all) setups, but the higher your DPI (the ratio of screen resolution to physical screen size) the less necessary it becomes. Or in layman's terms: high res on a small screen can get away without AA; low res on a big screen can't. 1440x900 is about average for a 19" screen, so I'd expect some benefit.
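For anyone curious, here's a rough back-of-the-envelope in Python for the pixel density being described (a sketch only; assumes a flat panel, square pixels, and the diagonal sizes given in the thread):

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density = diagonal resolution in pixels / diagonal size in inches."""
    diagonal_px = math.hypot(width_px, height_px)  # sqrt(w^2 + h^2)
    return diagonal_px / diagonal_in

# The 19" 1440x900 setup discussed above works out to roughly 89 PPI:
print(round(pixels_per_inch(1440, 900, 19)))   # ~89
# versus a typical 24" 1080p screen at roughly 92 PPI:
print(round(pixels_per_inch(1920, 1080, 24)))  # ~92
```

So a small high-res screen genuinely does pack more pixels per inch, which is why jaggies are less visible on it without AA.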
 
...My favorite engines at the moment though are definitely the unreal engine, source engine and CE3. They look great whilst running well and providing a generally smooth experience.

Same, though I also love the REDengine from The Witcher (so beautiful), the 4A Engine from Metro 2033, the Creation Engine from Skyrim, even if it has quite bad optimization issues (especially on AMD), and the Anvil engine from Assassin's Creed (Rainbow Six: Patriots is actually using it!).

CryEngine 3 though is the future of gaming, along with Unreal 4. Timesplitters 4 is on CryEngine 3 and is scheduled to be one of the leading games of the next generation. Bloody hell, the next few years are going to be GOOD.

Games will look worse without AA on (nearly all) setups, but the higher your DPI the less necessary it becomes...

My monitor isn't bad at all tbf, got it for £10 from my uncle, thanks for explaining though, resolution confuses me a little! :p

What is the technical p number of 1440x900?
 
Wow, from AA to how well coded a game is.

Currently gaming at 1440p which reduces the need for AA but do tend to have some at 2x just to make it look pretty.
 
Funnily enough, I never noticed the lack of AA on console games; then again, all I really played was Skyrim, Rainbow Six & Street Fighter IV / x Tekken. I think they had some AA on consoles?
 