Are the graphical elements of gaming starting to peak?

No, graphics still have a long way to go. Just look at the quality difference between pre-rendered cutscenes and in-game graphics.

A big gap, but pre-rendered stuff such as cutscenes will always be a step ahead. It's not just the gap in visual quality but the animation too (which is even more impressive, imo).
 
I hope graphical development is stunted; maybe we can finally get back to games giving us good gameplay, not stupid particle effects to cover up a lack of story, depth and interaction.
 
How can anyone who has played Far Cry 2 not be impressed by its stunning, seamless graphics? The awesome terrain, draw distance, textures and shadowing are about as realistic as anything we have seen before. But is the technology starting to peak?

When you look back at older games, such as the original Far Cry, the graphics now seem somewhat dated, which is to be expected, but it really wasn't that long ago. In those days a game's hardware requirements pretty much dominated your upgrade path, and HL2 and Far Cry, to name a couple, tortured pretty much everyone's hardware at the time. However, this doesn't seem to be the case any more.

For example, my two-year-old graphics card can easily play all* the latest titles at their highest settings, including Far Cry 2. Therefore, any upgrade to a newer card would be a complete waste of time.

Surely there has to be a point whereby games themselves have little or no room left for graphical improvement, and I think we are starting to see this.

Discuss.

*excludes Crysis


Nah, games are still far from photorealistic. They have a long way to go, especially with AI and other lifelike features.
 
I hope graphical development is stunted; maybe we can finally get back to games giving us good gameplay, not stupid particle effects to cover up a lack of story, depth and interaction.
I used to say this, but a good creative team always takes care of the gameplay. Bad developers either make good-looking pants games or pants-looking pants games.

I don't think the issue is that devs spend too long on the visuals; they just don't know how to make good games.
 
Photorealism is the next step, but things will keep progressing, and virtual entertainment will reach places that we would consider sci-fi nowadays.

I just heard a podcast (a serious science one) saying Japanese scientists have been able to work out from a brain scan (of the visual cortex) what a person was looking at. It only works for clear black-on-white shapes such as big letters, but using this they were able to tell what word the subject was looking at on paper.

How long until you are able to reproduce your dreams on a screen, and until they can do the opposite and project images straight into your brain? Will people be able to sit at home watching a movie that's being projected straight into their mind? Will those then become interactive if the images are controlled by a computer and you control the input devices? Will they be able to stimulate not just the visual cortex but other senses such as hearing and touch?

We will either be very old or not around when it happens, but it probably will.
 
Games are nowhere near peaking at all. They're way, way, WAYYYYY off being anything near realistic. Far Cry 2 looks good but really isn't anything special in terms of being a game that has peaked graphically.

I'm also interested in your 8800GTX handling everything at max settings, considering my 4870 lags when everything is turned up full at 1680x1050.
 
As Kreeeee says, we're still some way off photorealistic.

But is photorealism truly a game, or does it then become an interactive movie/experience?

There surely comes a point where the medium becomes more of an interactive video than a game. What's real and what isn't is separated only by the imagination; take that away and how do we distinguish between the two?

I'm also interested in your 8800GTX handling everything at max settings, considering my 4870 lags when everything is turned up full at 1680x1050.
No idea. I have overclocked both the 8800GTX and the Q6600, and it might be down to the games we are playing too. Currently playing COD5 and Far Cry 2 at 1920x1200 without any problems.
 
Far Cry 2 doesn't look that good; for starters, everything is brown.

Crysis isn't too far off photorealism when it comes to faces; it's damn impressive.
http://www.pcgames.de/screenshots/original/2007/11/Crysis_UHQ_04.jpg

A few things in Crysis are almost photorealistic, faces (as you said) being one of them. Leaves with custom SSAO and HDR are damn close to photorealistic, though trees have a long, long way to go since they look too jaggy. Water is another thing that looks and acts realistic in Crysis, the best-looking water in any game, period, and smoke looks realistic too. :) By tweaking your settings you can get more of it looking closer to real; the ground, for example, looks very realistic with SSAO and POM.
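For anyone wondering what the SSAO everyone tweaks actually is: it darkens each pixel according to how much nearby geometry sits in front of it, judged purely from the depth buffer. Here's a toy sketch of the idea in Python; it has nothing to do with Crysis's actual shader, and the function name and parameters are made up for illustration:

```python
import numpy as np

def toy_ssao(depth, radius=3, samples=16, bias=0.02, seed=0):
    """Toy screen-space ambient occlusion over a 2D depth buffer.
    Each neighbour that is closer to the camera than the current
    pixel counts as an occluder, and the pixel is darkened in
    proportion to how many occluders were found."""
    rng = np.random.default_rng(seed)
    occluders = np.zeros_like(depth, dtype=float)
    offsets = rng.integers(-radius, radius + 1, size=(samples, 2))
    for dy, dx in offsets:
        # np.roll wraps at screen edges, which is fine for a toy
        neighbour = np.roll(np.roll(depth, dy, axis=0), dx, axis=1)
        occluders += (neighbour < depth - bias)
    return 1.0 - occluders / samples  # 1.0 = fully lit, 0.0 = fully occluded
```

A real shader does this per fragment in view space with a hemisphere of samples and a blur pass afterwards, but the occlusion-counting core is the same.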
 
Animation is also an area which needs vast improvement to achieve a realistic look. At the minute you have canned mo-cap animations and hand-animated ones. Some games mix the two, and it's painfully obvious when they transition from one method to the next.
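The usual band-aid for that pop is a short cross-fade, where the outgoing clip's pose is interpolated into the incoming one over a fraction of a second. A minimal sketch of the idea, with a deliberately toy pose representation (a real engine would slerp per-joint quaternions; none of these names come from any particular engine):

```python
def blend_pose(pose_a, pose_b, t):
    """Linearly interpolate two poses; t=0 gives pose_a, t=1 gives pose_b.
    A pose here is a dict of joint name -> angle in degrees."""
    return {joint: (1.0 - t) * pose_a[joint] + t * pose_b[joint]
            for joint in pose_a}

def crossfade(clip_a, clip_b, fade_frames):
    """Transition from the end of clip_a into the start of clip_b.
    Each clip is a list of poses, one per frame."""
    frames = list(clip_a[:-fade_frames])
    for i in range(fade_frames):
        t = (i + 1) / fade_frames  # ramps from just above 0 up to 1
        frames.append(blend_pose(clip_a[-fade_frames + i], clip_b[i], t))
    frames.extend(clip_b[fade_frames:])
    return frames
```

Even a ten-frame fade hides most of the jolt, which is why the transitions stand out so badly in games that skip it.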

Graphics, though, are something that gets closer and closer with every passing year. Carmack once stated that if you throw enough texture resolution at something, from a distance it can look very photorealistic. We're still a good bit off, but Carmack and others seem to think it could happen inside a decade. Considering the new RAGE engine can use literally hundreds of gigs of textures in a given scene, it should be interesting to see how that pans out.
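The trick that makes "hundreds of gigs" workable is that only the tiles the camera can currently see need to be resident in memory. A rough sketch of that demand-paging idea, not id's actual code (the class, the callback and the cache size are all made up for illustration):

```python
from collections import OrderedDict

CACHE_TILES = 4096  # tiles that fit in GPU memory (illustrative number)

class MegatextureCache:
    """Toy demand-paged texture cache: an enormous texture lives on
    disk, and only recently used tiles are kept resident (LRU)."""

    def __init__(self, load_tile):
        self.load_tile = load_tile     # callback: (x, y, mip) -> pixel data
        self.resident = OrderedDict()  # (x, y, mip) -> pixels, in LRU order

    def fetch(self, x, y, mip):
        key = (x, y, mip)
        if key in self.resident:
            self.resident.move_to_end(key)     # mark as recently used
            return self.resident[key]
        if len(self.resident) >= CACHE_TILES:
            self.resident.popitem(last=False)  # evict least recently used
        self.resident[key] = self.load_tile(x, y, mip)
        return self.resident[key]
```

Each frame the renderer works out which (tile, mip) pairs the visible pixels touch and fetches just those, so the resident set stays a few hundred megabytes no matter how big the source texture is.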
 
A few things in Crysis are almost photorealistic, faces (as you said) being one of them. Leaves with custom SSAO and HDR are damn close to photorealistic, though trees have a long, long way to go since they look too jaggy. Water is another thing that looks and acts realistic in Crysis, the best-looking water in any game, period, and smoke looks realistic too. :) By tweaking your settings you can get more of it looking closer to real; the ground, for example, looks very realistic with SSAO and POM.

Hmm, I'm not so sure about that. I think if anyone took a quick glance at a face in Crysis they would instantly say it's from a computer game. Even in massive-budget Hollywood films I can still very easily tell a CGI face from a real one.
 
The issue is that games are now very expensive to develop, and the PC market is relatively small, meaning that most major titles appear on both PC and consoles. This ties PC titles closely to consoles, which have a fixed hardware set.

Yes, PC versions can be run at higher resolutions and may have a few extra graphical niceties, but for the most part they are the same. As such, the further away we get from the last console release, the better the average PC (or, more importantly, the high-end PC) gets relative to the graphical requirements of the games.

I think that the appearance of the next generation of consoles will bring with it a large jump in graphical quality, and also in hardware requirements.
 
I have my own law, something like Moore's law: what we see now in CGI renders ends up running in real time in games five or six years later.

Look at Crysis at max settings, for instance. A typical CGI render from six years ago would not be able to beat it (like for like), and I'm not talking about multi-million-dollar Hollywood setups, just normal PCs running 3ds Max etc.

Having been into 3D rendering for over 20 years now (anyone remember Videoscape 3D on the Amiga?), I've seen this five-to-six-year lag stay pretty constant. So look at a good CGI render now: it should be possible in games at 30+ fps in 2014.
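A quick back-of-the-envelope check on that rule of thumb, assuming GPU throughput doubles every 18 to 24 months (an assumption, not a measurement):

```python
# How much raw horsepower does six years buy?
for months_per_doubling in (18, 24):
    doublings = 6 * 12 / months_per_doubling
    print(f"doubling every {months_per_doubling} months -> "
          f"{2 ** doublings:.0f}x in 6 years")
# doubling every 18 months -> 16x in 6 years
# doubling every 24 months -> 8x in 6 years
```

Raw speed alone (8-16x) doesn't close the gap from minutes per frame offline to 33 ms per frame, so the rest has to come from real-time approximations standing in for the offline techniques; that may be why the lag stays roughly constant instead of shrinking.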
 
The issue is that games are now very expensive to develop, and the PC market is relatively small, meaning that most major titles appear on both PC and consoles. This ties PC titles closely to consoles, which have a fixed hardware set.

According to Valve there are around 260 million online PC gamers; console gaming doesn't come close.
 