Graphics

Have been playing through Metro Last Light over the last few days, a game which is regarded as one of the most graphically intensive games available today. No doubt it's a good looking game, and it runs fairly smoothly, but...

Maybe it's just me, but I can't see almost 10 years of graphics card advancements and £100s of difference going by the visuals alone. Yes it's crisper, and it runs at double the frame rate, but the visual difference between 8-9 year old consoles and a high-end PC is just meh.

For example, a comparison video.

http://uk.gamespot.com/metro-last-light/videos/metro-last-light-graphics-comparison-6408373/

Now I'm running a fairly high-end rig: GTX 670 clocked to hell and back, Ivy Bridge, lots of RAM, big SSD, around a £1000 machine. Yet a £100, 8 year old console is not that far behind in terms of visuals.

My question is, what gives? Why am I not being blown away with around 20x the power of an 8 year old console (the tech inside is even older)? For example, we went from the PS1 in 1994 to Gears of War in about 11 years, yet over the last 10 years of graphics advancements we're barely seeing any improvement at all, apart from things being slightly sharper.



 
Drivers/APIs really hold back the grunt of desktop graphics, whereas consoles can fine-tune every last drop of performance out of their hardware. Look at what is being managed by dinky little mobile phone GPUs.

Next gen should see some nice advancements.
 
Higher resolution is one main advantage. Try blowing that Gears of War up to 1080p and watch how shocking it looks.

I think Gears is 720p native. I game at 1080p on a 50" Pana GT50 and there is a difference between 1080p and 720p, but it's nothing more than being slightly sharper.

My system barely does 60fps on Metro at 1080p.
 
If you go back to a decent looking game from ~2003 you will definitely see the difference in graphics, with the possible exception of Crysis (2007) since that couldn't run maxed out on anything back in the day. Most of us remember graphics as being a lot better than they actually were for older games and get a bit of a shock when we see them again. I remember that Far Cry was supposed to have very good graphics in its day, but when I fired it up on my dad's new computer at maximum settings it really showed its age.

I'm not sure about this next point, but I would have thought that another problem is that we are getting to the point where the graphical benefit of adding more polygons is diminishing. Back in the PS1 era the number of polygons developers had to play with was minuscule, so there were hard edges everywhere. There isn't really an exact date, but probably around the time the latest consoles came out we had enough graphical power that talented artists could make everything reasonably lifelike within the polycount budget. The same is true for textures, but probably a few years later. Because of that, making something "twice as good" polygon and texture wise doesn't really make a big difference these days, but if you gave something from the PS1 era twice the polygons and texture resolution there would probably be a noticeable difference.
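
A rough way to see the diminishing returns is pixels per triangle. These are made-up numbers purely for illustration, assuming a character model covers roughly a 300x600 pixel area on a 1080p screen:

Code:
# Rough illustration of diminishing returns on polygon counts.
# Assumption (made up for the example): the model covers ~300 x 600 pixels on screen.
pixels_covered = 300 * 600  # ~180,000 pixels

for triangles in (500, 1_000, 10_000, 100_000, 1_000_000):
    pixels_per_triangle = pixels_covered / triangles
    print(f"{triangles:>9,} triangles -> ~{pixels_per_triangle:,.2f} pixels per triangle")

Going from 500 to 1,000 triangles shrinks each facet from ~360 to ~180 pixels, which is plainly visible. Going from 100,000 to 1,000,000 takes each triangle from ~1.8 pixels to ~0.18 of a pixel, which you simply can't see.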

You guys have already covered resolution, but one thing I will add to that is that I game at 1360x768 because I have a very old monitor, and the performance difference is massive compared to benchmarks with 1080p screens, so presumably upping the resolution takes a big part of the performance budget away from developers.
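
Quick numbers on that, just counting pixels and ignoring everything else that scales with resolution:

Code:
# Pixel counts for the two resolutions mentioned above.
low = 1360 * 768     # 1,044,480 pixels
high = 1920 * 1080   # 2,073,600 pixels
print(f"1080p pushes {high / low:.2f}x the pixels of 1360x768")

That comes out at roughly 2x, so at the same settings the GPU is doing about twice the per-frame shading work before a single quality option is changed.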
 
You realise the power of your machine doesn't magically change the way the game was designed, right?

Anyway, the thing you have to remember is that visuals, and what a computer can render, aren't just determined by how realistic a face looks: it's lighting, particle effects, texture resolution, everything. Also, with the massive disparity between gaming PCs and the constant that is consoles, most developers will concentrate on having a baseline which can be slightly improved on depending on your system. If they just developed their games for the top of the top end of gaming PCs then I'm sure we would see some absolutely stunning work, but unfortunately they just can't realistically do that. Remember Crysis? That kicked the ass of the top-end PCs of the time, and as a result it really shut out a lot of gamers, myself included, from playing it at anywhere near high settings.

The result of this is games like BioShock, where they go for a good looking but stylised art direction rather than a photo-realistic one. It allows for really striking visuals that don't break the necks of scrawny systems but still make people sit back and say "Damn, this looks good". Even Metro does this to an extent, and to be honest I prefer it; photo-realism is boring.
 




Have you actually seen Metro on console???

Screenshot comparisons don't show the real differences.
 
If it wasn't for the locked hardware of the consoles, Metro 2 and Crysis 3 levels of graphics would have been here 3+ years ago, and the graphics of 2013 would be... well, tbh I'm not even imaginative enough to picture it, but you get the point. I would say graphics have improved a lot despite that, but there is a limit to what graphics can be like, and 10 years ago we were nowhere close to it. Now we are much closer, so improvements would have been a lot slower even without consoles, and are going to be even slower with them.
 
I always find the screenshot comparisons to be pretty meh too, but when you play the game on both, the differences become night and day. I've played BF3, Skyrim and Sleeping Dogs recently on both my mid-range PC and PS3, and all three look like a different game compared to the console version.

When I look at one of those websites that do a like-for-like, or even a YouTube video that splits the screen 3 ways (for PC, 360 and PS3), the differences look minimal at best.
 
Have you actually seen a console (PS3/Xbox 360) game? They look dreadful, especially if you get anywhere near the screen. If you had to play an Xbox 360 game on a monitor, up close like you often do with a PC, your eyes would bleed.

Pictures and YouTube videos are worthless, as you are looking at a tiny compressed video or picture, not a real-time game running on a 42-inch TV, for example.
 
Yet a £100, 8 year old console is not that far behind in terms of visuals.

Tend to disagree here. 90% of the time I game on my PC at a desk with a 23" monitor, but I also have an HDMI switch that connects my PC, PS3 and 360 to a 37" LED TV.

I did a comparison test a while back and played the same games on my PC and consoles at the same time, then switched channels on the HDMI switch to compare the visuals, both static (not moving or playing) and then whilst playing. This was kinda difficult as you don't have any picture for a moment when the switch is changing channel, but I can assure you there is a massive difference in games between PC and current-gen consoles. The biggest thing that struck me is the lack of AA in console games. There are either a lot more jaggies or the image is very blurred to hide the jaggies. I think this is because AA takes a lot of processing power that consoles struggle to handle.
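
For what it's worth, here's a rough back-of-the-envelope on why MSAA in particular hurts on the 360. These are approximate numbers, assuming a 1280x720 target with 32-bit colour and a 32-bit depth/stencil buffer:

Code:
# Rough framebuffer sizes at 720p with multisampling.
# Assumes 4 bytes of colour + 4 bytes of depth/stencil per sample.
width, height = 1280, 720
bytes_per_sample = 4 + 4

for samples in (1, 2, 4):
    mb = width * height * bytes_per_sample * samples / (1024 ** 2)
    print(f"{samples}x: ~{mb:.1f} MB of framebuffer")

That's roughly 7 MB with no MSAA, 14 MB at 2x and 28 MB at 4x, while the 360 only has 10 MB of fast eDRAM to render into, so anything beyond 1x at 720p means splitting the frame into tiles. That's a big part of why so many console games skip proper MSAA or render below 720p.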

In terms of texture resolution, I don't recall exactly, but I'd bet my last Rolo that the PC still has the upper hand here. I just know that the whole image quality from the PC was much better, clearer and crisper.

As for Metro Last Light, I think the IQ of that game is awesome on PC - I haven't played it on console.

Something to remember also: not all 'modes' in a console game have the same IQ. Take Gran Turismo 5, for example. It looks amazing in Photo Mode/Showcase Mode, with awesome detail and in 1080p I think, but when you're actually playing the game, not just gawping at a car in a pre-rendered sequence, it looks a lot worse.
 
That new Capcom game with the dungeon and dragon is said to run at 30fps on a GTX 570, and Agni's Philosophy at 60fps on a GTX 680. Infiltrator, again, 60fps.

When you have software that's transforming the quality of the image when it takes a screen shot, you don't really have a "fair" comparison.

Also, compare ArmA 3, Star Citizen, Stalker with Misery 2.x, Crysis 3 and so on with a console game.
 
Have you actually seen a console (PS3/Xbox 360) game? They look dreadful, especially if you get anywhere near the screen. If you had to play an Xbox 360 game on a monitor, up close like you often do with a PC, your eyes would bleed.

Pictures and YouTube videos are worthless, as you are looking at a tiny compressed video or picture, not a real-time game running on a 42-inch TV, for example.

Yeah, every time I see my son playing Black Ops 2 or whatever on his Xbox 360 I always think how horrid the graphics look.

He doesn't care though, and neither do most other people.
 
I always find the screenshot comparisons to be pretty meh too, but when you play the game on both, the differences become night and day. I've played BF3, Skyrim and Sleeping Dogs recently on both my mid-range PC and PS3, and all three look like a different game compared to the console version.

When I look at one of those websites that do a like-for-like, or even a YouTube video that splits the screen 3 ways (for PC, 360 and PS3), the differences look minimal at best.

Yep, and frame rate is everything! Going from playing a game at 60fps to gaming at 30fps (and at times struggling to even maintain that) is horrific!
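
It's easier to see why when you look at the frame-time budget rather than the fps number:

Code:
# Frame-time budget at common frame rates.
for fps in (30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms to build each frame")

33.3 ms vs 16.7 ms: at 30fps every frame sits on screen twice as long, and any stutter or dropped frame is that much more noticeable.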
 
I hear you. Just built myself a new rig with a 4770K and GTX 780, playing games at 1440p, and yet I see these PS4 vids and think, was it really worth it? My rig can't play some games at 60fps at that res. Still, I prefer the PC gaming community, and mods make it worth it.
 
I can only comment on the recent games I've played, but take this video for example:

http://www.eurogamer.net/videos/sleeping-dogs-face-off-ps3-vs-pc-video

At first glance the game looks similar, and I can see your point of view regarding the prices/advances of PC hardware vs the differences between the two platforms, but I have the game on PC and PS3 (thanks, PS Plus) and trust me, those videos don't tell half the story.

As opposed to a direct comparison, try this: on the video above, cover the PS3 version with your hand so you can only see the PC game running for 20 seconds or so, then swap.
 

Just watched it. Whilst there is a clear difference in sharpness and better IQ, it's just not 10 years' worth and £100s of pounds more. I'd expect that going from a GTX 580 to a 680, not from a 7600GT.

When you consider the PS3 version is running on a gimped 7600GT, it makes you wonder.

I agree about the frame rates. I can't stand 30fps anymore, but I expect more than an extra 30fps and a crisper image given the power/price gulf between the two.
 
Have you actually seen Metro on console???

Screenshot comparisons don't show the real differences.

True, I watched the video of the comparison, which obviously isn't 100% representative considering the compression. But I have played it maxed out on my PC (no SSAA of course) and, like I said, there's not much in it against something like The Last of Us (the last console game I played).
 
I hear you. Just built myself a new rig with a 4770K and GTX 780, playing games at 1440p, and yet I see these PS4 vids and think, was it really worth it? My rig can't play some games at 60fps at that res. Still, I prefer the PC gaming community, and mods make it worth it.

I still prefer PC, simply because of the price of games and the better FPS. The community not so much, as I find it a bit of a chore getting online compared to, say, starting up MW3 and jumping straight into an online match with mates or strangers.
 
Just watched it. Whilst there is a clear difference in sharpness and better IQ, it's just not 10 years' worth and £100s of pounds more. I'd expect that going from a GTX 580 to a 680, not from a 7600GT.

I think the huge leaps in gaming visuals are behind us though. I'd expect similar results when you see videos comparing the same game on PS3 and PS4.
It's not a PC-specific thing. In 2 months' time you'll see games released on both consoles, and the difference will be more like that video I posted than the changes you got from 1995 to 2005.

Going from a Spectrum 48K to an Amiga is nothing like going from a 360 to an Xbox One, regardless of the number of years.
 