Why is the tech slowing down?

Global illumination is the next big thing in graphics. It's essentially an algorithm that models how light interacts with objects, producing shadows, caustics and other lighting phenomena. We know how to do it, but it's estimated it could take up to 100x the processing power to truly show it in all its glory and give games the photorealistic look of animated films. Games can be partially GI'd, however. We are, in a sense, at a plateau. I'll try to dig out a Tim Sweeney video for you; he addresses pretty much what you're asking.
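For a rough sense of where numbers like 100x come from, here is a back-of-envelope sketch; the per-pixel sample and bounce counts are assumptions picked for illustration, not measurements from any real renderer.

```python
# Back-of-envelope estimate of why full real-time GI is often quoted at
# ~100x (or far more) the cost of today's rasterised lighting.
# All figures below are illustrative assumptions.

pixels = 1920 * 1080          # 1080p frame
fps = 60                      # target frame rate

# Rasterised "direct lighting only": roughly one shading evaluation per pixel.
raster_ops_per_frame = pixels * 1

# Path-traced GI: many samples per pixel, each bouncing several times.
samples_per_pixel = 64        # assumed, to keep noise acceptable
bounces = 4                   # assumed average path length
gi_ops_per_frame = pixels * samples_per_pixel * bounces

print("Raster shading evaluations per second: %d" % (raster_ops_per_frame * fps))
print("Path-traced ray evaluations per second: %d" % (gi_ops_per_frame * fps))
print("Rough cost multiplier: ~%dx" % (gi_ops_per_frame // raster_ops_per_frame))
```

With those assumed numbers the multiplier comes out at around 256x, so the exact figure depends entirely on how many samples and bounces you think you need.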

True real-time GI isn't coming along any time soon. The processing power required to simulate individual light particles is immense. Pretty much all current real-time rendering techniques use approximations to overcome processing limitations. You can still achieve some very realistic lighting effects with scanline rendering, essentially faked lighting, rather than the computationally expensive ray tracing that would achieve a similar effect. I'm sure most people have seen that GTA4 lighting mod screenshot of a yellow cab; quite a lot of people believed it was a real photo.
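To give a feel for what faked lighting means in practice, here is a minimal sketch of the kind of per-pixel approximation a rasteriser evaluates; the function and scene values are invented for illustration, not taken from any particular engine.

```python
# Minimal sketch of rasteriser-style "fake" lighting: a Lambert (N.L) diffuse
# term per light, with no rays traced at all. No shadows, no bounced light.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def lambert(normal, light_dir, light_colour, albedo):
    """Approximate diffuse lighting for one surface point and one light."""
    n = normalize(normal)
    l = normalize(light_dir)
    n_dot_l = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(a * c * n_dot_l for a, c in zip(albedo, light_colour))

# Cheap enough to run millions of times per frame. A ray tracer would instead
# fire shadow rays and bounce rays from this same surface point.
print(lambert(normal=(0, 1, 0), light_dir=(0.5, 1.0, 0.2),
              light_colour=(1.0, 0.95, 0.9), albedo=(0.6, 0.4, 0.3)))
```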

Also, that Euclideon stuff should be taken with a grain of salt. It's essentially a glorified tech demo showing a completely static, pre-calculated environment with no deformation, as in nothing moves or changes other than the camera's position. A bit useless if you want to make a video game.
 
As has already been said, the tech isn't slowing down; it just seems to be advancing in other areas.

This, really :-) For example, I have an all-in-one PC. It's essentially just a monitor with the components of a computer inside it, without making the monitor any bigger; it's still only about an inch thick. The components are mostly mobile versions, like a 7850M instead of a 7850 and SO-DIMM RAM instead of full-size DIMMs, although the CPU I believe is a desktop processor, an i5-3470. It plays World of Warcraft (with the new models) fine on 'good' settings, and I like the fact that it's a PC with only the footprint of a monitor and just one wire going into it (power).
 
Tech must be slowing down. I remember back in 2007, if you had a rig that was 7 to 8 years old it'd be so slow it would crumble running anything intensive. Now the rig in my sig is about 7 years old (feels like yesterday I built it!), yet it can still run most things out there on highest settings, and I could easily upgrade it to run the latest games full whack for cheap.

When Crysis first came out, tech was changing so quickly.
 
My PC is also ancient now and can still run any game on high settings.

It was a powerhouse when I built it, true, but I can't remember hardware lasting so long before.
 
Tech must be slowing down. I remember back in 2007, if you had a rig that was 7 to 8 years old it'd be so slow it would crumble running anything intensive. Now the rig in my sig is about 7 years old (feels like yesterday I built it!), yet it can still run most things out there on highest settings, and I could easily upgrade it to run the latest games full whack for cheap.

When Crysis first came out, tech was changing so quickly.

Why must it?
Your example doesn't show that at all. In fact, it suggests the opposite: games are getting so costly to produce that they aren't being developed to max out computers.
Which is what is happening. The money that needs to be spent on artists for each scene is getting to Hollywood levels, and Hollywood spends hundreds of millions on 90 minutes of footage, let alone the 20+ hours of footage in a game.
 
I feel this is mostly about smartphones; my LG G2 is just as good as any phone on the market right now.
 
True real-time GI isn't coming along any time soon. The processing power required to simulate individual light particles is immense. Pretty much all current real-time rendering techniques use approximations to overcome processing limitations. You can still achieve some very realistic lighting effects with scanline rendering, essentially faked lighting, rather than the computationally expensive ray tracing that would achieve a similar effect. I'm sure most people have seen that GTA4 lighting mod screenshot of a yellow cab; quite a lot of people believed it was a real photo.

Also, that Euclideon stuff should be taken with a grain of salt. It's essentially a glorified tech demo showing a completely static, pre-calculated environment with no deformation, as in nothing moves or changes other than the camera's position. A bit useless if you want to make a video game.


I know, that's why I said at least 100x the processing power. I took a lot of that info from Tim Sweeney's talk; actually, I think he mentions 2000x the processing power at one point.

I think the Euclideon people have made strides with animations using this point-cloud method, but I'm sure you're right. The video I posted does give a lot of general info about current rendering techniques, to the point where you could argue we have reached a plateau of sorts. Although according to this article, two games are in the works using Euclideon: http://techreport.com/review/27103/euclideon-preps-voxel-rendering-tech-for-use-in-games

There is another company doing something similar, the Atomontage engine. I don't know if you've heard of it; I don't know much about it myself.

Really, adding more 3D modellers and creating even more detailed worlds, maybe upping polygon count and texture resolution a bit, is all that can be done with current engine tech. So in a sense, processing power is maxed out with regard to game studios' budgets.

It's not like we are just breaking into 3D graphics anymore, or running out of RAM or CPU power, or learning new algorithms to serialise and stream data. All the maths is known; apart from GI, as you said, there isn't really anywhere left to go other than higher poly counts.
 
Really, adding more 3D modellers and creating even more detailed worlds, maybe upping polygon count and texture resolution a bit, is all that can be done with current engine tech. So in a sense, processing power is maxed out with regard to game studios' budgets.

That's essentially the key bit to all this. Some of the stagnation is a result of the market. Crysis, however graphically amazing it was, probably hasn't made anywhere close to what a game like LoL has made with its F2P, micro-transaction business model. A game like LoL arguably took fewer resources to make as well. Why bother spending loads of money to make a graphically amazing game when you can sell in-game items for $5 a piece? CS:GO is another example, being essentially a recycled CS:S; they already had a game to work with, so there was a massive development cost reduction right from the start.

It's all about perceived value to the customer in the end, and making your avatar stand out against everyone else is big business. People spend more on virtual in-game items than they would willingly spend on a complete game.

I feel this has contributed to it greatly. You're going to see a lot more games like this and fewer single-player games that push graphical boundaries. It'll no doubt go too far at some point, but right now it's a massive earner.
 
PCs are personal computers; I suppose you could lump consoles and mobile phones in with that for argument's sake.

As mentioned above, computational power in general is not slowing down, far from it.

Just look at the new 16-petaflop computer commissioned by the Met Office. This will place it around 4th or 5th among the world's top supercomputers, probably 6th or 7th by the time it's operational.

The average user just doesn't need the power. I have several devices in my house, and one in my pocket, which are almost inconceivably more powerful than the computers first used to fly man to the Moon.
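To put rough numbers on that: the Apollo Guidance Computer figures below are commonly cited approximations, and the phone spec is an assumed mid-range handset, so treat the ratios as order-of-magnitude only.

```python
# Rough comparison of a modern phone against the Apollo Guidance Computer.
# AGC figures are commonly cited approximations; the phone figures are assumed.

agc_instructions_per_sec = 85000        # ~0.085 MIPS
agc_ram_bytes = 2048 * 2                # ~4 KB of erasable memory

phone_instructions_per_sec = 4 * 2.3e9  # assumed quad core at ~2.3 GHz, 1 op/cycle
phone_ram_bytes = 2 * 1024 ** 3         # assumed 2 GB of RAM

print("Compute ratio: ~%.0fx" % (phone_instructions_per_sec / agc_instructions_per_sec))
print("Memory ratio:  ~%.0fx" % (phone_ram_bytes / agc_ram_bytes))
```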
 
Technology isn't slowing down; it's going forward at full speed. We have powerful PCs in our pockets, which took what, 6-7 years?

The idea that computing power will continue to grow forever is flawed; it won't, and it's not supposed to anyway. There are physical or cost boundaries that will sooner or later halt certain aspects of technology, which is why its evolution shouldn't be seen as a continuous line. Instead, try to picture it as the branches of a tree: when a branch grows, it eventually stops, but it also thickens and allows other branches to grow in different directions. The same is happening in medicine.

Regarding PCs, we've seen small gains for processors but significant gains for graphics cards, SSDs and broadband speed. Graphics cards could not have evolved if processors (the main branch) hadn't reached their maturity.

I don't know where technology will take us in the future but it definitely hasn't slowed down.
 
Yeah, it isn't like the old days where, when you upgraded, you actually got a noticeable improvement, benchmarks aside. For me that ended when I went from a dual core to an i7 930 quad core; since then it's been pretty meh. I hardly noticed any difference with the Haswell i7. Plus they used crap TIM, so overclocking isn't great either.
 
Moore's Law ended some time ago.

GPUs seem to have been stuck on the 28nm process node for an eternity.

CPUs seem to have stopped improving since Ivy Bridge in 2012. RTS games desperately need more single-threaded performance for better computer-player AI and greater unit counts. Also, the latest madVR video scaling algorithm, NNEDI3, needs vastly more performance.
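For context, here is what classic Moore's Law doubling would predict from a rough mid-2000s baseline; the baseline transistor count is an assumed order of magnitude, not an official figure for any specific chip.

```python
# Sketch of classic Moore's Law scaling: transistor count doubling roughly
# every two years from an assumed 2006 desktop-CPU baseline.

baseline_year = 2006
baseline_transistors = 300e6   # assumed order of magnitude
doubling_period_years = 2.0

for year in (2008, 2010, 2012, 2014):
    doublings = (year - baseline_year) / doubling_period_years
    predicted = baseline_transistors * 2 ** doublings
    print("%d: ~%.1f billion transistors predicted" % (year, predicted / 1e9))
```

Whether real chips have kept pace with that curve since about 2012 is exactly the point being argued here.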
 
Moore's Law ended some time ago.

GPUs seem to have been stuck on the 28nm process node for an eternity.

CPUs seem to have stopped improving since Ivy Bridge in 2012. RTS games desperately need more single-threaded performance for better computer-player AI and greater unit counts. Also, the latest madVR video scaling algorithm, NNEDI3, needs vastly more performance.

Indeed, the question of 'how many units can I have?' should simply be countered with the question: 'how much RAM have you got?'
 
Still rocking strong with my 2600K and an ATI 6970. Granted, I can't play stuff like Metro: Last Light on max settings, but I can't see myself upgrading soon. The next upgrade will probably come with the Windows 10 upgrade.
 