
Has Technology hit a wall?

Runningkid said:
So are we waiting for DX10 to be released before we see some real improvement, or will it just be the same?
From the Crytek DirectX 10 tech demo it looks like it will be improved a lot. There will be interactivity between the player and the objects around them, and quite a few other improvements. Even though the technology will be there, I think it will be a while before it is used in games.

http://www.gametrailers.com/cgplayer.php?id=9020
 
I find that even in the latest PC games there is not enough detail. The latest graphics cards are capable of pushing stupid amounts of polygons per second, but we don't see it in games; textures are used to make games look better instead.
 
lemonkettaz said:
No...

I think games have hit a massive lead wall.

Seriously, games aren't taking advantage of any top-end technology yet, and the gameplay has pretty much gone stagnant recently, especially in FPS. The genre needs something to grab it by the neck, otherwise it will die very quickly.

You're right there. I just downgraded from an X1800XT; I thought I'd be missing out, but the games I've played run identically on an X850XT and an X1800XT. Strangely, TOCA 3 runs slightly better.

I did this mainly because F.E.A.R. couldn't run at a constant 60fps with the X1800, so I'd rather wait for the next big thing before upgrading. The benchmark difference is impressive, of course, but until game makers start utilising the better technology, I can't see much point in upgrading again.
 
I'm a bit frustrated with it all too tbh. We haven't really seen a VAST improvement over the last couple of years, even though we've had 2-3 sets of graphics cards released by both ATI and NV.
I don't really think hardware is the issue; it's the length of time it takes developers to create a polished game. We all have much higher standards now, and it takes a long time to create an immersive environment with all the detail that consumers demand. Further improvements to graphics will only make this process longer and more expensive for developers, and during that time the graphics card industry will continue to march on. I don't think we will ever see really optimised games on the PC à la consoles, as everyone has different components and it's inherently impossible to get it right for everyone.


I currently have an A64 3500+ and an X850XT PE, and I'm still able to play everything fairly well maxed out @1280x1024 (my LCD's native resolution). I'm in two minds whether to get a PCIe mobo and a shiny new graphics card now, or wait for DX10 and the new processors that are coming soon.
Not sure which card would be best either running @1280x1024, as an X1900 would surely be overkill?
 
I wouldn't expect to see a vast improvement in graphical quality in any one game - any improvements will come in slow, incremental steps.

What you have to realise is that as you approach realism, the impact of each improvement in graphical fidelity is lessened. Going from 2D to 3D was a huge transition; going from low-polygon models to high-polygon models was a huge transition. Going from 1M virtual polys (through a normal map) to 5M virtual polys isn't. So I don't actually think games are evolving at a slower rate - UT2007 will have models that are 10X more detailed than those in UT2004 - it's just that we're reaching a point of diminishing returns, where each increase in detail adds less to how real the world looks.
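
To make the "virtual polys through a normal map" point concrete: the mesh itself stays low-poly, and the extra detail is faked by a per-pixel normal read from a texture at lighting time. Here's a rough C++ sketch of the idea - the Vec3 type and the function names are made up purely for illustration, not taken from any engine:

```cpp
#include <cmath>

// Minimal 3-component vector, just for this illustration.
struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// Per-pixel diffuse lighting with a tangent-space normal map.
// The mesh stays low-poly; the fine surface detail ("virtual polys")
// comes entirely from the normal stored in the texture.
float shadePixel(Vec3 tangent, Vec3 bitangent, Vec3 surfaceNormal, // interpolated from the mesh
                 Vec3 normalMapTexel,   // RGB sample in [0,1] from the normal map
                 Vec3 lightDirWorld)    // direction towards the light
{
    // Unpack the texel from [0,1] into a [-1,1] tangent-space normal.
    Vec3 n = normalize({ normalMapTexel.x * 2.0f - 1.0f,
                         normalMapTexel.y * 2.0f - 1.0f,
                         normalMapTexel.z * 2.0f - 1.0f });

    // Bring the light direction into tangent space using the mesh's tangent frame.
    Vec3 l = normalize({ dot(lightDirWorld, tangent),
                         dot(lightDirWorld, bitangent),
                         dot(lightDirWorld, surfaceNormal) });

    // Standard Lambert term: the bumpy detail shows up in the lighting
    // even though the underlying triangle is flat.
    float nDotL = dot(n, l);
    return nDotL > 0.0f ? nDotL : 0.0f;
}
```

The upshot is that a few thousand real triangles can light like millions of virtual ones, but only in the shading - the silhouette stays low-poly, which is part of why each jump in detail feels smaller than the last.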

Even now, the models in the recently released UT2007 shots don't look that impressive to me. I think, "is that so much better than what I saw in Quake 4?", when the actual detail difference is probably a factor of 3 or 4.

I also don't think it's fair to say modern games are badly coded - the nature of their requirements has simply changed. The reason F.E.A.R. is so performance-intensive is that it uses a very demanding per-pixel lighting algorithm, requiring the scene to be redrawn for each light (and they use 10-20 light sources in a scene). This creates dynamic lighting that is much better than what you've seen in Half-Life 2 or Far Cry. But users don't see it that way. They've paid for their hardware and they want it used in a way that delivers certain specific visual improvements.
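
For anyone wondering what "redrawing the scene for each light" actually looks like, the usual structure is a base pass followed by one additive geometry pass per visible light. Below is a rough C++ sketch of that multi-pass approach - the Renderer, Scene and Light types and their function names are hypothetical stand-ins, not F.E.A.R.'s actual code:

```cpp
#include <vector>

// Hypothetical stand-ins for an engine's scene and GPU interface.
struct Light { /* position, radius, colour, shadow map, ... */ };
struct Scene { std::vector<Light> visibleLights; };

struct Renderer {
    // Stubs standing in for real GPU work (draw calls, state changes, ...).
    void drawDepthAndAmbientPass(const Scene&) {}
    void enableAdditiveBlending() {}
    void bindPerPixelLightShader(const Light&) {}
    void drawGeometryLitBy(const Scene&, const Light&) {}
    void disableBlending() {}
};

// Multi-pass per-pixel lighting: one extra geometry pass per visible light,
// with the results summed in the framebuffer. With 10-20 lights in view the
// scene is effectively drawn 10-20 times, which is why it is so demanding.
void renderFrame(Renderer& gpu, const Scene& scene)
{
    // Pass 0: lay down depth plus any ambient/emissive contribution.
    gpu.drawDepthAndAmbientPass(scene);

    // Passes 1..N: re-render the geometry once per light and add its contribution.
    gpu.enableAdditiveBlending();
    for (const Light& light : scene.visibleLights) {
        gpu.bindPerPixelLightShader(light);
        gpu.drawGeometryLitBy(scene, light);
    }
    gpu.disableBlending();
}
```

Every extra light in view costs another full pass over the geometry, which is why the light count hits the frame rate so hard.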

That's why there's a conflict. Developers now are focusing on pixel shaders to improve visual quality, and indeed better lighting is a necessary next step. But better lighting doesn't get you better-looking models or better-looking textures, so people complain that graphics aren't going anywhere or that games are badly programmed. In the end we'll just have to wait until developers can deliver both.
 
Hawklord said:
I currently have an A64 3500+ and an X850XT PE, and I'm still able to play everything fairly well maxed out @1280x1024 (my LCD's native resolution). I'm in two minds whether to get a PCIe mobo and a shiny new graphics card now, or wait for DX10 and the new processors that are coming soon.
Not sure which card would be best either running @1280x1024, as an X1900 would surely be overkill?

I would definitely wait, mate. After downgrading, as I said, Call of Duty 2 benefited from the X1800, as did F.E.A.R., but neither could run at a constant 60fps at 1280x1024, and I've been astonished by how everything runs no differently, apart from the various 3DMarks really. The other thing is, there aren't that many games coming out over the next six months that will really benefit from the current top cards - one or two at most.
 
Think of it this way... the longer game makers take to utilise what's available in today's high-end cards, the more time the graphics card manufacturers have to pump out slightly faster cards to keep us happy while they work at their leisure on brand-new technology with eye-watering new features, which they can eventually slip onto the market once DX10 is up and running, saying "Who says our technology isn't taking massive leaps and bounds!" *smug grin*
I'm pretty sure ATI and Nvidia have something up their sleeves, but what is the point in rushing it out if the software just doesn't exist to utilise it yet? Their time is better spent tweaking, perfecting and creating technology until we can actually see the progress in a visible way on our screens, rather than us buying a card just for the sake of the price tag, knowing the technology is on it but unused.

If ATI and Nvidia announced tomorrow that they had created groundbreaking DX10/next-gen cards and we'd be seeing them on the shelves sometime in the next week, you all know that some of you, and thousands of others, would rush out and buy them knowing full well that today's games can't use the technology yet - then slate the cards in reviews saying they aren't performing any better than the last offerings, or convince yourselves that the 50-point increase in 3DMark means they must be better than the last ones ;)
 