Crysis delayed again. Possible 2008 release

ArmyofHarmony said:
I hate monster infested games

humans are much better



this is in relation to farcry
Agreed. As soon as 800 rocket-wielding monsters appear I always think "Yeap, there goes my interest. And my pants". :D
 
ArmyofHarmony said:
I hate monster infested games

humans are much better

this is in relation to farcry

Yes, FarCry took monsters to the extreme; in the end I used cheats just to play it through to the end and enjoy the excellent graphics.

But in moderation, a smattering of monsters can be OK, they got the balance right in the Thief games and also SS2.
 
DX10 is a bonus on the G80 range; the fact is you've got the fastest DX9 cards about too. I don't see why peeps are always saying they feel sorry for anyone that's got a DX10 card now, Crysis ain't the only game that's DX10, you know.
 
Wrong, I bought my G80 because of Alan Wake and Hellgate: London.

I have little or no interest at all in Crytek games, with their fetish for jungles, islands etc.
 
This is good news as far as I'm concerned; if all the really great-looking DX10 games are pushed back a year or so, I may be able to afford the upgrades needed to play them properly :)
 
UKTopGun said:
Doesn't actually have a release date yet, and most sites that sell it still say TBC, so if not 2008 it will most likely be the end of 2007.

But it's gonna be well worth the wait (Alan Wake, that is) :)

Crysis is indeed looking nice, but I didn't even complete FarCry; it got boring near the end :(
 
Pitspawn said:
Working on CMR6 for the X360/PS3/PC, not doing Wii yet though ^_^ It was quite a privilege to be in contact with the PS3 before it was released commercially anywhere in the world. The wireless SIXAXIS controller was quite fun, although I still prefer the feel of the X360's controller :)

To answer your question, there are still large improvements to be made. Next-gen consoles all have multiple cores and HT technology, and parallel programming is something even the most experienced game programmers are having to learn from scratch. The problems involve having to break work into smaller tasks and, most importantly, staying thread safe (you need to lock and unlock write privileges to resources to stop a read being corrupted by another thread's write).
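Rough sketch of that lock/unlock idea (a toy example, not engine code — all names are made up): two threads hammer a shared counter, and the mutex makes sure one thread's read-modify-write can't be corrupted by the other's.

```cpp
#include <mutex>
#include <thread>

// Shared resource: without the lock, the two writers' increments could
// interleave mid read-modify-write and updates would get lost.
long counter = 0;
std::mutex counter_lock;

void add_many(int n) {
    for (int i = 0; i < n; ++i) {
        std::lock_guard<std::mutex> guard(counter_lock); // lock...
        ++counter;                                       // ...write safely...
    }                                                    // ...unlock on scope exit
}

long run_two_writers(int n_each) {
    counter = 0;
    std::thread a(add_many, n_each);
    std::thread b(add_many, n_each);
    a.join();
    b.join();
    return counter;
}
```

With the lock in place the result is always exactly 2 × n_each; drop the lock_guard and it usually isn't.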

Right now most developers are utilising parallel processing by dedicating entire threads to whole subsystems like physics/AI/audio, which is quite wasteful. But with time comes experience, and with it improved engineering skills :)
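The "one thread per subsystem" approach looks something like this sketch (subsystem names and workloads are hypothetical stand-ins): each subsystem gets a whole thread of its own, so you never use more than three cores no matter how many the machine actually has.

```cpp
#include <atomic>
#include <thread>

// Counter just to make the toy loops observable.
std::atomic<int> updates_done{0};

// Each "subsystem" owns an entire thread for its whole lifetime.
void physics_loop(int frames) { for (int i = 0; i < frames; ++i) ++updates_done; }
void ai_loop(int frames)      { for (int i = 0; i < frames; ++i) ++updates_done; }
void audio_loop(int frames)   { for (int i = 0; i < frames; ++i) ++updates_done; }

int run_subsystem_threads(int frames) {
    updates_done = 0;
    std::thread p(physics_loop, frames);
    std::thread a(ai_loop, frames);
    std::thread s(audio_loop, frames);
    p.join(); a.join(); s.join();
    return updates_done.load();
}
```

The parallelism here is fixed at three threads by the code's structure, which is exactly the wastefulness being described: a busy physics thread can't borrow the idle audio core's time.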

I really do feel sorry for games programmers. I have recently been reading about multi-threading and basically games just don't suit it all that well. There is only so much you can break it down and sometimes even if you can split a problem up between threads you lose any advantage through having extra code in order to determine when to split something up.

Also as said splitting sub-systems up is not a great idea as it is very coarse grained [ooo technical term] and basically is hard coded for x-number of cores. So as soon as you get your quad/oct/hex-core CPU any extra cores will be sat there doing nowt.
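To show the alternative: a toy task-based version (all names made up, nothing from any real engine) that splits one job into many small tasks and hands them to however many cores the machine reports, rather than hard-coding a thread count.

```cpp
#include <algorithm>
#include <atomic>
#include <thread>
#include <vector>

// Sum of i*i for i in [0, n): each loop index is one small "task",
// and workers grab the next task from a shared atomic index.
long long sum_squares_parallel(int n) {
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::atomic<long long> total{0};
    std::atomic<int> next{0};            // shared task index
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w)
        pool.emplace_back([&] {
            // Grab tasks until none are left; scales with core count.
            for (int i = next++; i < n; i = next++)
                total += (long long)i * i;
        });
    for (auto& t : pool) t.join();
    return total.load();
}
```

Because the worker count comes from hardware_concurrency() at runtime, the same binary spreads the work over two, four, or eight cores without recompiling — unlike the per-subsystem layout, which is frozen at design time.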

For the time being multi-core CPUs are only showing their usefulness in multi-tasking and also applications that suit multi-threading such as movie and image editing software.

SiriusB
 
Flibster said:
Am I surprised?

No.

Do I care?

No, not really.

It's very pretty. But what does that matter? HL2 was pretty, Fear was pretty... Neither were much good really.

Simon/~Flibster

Blasphemy! HL2 was a great game and so is FEAR (from what I've played; I've only just started it), but HL2 was amazing, don't be slagging it.
 
-Nick- said:
Blasphemy! HL2 was a great game and so is FEAR (from what I've played; I've only just started it), but HL2 was amazing, don't be slagging it.

HL2 is hands down the best FPS I've ever played. FEAR is just OK. Amazing firefights, though.
 
I thought F.E.A.R. was a great game. The slowmo and weapon effects were well done. The AI was a lot better than most too.

Coupled together this made for some very fun fights. :)
 
Flibster said:
Am I surprised?
No.
Do I care?
No, not really.
It's very pretty. But what does that matter? HL2 was pretty, Fear was pretty... Neither were much good really.
Couldn't agree more. Neither were that good.
 
SiriusB said:
I really do feel sorry for games programmers. I have recently been reading about multi-threading and basically games just don't suit it all that well. There is only so much you can break it down and sometimes even if you can split a problem up between threads you lose any advantage through having extra code in order to determine when to split something up.

Also as said splitting sub-systems up is not a great idea as it is very coarse grained [ooo technical term] and basically is hard coded for x-number of cores. So as soon as you get your quad/oct/hex-core CPU any extra cores will be sat there doing nowt.

For the time being multi-core CPUs are only showing their usefulness in multi-tasking and also applications that suit multi-threading such as movie and image editing software.

SiriusB

Yeah, and don't forget that some of the best coders in the business (Carmack) have been trying to make a good fist of multithreaded games since the last millennium. He first tried it in Quake3, which supported SMP but only gave around a 10% performance gain.
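Amdahl's law makes that ~10% figure plausible. If only a small fraction p of a frame can actually run in parallel, the best speedup on n CPUs is 1 / ((1 − p) + p/n). With, say, p ≈ 0.2 (an illustrative number, not Quake3's real profile), two CPUs give 1/(0.8 + 0.1) ≈ 1.11×, i.e. about a 10% gain:

```cpp
// Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the
// parallel fraction of the work and n the number of processors.
// The 0.2 used in the test below is illustrative, not measured.
double amdahl_speedup(double parallel_fraction, int n_procs) {
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_procs);
}
```

It also shows why adding cores hits a wall: even with infinitely many CPUs, p = 0.2 caps the speedup at 1/0.8 = 1.25×.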

Single-core processors are of course by far the most efficient (no cycles wasted coordinating between cores), but the problem is that other considerations are now starting to become limiting factors, such as cost, power consumption, heat production and indeed the limits of the manufacturing process.

Rather than having a quad-core system running around the 2.5GHz mark I'd much rather have a dual-core one running at 4GHz for gaming. I've got a nasty feeling that now the jump to Conroe has been made (very good performance even in single-threaded applications), we are going to see a slowdown in performance improvements, i.e. over the next year or two Intel will be focusing on pushing forward to quad-core as opposed to ramping up clock speeds, drastically increasing cache, or coming up with a more efficient architecture.

Heck, one could argue that has been happening for some time; I mean, the P4 kinda ground to a halt shortly after the 3GHz barrier, and ignoring the ultra-high-end (expensive) FX series, AMD never really got much beyond the 2.5GHz (4000+) level. So until Conroe turned up we'd been pretty static for the previous year or two.
 