Crysis and very low CPU load

So if an 8800GTS at 1280x*** (your res, 4:3 or such) can MAX the game (he actually said an 8600GTS at mid res and high settings in a previous interview, AFAIR), what can an 8800GTX/Ultra do with it?
I have an 8800GTS overclocked to 650/2000 and can NOT max the game in any way. I tried running it last night on the Very High setting without FSAA and it ran at 13fps; I have to run at High/Medium to get a framerate of 30, which is acceptable.

What he said is not what we experience. Sure, I can run the game, but at 13fps I can't really call it playable, and the aliasing is very bad in places.
 
I don't know how much money Crytek expects to make from licensing out the underlying game engine as opposed to direct Crysis sales. Maybe this is sufficiently important to them that they need to keep the engine current for a significant chunk of time (say 18 months or so) after the release of Crysis the game. As such, they may have made a commercial decision that risks cheesing a few people off today for financial gain tomorrow.

Maybe there will be no miraculous optimisation for the final release... maybe this is just how Crysis plays on today's hardware. (I hope that this isn't the case, by the way!)
Far Cry wasn't any different; it also ran very badly when it came out (on the GPU I had at the time).

I see Crysis as a marketing campaign for their engine, so developers can license the engine and develop new games to be released in 2008, when we all have faster hardware.
 
Far Cry wasn't any different; it also ran very badly when it came out (on the GPU I had at the time).

I see Crysis as a marketing campaign for their engine, so developers can license the engine and develop new games to be released in 2008, when we all have faster hardware.

This is true. I remember when Far Cry came out I had to upgrade to a 9800 Pro, as my two-year-old Ti4600 wouldn't run it on anything other than low. At least now my two-year-old 7800GTX can run Crysis on medium.
 
Very weird bugs aside, I ran Far Cry at 1024x768 on nice settings (definitely not low or even medium, but I can't remember the exact settings) on a Ti4600 and an XP3200 (32-bit Barton), then got a 6800 Ultra when the 1.3 patch gave it DX9c.
 
Very weird bugs aside, I ran Far Cry at 1024x768 on nice settings (definitely not low or even medium, but I can't remember the exact settings) on a Ti4600 and an XP3200 (32-bit Barton), then got a 6800 Ultra when the 1.3 patch gave it DX9c.
You might have been running it at higher settings, but you certainly wouldn't have been seeing the same thing on screen as with the 6800.

You will have been limited to the capabilities of the card, and probably lost fps as software tried to emulate those functions.

Your memory is probably playing tricks, and Far Cry really did look that good at low settings :p
 
I have an 8800GTS overclocked to 650/2000 and can NOT max the game in any way. I tried running it last night on the Very High setting without FSAA and it ran at 13fps; I have to run at High/Medium to get a framerate of 30, which is acceptable.

What he said is not what we experience. Sure, I can run the game, but at 13fps I can't really call it playable, and the aliasing is very bad in places.

What CPU are you running? I have an 8800GTS OC'd to 660/2000 and on High I get around 27-35fps with 4xAA.

I also tested Very High and get 13-17fps.

Res: 1280x1024
 
You might have been running it at higher settings, but you certainly wouldn't have been seeing the same thing on screen as with the 6800.

You will have been limited to the capabilities of the card, and probably lost fps as software tried to emulate those functions.

Your memory is probably playing tricks, and Far Cry really did look that good at low settings :p

Well, when Far Cry came out the Ti4600 was the best I could get from Nvidia, as the FX 5000s sucked and there were no 6000s till later; the April launch was a paper launch and it took till November for me to find one.

The game did need slight overclocking of that card, which was already an OC version. They were very good cards in their day, and before the Far Cry patch the Ti4600 was DX8.1, the same as the 6800 Ultra would have been without the 1.3 patch. (Only the Ti range of the 4000 series was DX8.1.)
 
Well, when Far Cry came out the Ti4600 was the best I could get from Nvidia, as the FX 5000s sucked and there were no 6000s till later; the April launch was a paper launch and it took till November for me to find one.

The game did need slight overclocking of that card, which was already an OC version. They were very good cards in their day, and before the Far Cry patch the Ti4600 was DX8.1, the same as the 6800 Ultra would have been without the 1.3 patch. (Only the Ti range of the 4000 series was DX8.1.)

Very true. Wasn't it a Gainward Golden Sample, by any chance?
That was my FC baby to begin with, but at the time I only had 256MB of RAM and an 1800XP to keep it company...
 
I had a 5900XT Golden Sample, and I remember that Far Cry was the game that exposed Nvidia's shambolic behaviour with DX9 on the FX 5x00 graphics card range. One particular example was comparing the rusted missiles in the ship on the first level: if you had a 5900, the bump mapping and sheen would not appear on them, while on a 9800 there would be far greater detail, with more frames per second to boot.

It put me off Nvidia so badly that it wasn't until the 8800GTS that I finally jumped back onto their ship.
 
I'm starting to feel a quad core wasn't going to save Crysis's framerate after all.

Still, the Q6600 is perhaps the most decent chip for anyone out there, even if it isn't fully used.

It will still be interesting to see the final release and its true performance.
 
You can NOT take GPU load off and put it on the CPU. The CPU determines what to draw and tells the GPU; the GPU draws it. If you look at Crysis, there's a LOT to draw: massive draw distances, lots of detail, lots of everything. It looks superb, and drawing an image that detailed takes power, hence GPU load is extremely high.
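
A minimal C++ sketch of that split, with every name invented for illustration (this is not any real engine's API): the CPU walks the scene and decides what to draw, each draw call is cheap to emit, and the expensive pixel work happens on the GPU side.

[CODE]
#include <cstdio>
#include <vector>

// Hypothetical illustration: the CPU decides WHAT to draw and tells
// the GPU; the GPU does the actual drawing. All names are invented.

struct DrawCall { int mesh_id; };

// CPU work: walk the scene and decide what is visible. With
// Crysis-style draw distances this list gets huge, but emitting each
// draw call is cheap compared to rendering it.
std::vector<DrawCall> cull_visible(int object_count)
{
    std::vector<DrawCall> calls;
    for (int i = 0; i < object_count; ++i)
        if (i % 2 == 0)  // stand-in for a frustum/occlusion test
            calls.push_back({i});
    return calls;
}

// Stand-in for the GPU: shading millions of detailed pixels per frame
// is where the real cost lives, which is why the GPU becomes the limit.
void gpu_draw(const DrawCall& dc) { (void)dc; /* expensive pixel work */ }

int main()
{
    std::vector<DrawCall> calls = cull_visible(10000); // CPU: decide what to draw
    for (const DrawCall& dc : calls)
        gpu_draw(dc);                                  // GPU: draw it
    std::printf("submitted %zu draw calls\n", calls.size());
}
[/CODE]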

It does support multicore; as I've been saying for months, it doesn't NEED multicore, it SUPPORTS it. It's a marketing buzzword that people fall for: people buy dual/multicores, they think games will be better if they use them, so developers support them and go around easing everyone's minds by saying they support them.

Same with Valve's multicore support: the games will not NEED it, they will support it, and it won't make a lick of difference.
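
A minimal sketch of what "supports but doesn't need" multicore can look like, again with invented names: the same per-frame CPU work is just split across however many cores exist. On one core it still runs; on four it merely finishes the CPU part sooner, which buys nothing if the frame is GPU-bound anyway.

[CODE]
#include <cstdio>
#include <thread>
#include <vector>

// Hypothetical illustration of optional multicore support.
// All names are invented for illustration.

void update_entities(int begin, int end)
{
    for (int i = begin; i < end; ++i) { /* per-entity game logic */ }
}

int main()
{
    const int entity_count = 100000;
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 1;  // unknown? fall back to a single core

    // Split the same workload across whatever cores are available.
    std::vector<std::thread> workers;
    const int chunk = entity_count / static_cast<int>(cores);
    for (unsigned c = 0; c < cores; ++c) {
        const int begin = static_cast<int>(c) * chunk;
        const int end = (c + 1 == cores) ? entity_count : begin + chunk;
        workers.emplace_back(update_entities, begin, end);
    }
    for (std::thread& t : workers)
        t.join();

    std::printf("updated %d entities across %u core(s)\n", entity_count, cores);
}
[/CODE]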

Almost every big release of the last 10 years, and of the next 10, will be GPU limited. The only small exception is RTS games, because they are essentially massive databases with a 3D graphical overview of what's going on.

Alan Wake supports quad cores and will most likely use a similar amount of juice; most games from now on will support quad cores and will not use that much power on each core.

So my Opty 185 could last for years then? At 2.78GHz, Crysis uses between 65-80% of both cores.
 