
Which card will crack Crysis?

 
Uhm??

An 8800GT SLI setup should do it. I'm getting 30 fps average, all High with 4xAA & 16xAF @ 1280x1024, with a dual-core CPU that is (atm) @ 2.68 GHz in Windows XP, so the GPU is heavily bottlenecked.
I see no reason an 8800GT SLI system with a quad @ 3.6 GHz can't run 50 fps.

HAHAHAHAHAHAHAHAHAHA
Such lies and rubbish I've not heard in a long time.

Crysis's CPU requirements can easily be met by a Core 2 Duo at about 2 GHz. Crysis is NOT a CPU-heavy game!
Crysis is completely graphics-bottlenecked on every card on the market.
 
All Crysis needs is heavy patching; give Crytek time and I'm sure they'll deliver. Look at Far Cry: it used to test the systems of the day, getting not-so-great frame rates, but performance was constantly improved as time went on until it was virtually perfect. Have faith, fellow Crysis gamers :)
 

What are you on about?

HAHAHAHAHAHAHAHAHAHA
Such lies and rubbish I've not heard in a long time.

Crysis's CPU requirements can easily be met by a Core 2 Duo at about 2 GHz. Crysis is NOT a CPU-heavy game!
Crysis is completely graphics-bottlenecked on every card on the market.

I'm not saying Crysis needs such a CPU; I'm saying the GPU is heavily bottlenecked. I've tested it in 3DMark06 myself, and the SM2 and SM3 scores are 20% lower @ 2.13 GHz on a C2D than @ 2.88 GHz.

Anyhow, I've benched it before and I was sure it was 30 fps, so stop laughing, as you're ignorant as hell:

(the benchmark I did before).
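As a rough sanity check of the scaling figures above (a sketch using only the numbers quoted in the post; it assumes a fully CPU-bound score would scale linearly with clock speed): a 2.13 → 2.88 GHz bump is roughly a 35% clock increase, so a 20% score gap suggests the test is only partially CPU-bound:

```python
low_clock, high_clock = 2.13, 2.88   # GHz, the two clocks quoted in the post
score_ratio = 0.80                   # "20% lower" SM2/SM3 score at the low clock

# If the benchmark were fully CPU-bound, the score ratio would match the clock ratio.
clock_ratio = low_clock / high_clock
print(f"clock ratio: {clock_ratio:.2f}")   # ~0.74
print(f"score ratio: {score_ratio:.2f}")   # 0.80

# The score drops less than the clock does, so the run is only partially CPU-bound.
```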
 
That seems about right for a G92 GTS. What's your card clocked at? I don't get much more than that with a C2D at 3.8 GHz and the GTS running 750 core, 2000 shaders, 2000 memory :confused:
 
Stock card clocks; the benchmark was done with the CPU @ 2.88 GHz, or was it 2.6x like it is now, I can't recall.
Just close all the rubbish for benching: only XP's essential processes and Crysis...
 
Stock card clocks; the benchmark was done with the CPU @ 2.88 GHz, or was it 2.6x like it is now, I can't recall.
Just close all the rubbish for benching: only XP's essential processes and Crysis...

Ah right, you're using XP; I'm using Vista x64. Might install XP on my other drive and see if it helps :)
 
Rebenched to be sure it was these settings, and got samish results (1 fps less); I might have been running a higher CPU speed on the first benchmark, but I retested to be sure and this is definitely the performance with my specs & settings:

The GPU benchmark means nothing; in-game performance will be substantially lower.

Probably less than 15 fps in most places.


Really?
I've started on the third level now and still no lag or poor fps. Perhaps on some ice level I hear people going on about, but I haven't reached it yet, and so far performance has been great. I didn't even bench for fps at first when I got the game, but comparing got me interested later. I'm pretty sure I get those fps, as I've never had any lag in-game yet.
 
I played the game through happily on my Opteron 165 @ 2.8 GHz and X1900XT at 1280x1024 with all settings High except for shader & shadow set to Medium, with some tweaking of the config files. A bit choppy in places, but not really detrimentally so - still awesome.

I am replaying it on my new rig (in delta mode) in DX10 at 1280x1024, 2xFSAA, all settings Very High except for shader & shadow set to High... to me this is a similar playing experience, and I for one am enjoying it...

The GPU timedemo benchmark in 32 bit Crysis gets me:
==============================================================
Play Time: 60.40s, Average FPS: 33.11
Min FPS: 23.52 at frame 138, Max FPS: 42.59 at frame 72
Average Tri/Sec: -29889176, Tri/Frame: -902717
Recorded/Played Tris ratio: -1.02
TimeDemo Play Ended, (4 Runs Performed)
==============================================================

The 64 Bit timedemo gets me:
==============================================================

Play Time: 72.97s, Average FPS: 27.41
Min FPS: 18.66 at frame 158, Max FPS: 41.46 at frame 74
Average Tri/Sec: -24757792, Tri/Frame: -903349
Recorded/Played Tris ratio: -1.01
TimeDemo Play Ended, (4 Runs Performed)
==============================================================

Q6600 @ 3.7 GHz (not that this makes any difference :)
GTX @ 645/1030
Drivers set to performance (set to application controlled loses 2 fps but I can't see any difference!)
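If anyone wants to compare a pile of these timedemo logs side by side, here's a minimal sketch (hypothetical helper name; the line format is taken from the output pasted above) that pulls the fps figures out of a log:

```python
import re

def parse_timedemo(log_text):
    """Extract average/min/max fps from Crysis timedemo output (format as pasted above)."""
    stats = {}
    for key, pattern in [("avg_fps", r"Average FPS:\s*([\d.]+)"),
                         ("min_fps", r"Min FPS:\s*([\d.]+)"),
                         ("max_fps", r"Max FPS:\s*([\d.]+)")]:
        m = re.search(pattern, log_text)
        if m:
            stats[key] = float(m.group(1))
    return stats

log = """Play Time: 60.40s, Average FPS: 33.11
Min FPS: 23.52 at frame 138, Max FPS: 42.59 at frame 72"""
print(parse_timedemo(log))  # {'avg_fps': 33.11, 'min_fps': 23.52, 'max_fps': 42.59}
```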
 
We need to wait until the drivers are optimised.

We've been saying that for over two years now!

SLI and CrossFire are marketing junk in all but a few games where the difference is noticeable. I use 2x 3870XT, and I think I've played around 3 out of about 10 games with definite and worthwhile speed increases; the rest either didn't work correctly or made no difference.
 
Sorry, but 1280x1024 is a LOW res anyway. We need to be able to get 60+ fps average and never drop below 35 to be properly playable, and that should be @ 1680x1050 and 1920x1200.
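That playability bar (60+ fps average, never dropping below 35) is easy to check against any fps capture; a minimal sketch with made-up sample readings:

```python
def playable(fps_samples, min_avg=60.0, min_floor=35.0):
    """True if the run averages at least min_avg fps and never dips below min_floor."""
    avg = sum(fps_samples) / len(fps_samples)
    return avg >= min_avg and min(fps_samples) >= min_floor

sample = [72, 65, 58, 41, 80, 63]  # hypothetical per-second fps readings
print(playable(sample))  # avg ~63.2, min 41 -> True
```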
 
Can you imagine the problems with Tri-SLI and quad 3870X2 crossfire?? Makes me scared just thinking about it.

But seriously, the quicker GPU companies move away from dual-card setups, the better. The gain in performance isn't enough to merit the last two weeks I have spent trying to get it to work. And it still doesn't: I get flickering in Crysis right now with my dual 3870s.

To me it seems like an enthusiast experiment gone wrong. Nvidia showcased a crazy two-graphics-card beast and didn't believe there was a market, but when they found out they could sell it, they did. And since then drivers have been lagging behind, like ShinOBIWAN said, for about two years.
 
1680x1050, all High with about 2xAA, maybe 4xAA, with a 3870X2 or above would, I'd guess, be okay-ish.

Anyone claimed they are using 16xAA yet?
 
Uhm??

An 8800GT SLI setup should do it. I'm getting 30 fps average, all High with 4xAA & 16xAF @ 1280x1024, with a dual-core CPU that is (atm) @ 2.68 GHz in Windows XP, so the GPU is heavily bottlenecked.
I see no reason an 8800GT SLI system with a quad @ 3.6 GHz can't run 50 fps.

I think the thread starter meant high resolutions. 1280x1024 is ancient.
 