
Crysis Kills my 8800GTX

Interesting. The same settings on the CPU bench give me over 31-point-something average, which is not bad at all considering the amount of debris that's flying about; it's still over 30 frames. I also played the demo again to see how it would cope with those settings, and it coped fine. Even when I got to that camp and all hell broke loose it was still perfect. :cool:
 
One core is doing most of the work, and I would expect it to take that core to 100%. It's said to be even more pronounced on quads, like the shots above; I haven't checked on my dual yet.

If the game really is as CPU-dependent as Crytek's CEO Cevat Yerli says, then that's why the FPS are bad; they need to get it hammering all 2 or 4 cores to near 100%, like games normally do with single/dual cores.
One core is loaded more than the others; overall it's not overly CPU-dependent as far as I can see. Looks like DX9 uses more CPU power than DX10 (as you'd expect). ([email protected])

[CPU-usage screenshots, one per configuration: DX9 32BIT, DX9 64BIT, DX10 32BIT, DX10 64BIT]
 


I've maxed out both cores on an E6400 @ 3.4GHz.

I only have an X1950 Pro, so I had to reduce the graphics and set physics to maximum on the CPU benchmark, but it definitely seemed to be CPU-limited in some cases.

Don't use the Windows utility to monitor the cores; try RMClock or something third-party.
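If you'd rather script it than watch a monitoring tool, here's a minimal sketch using the third-party psutil package (my own suggestion, not from the thread) to log per-core load and flag the one-core-pinned pattern described above. `is_unbalanced()` and its 40-point threshold are hypothetical choices, not anything the game or Crytek provide.

```python
# Sketch only: flag a lopsided per-core load, i.e. one core pinned while the
# rest sit near idle. The threshold of 40 percentage points is arbitrary.

def is_unbalanced(loads, threshold=40.0):
    """True if the busiest core exceeds the average of the others by `threshold` points."""
    busiest = max(loads)
    others = sorted(loads)[:-1]  # drop the single busiest sample
    avg_others = sum(others) / len(others) if others else 0.0
    return busiest - avg_others >= threshold

def watch(samples=10):
    """Print per-core load once a second; needs the third-party psutil package."""
    import psutil  # deferred import so is_unbalanced() works without psutil installed
    for _ in range(samples):
        # percpu=True returns one utilisation percentage per logical core
        loads = psutil.cpu_percent(interval=1.0, percpu=True)
        print(loads, "<- one core pinned" if is_unbalanced(loads) else "")
```

Call `watch()` while the benchmark runs: a quad with one core at ~95% and the rest idling trips the flag, while an evenly loaded CPU stays under the threshold.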
 
I've now got the Object Detail on High and dropped my res down to 1280x960, and it runs a lot better and looks just as good as 1280x1024. :cool:

Just ran the bench again to see. With everything on High except Shadows and Post Processing, which are both still on Medium, at a res of 1280x960, I'm now averaging 35-point-something. So with the drop in res from 1280x1024 to 1280x960, and with the Object Detail upped back to High, I'm only losing a fraction of a frame. :)
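For anyone weighing the same trade-off, a back-of-envelope sketch (my own arithmetic, not from the thread) of what that resolution drop saves, under the simplifying assumption that frame time scales with pixels rendered (real games are only partly fill-rate limited):

```python
# Back-of-envelope only: assumes fps is inversely proportional to pixel count,
# which ignores CPU limits, shadows, post-processing, etc.
def pixel_count(w, h):
    return w * h

def projected_fps(fps_before, res_before, res_after):
    """Naive projection: fps scales with the inverse of pixels rendered."""
    return fps_before * pixel_count(*res_before) / pixel_count(*res_after)

saved = 1 - pixel_count(1280, 960) / pixel_count(1280, 1024)
print(f"Pixels saved: {saved:.1%}")  # about 6% fewer pixels
print(f"31 fps at 1280x1024 -> ~{projected_fps(31, (1280, 1024), (1280, 960)):.1f} fps at 1280x960")
```

So 1280x960 renders about 6% fewer pixels than 1280x1024, which is roughly consistent with the small fps gain reported above once the Object Detail increase eats part of it.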
 
My biggest beef with this is that while it's playable if you switch most of the settings to Medium, it then looks worse than a number of games/demos that have been released recently.

That makes all the graphical prettiness fairly irrelevant for the time being. What was the point in advertising all the graphical wonders if we can't use them?

I should point out that the demo suffered from a number of serious bugs on my rig, ranging from crashes and refusals to launch to visual bugs.

I only played for a short while, and I thought it felt very promising in terms of gameplay, so I'll forgive the graphics. What I won't forgive is if EA release yet another poorly programmed, rushed, unoptimised, broken game onto the market. I'm getting a bit sick of it, and since this game is highly sought after, I think the community won't forgive it either.
 
You don't have to switch nearly all of the settings to Medium though, surely. My rig's all stock, on XP, and with everything maxed apart from Shadows and Post Processing, which are both on Medium (the 2x settings), it runs great and looks amazing; certainly a lot better than anything I've seen.
 
I have a friend running this on an 8600GTS. He runs all High settings with, I think, Shadows on Medium and 2xAA; perfectly playable and smooth with his resolution dropped to 1280x1024, I think it is.
 
To people comparing frame rates:

The difference between XP and Vista is *MASSIVE*

On Vista I run everything on Medium at 1360x768, which gets me 22fps in the GPU demo.
In XP I run everything on High except Shadows (Medium), same res, with 2xAA, which gets me 30fps in the same benchmark.

Both running latest beta nvidia drivers.

3.2ghz quad, 2gb, GTS320.

Now, DX10 does look better and a lot more immersive, but only when comparing like for like, and at High settings under DX10 the game is unplayable for me. With Medium settings I'd say it still looks nicer than XP on High, but only marginally, and the frame-rate drop isn't worth it.

I hope we see better drivers and more optimised code, either at release or patched in afterwards, because for a game which is supposed to be showing off Vista and DX10, it runs very poorly!
 
I dunno why everyone is moaning so much.

At 1680x1050, all on Very High apart from post-processing, I get 30-35fps, sometimes dipping to 26-28. At 1280x1024, all on Very High with 4xAA, I get 40-50.

They said about 8 months ago that no current graphics card would be able to run this game maxed out for a while; only the second generation will be able to get close. The game also has higher-detail textures etc. built in for future cards a year down the line, so you should be happy you can run it on High, tbh.
 
On a Windows Vista 64 system with a QX6700 CPU, an EVGA 8800GTX and 4GB RAM (everything at stock), at 1280x1024 with settings on Very High I'm getting:

Code:
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
    Play Time: 125.12s, Average FPS: 15.98
    Min FPS: 12.37 at frame 1956, Max FPS: 19.49 at frame 874
    Average Tri/Sec: -11176223, Tri/Frame: -699192
    Recorded/Played Tris ratio: -1.31
!TimeDemo Run 1 Finished.
    Play Time: 122.17s, Average FPS: 16.37
    Min FPS: 12.37 at frame 1956, Max FPS: 19.97 at frame 871
    Average Tri/Sec: -11321110, Tri/Frame: -691555
    Recorded/Played Tris ratio: -1.33
!TimeDemo Run 2 Finished.
    Play Time: 122.14s, Average FPS: 16.37
    Min FPS: 12.37 at frame 1956, Max FPS: 19.97 at frame 871
    Average Tri/Sec: -11309894, Tri/Frame: -690708
    Recorded/Played Tris ratio: -1.33
!TimeDemo Run 3 Finished.
    Play Time: 122.09s, Average FPS: 16.38
    Min FPS: 12.37 at frame 1956, Max FPS: 19.97 at frame 871
    Average Tri/Sec: -11315314, Tri/Frame: -690725
    Recorded/Played Tris ratio: -1.33
TimeDemo Play Ended, (4 Runs Performed)

When I put shadows down to low I get

Code:
TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
!TimeDemo Run 0 Finished.
    Play Time: 86.16s, Average FPS: 23.21
    Min FPS: 18.70 at frame 146, Max FPS: 27.94 at frame 1754
    Average Tri/Sec: 21197770, Tri/Frame: 913198
    Recorded/Played Tris ratio: 1.00
!TimeDemo Run 1 Finished.
    Play Time: 83.15s, Average FPS: 24.05
    Min FPS: 18.70 at frame 146, Max FPS: 27.94 at frame 1754
    Average Tri/Sec: 22361324, Tri/Frame: 929626
    Recorded/Played Tris ratio: 0.99
!TimeDemo Run 2 Finished.
    Play Time: 83.07s, Average FPS: 24.08
    Min FPS: 18.70 at frame 146, Max FPS: 28.47 at frame 870
    Average Tri/Sec: 22401618, Tri/Frame: 930489
    Recorded/Played Tris ratio: 0.99
!TimeDemo Run 3 Finished.
    Play Time: 82.88s, Average FPS: 24.13
    Min FPS: 18.70 at frame 146, Max FPS: 28.47 at frame 870
    Average Tri/Sec: 22451002, Tri/Frame: 930402
    Recorded/Played Tris ratio: 0.99
TimeDemo Play Ended, (4 Runs Performed)
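To compare runs like the two logs above without eyeballing them, a small parser (my own sketch, not a Crytek tool; the log format is taken from the output pasted above) can pull the play time, average FPS and minimum FPS out of the TimeDemo text:

```python
# Sketch: extract per-run stats from CryEngine TimeDemo output as pasted above.
# The regexes are keyed to the exact line shapes in those logs.
import re

RUN_RE = re.compile(r"Play Time: ([\d.]+)s, Average FPS: ([\d.]+)")
MIN_RE = re.compile(r"Min FPS: ([\d.]+) at frame \d+")

def parse_timedemo(text):
    """Return a list of (play_time_s, avg_fps, min_fps) tuples, one per run."""
    avgs = [(float(t), float(f)) for t, f in RUN_RE.findall(text)]
    mins = [float(m) for m in MIN_RE.findall(text)]
    return [(t, avg, mn) for (t, avg), mn in zip(avgs, mins)]

sample = """\
!TimeDemo Run 0 Finished.
    Play Time: 125.12s, Average FPS: 15.98
    Min FPS: 12.37 at frame 1956, Max FPS: 19.49 at frame 874
!TimeDemo Run 1 Finished.
    Play Time: 122.17s, Average FPS: 16.37
    Min FPS: 12.37 at frame 1956, Max FPS: 19.97 at frame 871
"""

for i, (t, avg, mn) in enumerate(parse_timedemo(sample)):
    print(f"Run {i}: {avg} avg fps (min {mn}) over {t}s")
```

Piping the Very High and low-shadows logs through this makes the shadows cost obvious: roughly 16 average fps versus roughly 24.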
 
Surely you can on X38 :confused:
What makes you think that?

The hardware interconnects are all there for it to work in theory, but it doesn't, because Nvidia are protective of their SLI licence. AFAIK you need an extra discrete chip on the board, from Nvidia, which allows SLI to function properly. It's why you can get SLI on Intel laptops now. :)
 
What makes you think that?

The hardware interconnects are all there for it to work in theory, but it doesn't, because Nvidia are protective of their SLI licence. AFAIK you need an extra discrete chip on the board, from Nvidia, which allows SLI to function properly. It's why you can get SLI on Intel laptops now. :)

Huh, I thought they were for SLI :confused: Ohhh, I see, they're for Crossfire. I would have thought by now there would be mobos that support SLI and Crossfire on the same board? Apparently it's possible with some hacked drivers or something, but good performance is not guaranteed?
 
Nvidia did it on purpose, otherwise their own boards wouldn't sell, plus they're coming out with updated 2x16 PCI Express nForce boards soon.

SLI/Crossfire is good, but I'd still rather wait a few months down the road and get a single-GPU solution, or a 9800GTX. ;)

I'm still hoping the final Crysis gets full quad-core support and is a lot faster; either that, or Nvidia release Crysis-stomping drivers...
 
Maybe someone could help me with a few things. How much difference would another 8800GTX in SLI make to my scores? I was also thinking of ditching this 8800GTX and going for a 9800GTX, but seeing as my mobo doesn't have PCI-E 2.0, I'm not going to change that as well. Is the 9800 properly backwards compatible with older mobos? I'm guessing the new slot will provide extra features, so would it really make a big difference using an older mobo? Also, would the scores for the new 9800 be similar to two 8800GTXs in SLI?

Cost-wise, both would probably work out about the same: say I sell the 8800GTX for £250, I'd probably have to cough up another £275 for the 9800. I only bought this system last December, so in an ideal world I'd rather not upgrade it, but seeing as I will eventually get a 30" monitor, I'm guessing I will need the extra power. Though I can't quite see this game working too well at 2560x1600, lol.
 