
Crysis SP Demo GPU Benchmark Thread

I averaged 36.57 all high.

Just seen your shot of the NV CP there, Will. I have Negative LOD Bias on Clamp, and I use High Quality texture filtering, not Quality.

Gonna try it with Allow and just Quality, see if there's much diff. :D

EDIT: 36.96, so nothing really in it; I'll stick to Clamp and High Quality. :p
 


I'm one of those people that still has an ageing GTS 320MB :o Settings all on high at the same resolution as everyone else. Reasonable performance IMO, but not great considering the GT is cheaper.
 
All settings high @ 1280x1024
[benchmark screenshots]

Using the 169.04s. Does anyone know why GPU-Z is saying SLI is disabled? It's enabled in the Nvidia control panel.
 
Mmmm, looks like I can gain 10 fps by going Core 2 Duo. I'm running an overclocked 8800GTX on an AMD X2 4600 but get the same average as Scooby, even with only a few settings on high.
 
Right, I'm not a wind-up merchant, but why is my bog-standard setup (E6600 @ 3GHz, 8800GTS 640MB, 2GB GeIL RAM, blah blah) getting...

[benchmark screenshot]


@ 1280x1024 all high

It seems far higher than most people's (even ones that have better PCs than me), or am I reading it wrong?

Not on the latest drivers either, just whatever was latest when my card arrived :) (162.18)


Two logical explanations: I'm doing something wrong while benching (unlikely), or the older Nvidia drivers are better?

Third possibility: we are all doing the Benchmark_GPU, yeah? :) and not the CPU one?
 
[email protected]
8800GTX@ 648/999/1620
2GB RAM
WinXP x86.

1280*1024
All High / playable
min 29.56/ max 56.18/ avg 46.69

1680*1050
All High / unplayable
min 17.70/ max 41.34/ avg 36.05

All high, except for medium shaders. / playable
min 31.97/ max 66.96/ avg 51.38

1920*1200
All high, except for medium shaders. / barely playable
min 24.51/ max 52.89/ avg 43.68
 
Yeah, seems like about 10 fps less on average. Can the Vista users rebench in DX9 mode only, to see if there's any difference?
I'm inclined to agree. Do we think the FPS hit is Vista per se, or DX10 vs DX9? The benchmark seems to run in DX10 mode (escaping out midway through and checking the menu to see if "Very High" settings were available seems to verify it). I can't find a way of forcing DX9 mode for the benchmark, but if anyone can tell me how, I'll give it a go.


Vista x64, 8800GTX, 1920x1200, all settings at high apart from Objects, which is at "Very High" (the settings I played the demo on).
Using the test file from the x64 bin
Play Time: 91.23s, Average FPS: 21.92
Min FPS: 13.76 at frame 155, Max FPS: 32.13 at frame 61

Using the test file from the x86 bin
Play Time: 75.58s, Average FPS: 26.46
Min FPS: 19.52 at frame 1952, Max FPS: 32.48 at frame 80

I seem to remember reading somewhere that x64 mode adds additional long range detail (mountains etc) which could explain why x86 is quicker. /shrug
 
In the main game, yes, but the benchmark runs from a .bat file with no options on right-click (other than the normal Run as administrator, etc.) :(
 
Right, I'm not a wind-up merchant, but why is my bog-standard setup (E6600 @ 3GHz, 8800GTS 640MB) getting results far higher than most people's? ... we are all doing the Benchmark_GPU, yeah? and not the CPU one?

So you're using 162.18? Can you try the 169.04s to see if you lose performance?

Are you sure every setting is set to high and at 1280x1024? You're getting Ultra performance there.
 
Edit the batch file.

@crysis.exe -DEVMODE +map island +exec benchmark_gpu -DX9
Right, interesting results..

So as before Vista x64, running the x86 bin version of the benchmark, 1920x1200 on a GTX

forcing DX9 mode using the above edit to the benchmark:
Play Time: 79.50s, Average FPS: 25.16
Min FPS: 17.67 at frame 1956, Max FPS: 31.11 at frame 982

Using unedited .bat file and so running in DX10 mode
Play Time: 75.87s, Average FPS: 26.36
Min FPS: 19.44 at frame 1965, Max FPS: 32.38 at frame 72

Now in both cases I ran everything at "High", so I'm assuming there are no additional DX10 effects being added. I escaped out of both versions in a separate run to verify the mode: forcing DX9 meant the "Very High" graphics options were greyed out, while in DX10 I could hit escape mid-benchmark and change options to "Very High". In this case DX10 running DX9-level graphics seems marginally quicker in this config, although it'd be invisible to the eye and well within the margin of error.
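To put a number on "well within the margin of error", here's a quick sanity check using nothing but the two averages posted above (just arithmetic, no extra assumptions):

```python
# Averages from the two 1920x1200 runs above (Vista x64, x86 bin, all high)
dx9_avg = 25.16   # forced DX9 via the edited .bat
dx10_avg = 26.36  # unedited .bat, DX10 mode

diff = dx10_avg - dx9_avg
pct = diff / dx10_avg * 100
print(f"DX10 run is {diff:.2f} fps faster ({pct:.1f}% of the DX10 average)")
```

That's about 1.2 fps, under 5%, which is small enough to be run-to-run noise.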

I guess any hit in performance is more to do with it running on Vista than the DX mode specifically. I suppose it could just be down to variations in people's machines as well. Anyone else with a GTX running at the above res and detail on XP able to give us a rough comparison? (I know the PC spec and settings will vary, but an E6600 or better with a GTX is similar enough to mine to let us see if there's a difference in the order of 10 fps or more.)

Makes for an interesting way to check out exactly what the hit is for Vista on a game that really stresses the hardware.
 
1280 all on High
Average of 8.3 fps :D

1024 all on Med

Average of 25!:)

E6420 @ 3.4
4GB of RAM
8600 GT! ;) (it plays WoW just fine!)
 