Crysis capped @ 30FPS?

Sorry for dragging up what's possibly an old issue, but it's driving me nuts.

I currently have the following setup:
E8400 @ 3.9GHz
3GB RAM
ATI 4890 gfx

I'm running Crysis Warhead in DX9 @ 1920x1080 with Enthusiast across most settings.

When I run the benchmark tool I average around 40FPS, yet in the game itself it never goes above 30FPS.
I've tried disabling vsync in the game with no joy.
I've tried enabling triple buffering in the drivers with no joy.
I've also tried "d3d9_triplebuffering 1" in the game with no joy :(

Can anyone point me in the right direction, please?
 
Start the game and see if there is any tearing. If there isn't, then somehow you have vsync on.
 
Maybe the card can't render >30fps in-game, as the game itself is more stressful than the benchmark?
Have you tried Gamer settings or a lower resolution?
 
Vsync won't limit your framerate to 30fps.

Yes it will...

Assuming a monitor refresh of 60Hz, vsync will display either:

60fps
30fps
20fps
15fps
12fps

...etc., i.e. integer divisions of the maximum framerate. Double-buffered vsync will always behave this way. If the computer cannot render 60fps for a given scene, the output framerate will drop to 30fps (until it can't render 30fps, at which point the framerate will drop to 20fps, and so on).
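To put it another way, here's a rough Python sketch of that rounding-up behaviour. It's a simplified model of classic double-buffered vsync, where a frame that misses a refresh has to wait for the next one; the vsynced_fps helper is just made up for illustration.

import math

REFRESH_HZ = 60.0
REFRESH_MS = 1000.0 / REFRESH_HZ   # ~16.67ms per refresh

def vsynced_fps(raw_frametime_ms):
    # Under double-buffered vsync a frame can only be shown on a refresh,
    # so its frametime is rounded up to a whole number of refresh intervals.
    intervals = max(1, math.ceil(raw_frametime_ms / REFRESH_MS))
    return REFRESH_HZ / intervals

# The possible outputs are just integer divisions of the refresh rate:
print([60.0 / n for n in range(1, 6)])   # [60.0, 30.0, 20.0, 15.0, 12.0]

# e.g. a scene the card could render at ~40fps (25ms/frame) is shown at:
print(vsynced_fps(25.0))                 # 30.0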
 
...etc., i.e. integer divisions of the maximum framerate. Double-buffered vsync will always behave this way. If the computer cannot render 60fps for a given scene, the output framerate will drop to 30fps (until it can't render 30fps, at which point the framerate will drop to 20fps, and so on).
True, and I think by enabling triple buffering it will drop to 45fps instead of 30fps.
 
Never knew that, just assumed it added a frame limit equal to your refresh rate to eliminate screen tearing. Didn't realise there was more to it than that :)

Yeah, that's why it's not really ideal with Crysis: if you drop to 59fps it'll cut you down to 30fps, or from 29fps down to 20fps. You can't really get a high enough framerate to get tearing anyway.
 
Turn vsync off, install RivaTuner, select triple buffering: no tearing and improved framerates (according to this month's PC Format, that is!).

I think I'll save Crysis for when I purchase a DirectX 15 card... that way it may run suitably for decent gameplay!
 
Yeah, that's why it's not really ideal with Crysis: if you drop to 59fps it'll cut you down to 30fps, or from 29fps down to 20fps.

Sorry, this is total bull. At 60Hz, if one single frame out of 60 takes twice as long to render, but every other frame renders in approximately sub-16ms, then you'll get 59fps - you'll see the effect as a one-frame lag. It will not reduce you to 30fps.

Don't know where this has all come from. Just go and benchmark Crysis with Vsync on, and you'll see that you're never just locked at 20/30/60fps.
 
Sorry, this is total bull. At 60Hz, if one single frame out of 60 takes twice as long to render, but every other frame renders in approximately sub-16ms, then you'll get 59fps - you'll see the effect as a one-frame lag. It will not reduce you to 30fps.

Don't know where this has all come from. Just go and benchmark Crysis with Vsync on, and you'll see that you're never just locked at 20/30/60fps.

The engine can render at whatever rate it likes, it makes no difference - the output is still half the refresh rate if the engine can't output the same fps as the refresh rate. That's how double buffering works: the frame is drawn into the back buffer, and the swap to the front buffer has to wait for the next vertical refresh, so any frame that misses a refresh is held over to the one after. If the engine is rendering faster than the refresh rate it isn't held up; if it's rendering slower, the fps is halved. End of story.

Triple buffering allows quarter steps in output framerate - i.e. 45fps for a 60Hz sync - which is better, but at the cost of some speed, usually due to using two back buffers and obviously requiring more bandwidth.
 
Sorry, this is total bull. At 60Hz, if one single frame out of 60 takes twice as long to render, but every other frame renders in approximately sub-16ms, then you'll get 59fps - you'll see the effect as a one-frame lag. It will not reduce you to 30fps.

Don't know where this has all come from. Just go and benchmark Crysis with Vsync on, and you'll see that you're never just locked at 20/30/60fps.

Actually it's not total bull.

Yes, if one single frame is under '60fps' (i.e. a frametime of 16.67ms or greater), then you'd likely see an average reported framerate of ~59fps. But if the game is consistently churning out framerates in the 30-59 range (i.e. each frame is taking 16.67-33.33ms to render), then you WILL get a reported framerate of 30fps with vsync and no triple buffering, because none of the frametimes are low enough to make it higher.

Let me demonstrate by means of an example. Game produces 10 frames with the following frametimes:

20ms
20ms
23ms
27ms
22ms
32ms
18ms
19ms
17ms
30ms

With vsync disabled, that means you've rendered 10 frames in 228ms, or in other words, you are running at an average of ~44fps.

With vsync enabled, all of those frametimes will round up to ~33.33ms (because they all exceed 16.67ms, but none exceed 33.33ms). So you are running at an average of 30fps.
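If anyone wants to check the arithmetic, here's a quick Python sketch of that example using the same simplified double-buffer model as above (the frametimes are just the ten listed):

import math

REFRESH_MS = 1000.0 / 60.0   # ~16.67ms per 60Hz refresh
frametimes = [20, 20, 23, 27, 22, 32, 18, 19, 17, 30]   # ms, as listed above

# Vsync off: average fps is simply frames rendered divided by time taken.
total_ms = sum(frametimes)                                        # 228ms
print(f"vsync off: {1000 * len(frametimes) / total_ms:.1f}fps")   # ~43.9fps

# Vsync on (double buffered): each frametime rounds up to a whole number
# of refresh intervals before the frame can be displayed.
synced = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in frametimes]
print(f"vsync on:  {1000 * len(synced) / sum(synced):.1f}fps")    # 30.0fps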
 
Transpires that the drivers had vsync forced on, so I was stuck at 30FPS.
Disabling vsync in CCC and in-game, whilst enabling triple buffering, has boosted my FPS by at least a factor of 2 - that's with 2xAA and all settings on Enthusiast.

Thanks for all the replies.
 
Actually it's not total bull.

[...]

With vsync enabled, all of those frametimes will round up to ~33.33ms (because they all exceed 16.67ms, but none exceed 33.33ms). So you are running at an average of 30fps.

Well put, and a good example :)

Vsync is a pain. Triple buffering is generally superior to double buffering, but it's poorly supported (particularly for multi-GPU setups). 120Hz monitors will help ease the pain somewhat by giving more options (120fps, 60fps, 40fps, 30fps, 24fps, etc.), but it's still not ideal.
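For what it's worth, those extra options fall straight out of the same integer-division behaviour - a quick sketch under the same simplified double-buffer model as earlier in the thread:

# Framerates available under double-buffered vsync are refresh_rate / n.
for refresh in (60, 120):
    print(refresh, "Hz:", [round(refresh / n, 1) for n in range(1, 6)])
# 60 Hz: [60.0, 30.0, 20.0, 15.0, 12.0]
# 120 Hz: [120.0, 60.0, 40.0, 30.0, 24.0]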
 
Well, the simple solution is to disable vsync altogether, assuming you can live with the tearing (if any); from a gaming perspective vsync is inherently flawed in that it introduces a delay which can be felt in fast-paced multiplayer FPS games (admittedly probably not so much in a game like Crysis).

Obviously the higher the refresh rate the better: on a CRT @ 160Hz+ the vsync 'input lag' won't be as bad as on a TFT @ 60Hz, assuming you've got a powerful enough system of course. Then again, I don't find tearing so much of a problem at high refresh rates anyway when vsync is off.
 