Crysis capped @ 30FPS?

Same here - I usually disable vsync altogether in FPS games. The input lag is annoying, but not nearly as much as the sudden jumps in framerate (which to my eyes are worse than a consistently low framerate). I've dabbled with triple buffering, but the input lag issue becomes a little worse. For the most part, tearing isn't all that noticeable.

I also remember that with CRTs the tearing generally becomes less noticeable as the refresh rate rises (I mean without vsync, of course). I'd be interested to see if the same is true of 120Hz TFTs as well.

Anyway I generally only use vsync with RTS games.
 
Actually it's not total bull.

Yes, if one single frame is under '60fps' (i.e. a frametime of 16.67ms or greater), then you'd likely see an average reported framerate of ~59fps. But if the game is consistently churning out framerates in the 30-59 range (i.e. each frame is taking 16.67-33.33ms to render), then you WILL get a reported framerate of 30fps with vsync and no triple buffering, because none of the frametimes are low enough to make it higher.

Yes, I agree with what you're saying there - the example you go on to list is consistently taking approx 16-32ms per frame, which with vsync will give you 30fps. What I was trying to say (and that you point out above) is that it is possible to get something other than a rounded-down framerate if only a few of the frames each second take over 16ms.

The other posts just make it seem that people are saying it's either 60fps, 30fps, 20fps (etc), or nothing at all.
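
To put some numbers on that rounding behaviour (a rough Python sketch, assuming a 60Hz refresh and made-up frametimes; the exact figures are just for illustration):

```python
# Sketch of double-buffered vsync at 60Hz: each frame's on-screen time is
# its render time rounded UP to the next multiple of the refresh interval.
import math

REFRESH_MS = 1000.0 / 60  # ~16.67ms per refresh at 60Hz

def vsynced_fps(render_times_ms):
    """Average reported framerate when every frame waits for the next vblank."""
    displayed = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_times_ms]
    return 1000.0 * len(displayed) / sum(displayed)

# Every frame in the 16.67-33.33ms band rounds up to 33.33ms, so the
# reported framerate pins at 30fps:
print(vsynced_fps([20, 25, 30, 28, 22]))  # -> 30.0
# One slow frame among fast ones gives ~59fps rather than a hard step:
print(vsynced_fps([10] * 59 + [20]))      # -> ~59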
 
Well..... while it's true in the literal sense that you won't always get 30 frames per second, what it does mean is that frametime WILL always round up to a multiple of ~16.67ms (or whatever the inverse of your refresh rate is). This means that the effective framerate measured on a per-frame basis will be 60'fps', 30'fps' etc with triple buffering off.

Part of the problem is that culturally gamers have got used to the idea of frames per second when in reality a second is a pretty long time to be measuring framerate over. If your system renders 48 frames in half a second and then takes another half a second to render another two frames, that gives you 50fps but the reality is that it won't be smooth at all because you've got a couple of frames taking like a quarter of a second to render each (equivalent of 4fps if scaled up). Extreme example that would be more subdued in reality, but you can see what I'm getting at.

Discrete framerates (or frametimes) measured over a smaller interval are definitely the way forward in terms of measuring performance.
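
As a quick illustration (Python, with frametimes made up to match the example above):

```python
# 48 quick frames in half a second, then two quarter-second frames:
# the per-second average says 50fps, the per-frame view says 4fps spikes.
frametimes_ms = [500.0 / 48] * 48 + [250.0, 250.0]

avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
worst_fps = 1000.0 / max(frametimes_ms)

print(avg_fps)    # -> 50.0 'fps' averaged over the whole second
print(worst_fps)  # -> 4.0 'fps' equivalent for the two slow frames
```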
 
Triple buffering allows 1/4 steps in output framerate - i.e. 45fps for a 60Hz sync - which is better, but at the cost of some speed, usually due to using two back buffers and obviously requiring more bandwidth.

No, triple buffering removes 'steps' completely.

Well..... while it's true in the literal sense that you won't always get 30 frames per second, what it does mean is that frametime WILL always round up to a multiple of ~16.67ms (or whatever the inverse of your refresh rate is). This means that the effective framerate measured on a per-frame basis will be 60'fps', 30'fps' etc with triple buffering off.

Incidentally, this will likely feel worse than a consistent lower FPS because the longer frame times will manifest as stuttering.
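
A toy simulation of the difference, if anyone's curious (Python, assuming an idealised 60Hz display and a constant render time; real drivers are messier than this):

```python
# Double buffering quantises output to 60/30/20fps...; triple buffering
# lets the GPU keep rendering, so output tracks the raw render rate.
import math

REFRESH_MS = 1000.0 / 60  # ~16.67ms vblank interval at 60Hz

def double_buffered_fps(render_ms):
    # The GPU stalls until the first vblank after the frame completes.
    return 1000.0 / (math.ceil(render_ms / REFRESH_MS) * REFRESH_MS)

def triple_buffered_fps(render_ms, duration_ms=10000.0):
    # The GPU renders flat out into two back buffers; each vblank scans
    # out the newest completed frame. Count vblanks that show a NEW frame.
    new_frames, last_shown = 0, 0
    vblank = REFRESH_MS
    while vblank <= duration_ms:
        newest = int(vblank // render_ms)  # frames finished by this vblank
        if newest > last_shown:
            new_frames += 1
            last_shown = newest
        vblank += REFRESH_MS
    return 1000.0 * new_frames / duration_ms

# A steady 22ms render time (~45fps raw):
print(double_buffered_fps(22))  # -> 30.0 (rounded up to two refreshes)
print(triple_buffered_fps(22))  # -> ~45, no step down to 30
```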
 
Turn vsync off, install RivaTuner, select triple buffer - no tearing, improved frames (according to this month's PC Format, that is!!!)

I think I'll save Crysis for when I purchase a DirectX 15 card... that way it may run suitably for decent gameplay!!

I thought triple buffering was only made for OpenGL, not DirectX, as T/Buff is only in the OpenGL section?
 
I thought triple buffering was only made for OpenGL, not DirectX, as T/Buff is only in the OpenGL section?

Use "D3D overrider" to enforce triple buffering in direct 3D.

For some reason, neither AMD nor Nvidia seem to consider triple buffering a high priority, and so they do not support it through Direct3D.
 
Same here - I usually disable vsync altogether in FPS games. The input lag is annoying, but not nearly as much as the sudden jumps in framerate (which to my eyes are worse than a consistently low framerate). I've dabbled with triple buffering, but the input lag issue becomes a little worse. For the most part, tearing isn't all that noticeable.

I also remember that with CRTs the tearing generally becomes less noticeable as the refresh rate rises (I mean without vsync, of course). I'd be interested to see if the same is true of 120Hz TFTs as well.

Anyway I generally only use vsync with RTS games.

Yes, but everything looks nasty without vsync on. Even with triple buffering on, to me the tearing is unbearable. Besides, for input lag I set the max FPS to 59.9 and it smooths all mouse actions out.

Each to their own I guess, but every time I turn vsync off to bench I can't wait to turn it back on because it's just too nasty to look at and enjoy.
 
Use "D3D overrider" to enforce triple buffering in direct 3D.

For some reason, neither AMD nor Nvidia seem to consider triple buffering a high priority, and so they do not support it through Direct3D.

Is this the right way? As I still get tearing even though it's all on.
Sorry about the big white patch, not sure how that happened :p


[screenshot]
 
If vsync really halves the FPS, why do Fraps and Xfire report a higher FPS than 30 (or 45, as I have triple buffering forced in the Nvidia panel for all games) but not 60? I regularly get FPS like 50-60 or 30-40; it's never really stuck at 30 unless there's some rubbish frame limiter in place, like the stock vsync in Dead Space or the pre-GTA IV GTA games' frame limiter.
 
Turning vsync on is fine, there's nothing wrong with it. I have it all turned on and I get no trouble whatsoever. I would say it's your GPU - ATI cards do not work as well with Crysis as Nvidia ones do. You use a budget GPU, you get budget-type performance, especially on Enthusiast settings.
 
Yes, but everything looks nasty without vsync on. Even with triple buffering on, to me the tearing is unbearable. Besides, for input lag I set the max FPS to 59.9 and it smooths all mouse actions out.

Each to their own I guess, but every time I turn vsync off to bench I can't wait to turn it back on because it's just too nasty to look at and enjoy.

It depends strongly on the monitor you have, as well as individual sensitivity. On my current screen (Hazro HZ30W), tearing is hardly noticeable to me (except in areas with strobe lighting).


If vsync really halves the FPS, why do Fraps and Xfire report a higher FPS than 30 (or 45, as I have triple buffering forced in the Nvidia panel for all games) but not 60? I regularly get FPS like 50-60 or 30-40; it's never really stuck at 30 unless there's some rubbish frame limiter in place, like the stock vsync in Dead Space or the pre-GTA IV GTA games' frame limiter.

Triple buffer does not "stick" on a particular framerate - you can have any framerate within the allowed range of your monitor (usually 0 - 60). The downsides of triple buffer are increased input lag, and a slight decrease in available video memory (which is negligible compared to the amount of memory on modern GPUs really). This is a good deal compared to regular "double buffer" vsync which significantly reduces the output framerate (as discussed above).
 
It depends strongly on the monitor you have, as well as individual sensitivity. On my current screen (Hazro HZ30W), tearing is hardly noticeable to me (except in areas with strobe lighting).

Triple buffer does not "stick" on a particular framerate - you can have any framerate within the allowed range of your monitor.

Well I can't speak for a thousand-pound screen, but for 99% of us out there who do not spend that kind of money on a screen, it's downright pig ugly.

Besides, my U2410 is no slouch, yet it's unbearable. To me the smoothness is so far gone without vsync that I just don't see the point in spending any decent money on good equipment if it's going to run that crap anyway.
 
Well I can't speak for a thousand-pound screen, but for 99% of us out there who do not spend that kind of money on a screen, it's downright pig ugly.

Besides, my U2410 is no slouch, yet it's unbearable. To me the smoothness is so far gone without vsync that I just don't see the point in spending any decent money on good equipment if it's going to run that crap anyway.

Heh, the screen only cost me £600 from Overclockers :p

But anyway, I also had a 24" Iiyama screen (TN panel, can't remember the exact model, but it was sub-£250 at the time). The tearing was slightly more noticeable than with the Hazro, but still not so bad as the framerate jumps that come with double-buffer vsync.

I guess it's largely down to the eyes of the individual. Still, I wish triple-buffer was better supported, especially with multi-GPU setups.
 
If vsync really halves the FPS, why do Fraps and Xfire report a higher FPS than 30 (or 45, as I have triple buffering forced in the Nvidia panel for all games) but not 60? I regularly get FPS like 50-60 or 30-40; it's never really stuck at 30 unless there's some rubbish frame limiter in place, like the stock vsync in Dead Space or the pre-GTA IV GTA games' frame limiter.

Ditto, I hate tearing :p
When I changed to Win 7, I was getting stutters in some games (but not all) with sync on.
And for me it was the hertz: I could select 60Hz, but when I went back to it, it would say 59Hz. Now I've sorted out the 60Hz so it sticks, and not 59Hz.
I get nice smooth gameplay now :). With sync off, the frame rate was high, but the tearing is pants for me anyway. Some people are lucky enough not to notice tearing that much, so I guess each to their own :p
 