
VSync and screen tear

Can someone explain this to me please?

I'm using an 8800GTS 640MB with an E6600 and 2GB RAM.
Games look OK but there's a lot of screen tear (to clarify: I'm assuming it's where one half of the screen refreshes milliseconds before the other?).

Does switching on VSync prevent this? Where's the setting, in-game or in the graphics driver settings?

Finally, what are the considerations, i.e. in terms of performance?

Cheers :)
 
Yes, v-sync stops it. It caps the frame rate at your monitor's refresh rate, so if you have say a refresh rate of 75, your frame rate won't go above 75. With v-sync off, if it does go above 75 it will tear like buggery :D The setting is in the Nvidia Control Panel, under Manage 3D Settings (where you do all your AA/AF etc.), right at the bottom, called Vertical Sync. So if you don't want the tearing, set it to Force On. I always have it on as I can't stand the tearing. :)
 
But in terms of performance: when the frame rate drops below your monitor's refresh rate it won't just drop a few fps, it will halve. E.g. you have vsync on, your monitor's refresh rate is 60Hz, and you're running about corridors at a nice 60fps. Suddenly you see a lot of action and, rather than dropping to say 55 or 50fps, it drops to 30fps, which to me is very noticeable and annoying.
 
Yes - the real killer from vsync is the framerate jumps.

Assume your monitor refresh rate is 60hz, like most TFTs.

If your card is capable of producing 61fps (or anything over 60) the output will be 60fps. But as soon as your card's maximum possible output drops to 59.9fps the framerate will drop to 30fps.

Now, 30fps isn't particularly good. But worse than this is the sudden drop, which really kills the illusion of movement.

You can in theory get around this performance drop by using 'triple buffer vsync', whereby a spare frame is stored in the texture memory to allow the frames to be output at close to the maximum possible framerate. BUT neither nvidia nor ATI have a properly functioning triple buffer for directX :mad: There are various 3rd party beta programs you can use to enable TB vsync, but they are generally buggy and only work sporadically.
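To put rough numbers on the halving and on what the spare buffer buys, here's a toy model (purely illustrative, with hypothetical render rates; real drivers and swap chains are far more involved):

```python
import math

def fps_double_buffered(render_fps: float, refresh_hz: float = 60.0) -> float:
    """With only front + back buffers, the GPU stalls until the flip,
    so each frame occupies a whole number of refresh intervals."""
    intervals = math.ceil(refresh_hz / render_fps - 1e-9)
    return refresh_hz / intervals

def fps_triple_buffered(render_fps: float, refresh_hz: float = 60.0) -> float:
    """With a spare buffer the GPU never stalls, so new frames arrive
    at the render rate, capped at one new frame per refresh."""
    return min(render_fps, refresh_hz)

for fps in (59.0, 45.0, 31.0):
    print(fps_double_buffered(fps), fps_triple_buffered(fps))
# 30.0 59.0
# 30.0 45.0
# 30.0 31.0
```

So anywhere between 31fps and 59fps of raw rendering power, plain double-buffered vsync displays exactly 30fps, while the triple-buffered version tracks the card's actual output — which is why the lack of working DirectX triple buffering is such a sore point.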

Lack of triple buffer vsync is the reason I'm still using a CRT.



Anyway, to answer your question: You can enable vsync permanently in the video card control panel, or alternatively 99% of games these days have an in-game option to enable / disable vsync.
 
luismenendez said:
so Nvidia and ati don't care about customers needs (most of us have TFT and like to run vsync on) and only about max fraps for benchmarking :o

Seems like it. I'd like to believe there is some fundamental issue with triple buffer vsync in directX, but if third party freeware written by one or two part time coders can get most of the way there, surely ATI and nvidia with their billions in research funds could find a way?

TB vsync works fine in open GL though. I can only hope that in DX10 we get some workaround, but I'm not holding my breath.
 
Ulfhedjinn said:
DXTweaker or whatever it's called has a workaround for triple buffering in DirectX games. :)

Not sure how well it works though.

Not so well, unfortunately. Not all games are supported, and with it enabled my crash rate seemed to go up considerably.

Worked well in FEAR though, which was nice.
 
Having read the various replies it would seem the balance leans toward leaving VSync off - at least until a suitable workaround is available.

I appreciate the responses - cheers guys.
 
Domo said:
Having read the various replies it would seem the balance leans toward leaving VSync off - at least until a suitable workaround is available.

The balance of replies here might favour leaving it off (as I do...), but I think most people do enable vsync.

Really, the best thing to do is to try gaming with it on and also with it off. It isn't a big job, and this way you can decide for yourself whether the drop in framerate is worth the improvement in image quality from eliminating tearing.
 
I don't have it on. If you do get frame tear, just cap your fps at your monitor's refresh rate; as posted above, this will stop sudden jumps in frame rate.
 
I'm not sure what people mean when they mention mouse lag with vsync on, I don't think I've ever seen it happen. Is it something that happens when it suddenly drops from 60fps to 30fps, or is it something else?

The FPS halving thing is very annoying, but I still can't live without vsync.
 
pegasus1 said:
What difference does using a CRT make, i still use one by the way.

It's the refresh rate.

With TFTs you're stuck at 60Hz (or 70Hz on some models). At this refresh rate tearing is very noticeable.

As you increase the refresh rate the effect of tearing decreases. At 85Hz it's much improved over 60, and at 100Hz it's hardly noticeable unless you're looking hard for it (except in strobe-lighting areas, where it's still annoyingly visible).

With a good CRT you can get 100Hz or even 120Hz refresh rates at 1600*1200 resolutions, allowing you to play without framerate drops or mouse lag, and with minimal tearing. There was some talk about bringing out 120Hz TFTs, but until this happens I'm sticking with a CRT for gaming, at least for first person shooters.
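One way to see why higher refresh rates hide tearing: a tear line only stays on screen until the next refresh, so the refresh interval is an upper bound on how long any one tear is visible. A quick calculation (illustrative arithmetic only):

```python
def refresh_interval_ms(refresh_hz: float) -> float:
    """How long each displayed frame (and any tear line in it)
    persists before the next refresh replaces it."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 85, 100, 120):
    print(f"{hz} Hz -> {refresh_interval_ms(hz):.1f} ms per refresh")
# 60 Hz -> 16.7 ms per refresh
# 100 Hz -> 10.0 ms per refresh
# 120 Hz -> 8.3 ms per refresh
```

At 120Hz each torn frame lingers half as long as at 60Hz, and the tear line lands somewhere different every refresh, so the eye has much less chance to register it.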
 
I've just noticed my TFT (Samsung 913B) supports a refresh of 75Hz but it's been set to 60Hz in Windows Vista. The difference now I've increased it to 75 is quite significant! :)
 
Duff-Man said:
It's the refresh rate.

Cheers mate, happy with that.
 