The main purpose of this thread is to determine which games triple-buffer vsync works in, as well as to explain what TB vsync is and does. There is an excellent utility that comes with Rivatuner, called D3D overrider. This is something of a 'fire and forget' program: you leave it running and it forces triple buffering every time you turn vsync on. I have yet to find a game where it does not work, but I only have access to nvidia cards. It would be nice to discover the full extent of compatibility with ATI cards, and whether there are any games with which it does not work, although ATI users have now reported that it works properly (see posts below).
What is triple buffer vsync?
To answer this we first need to consider 'tearing' and 'double-buffer vsync'.
* Tearing:
When your graphics card renders a frame, it outputs that frame to the screen as soon as it is done. All monitors draw images from left to right and from top to bottom. When the new image is sent to the monitor, by default it starts drawing immediately with the new data. Now, the new frame will be slightly different to the old one, so what you see is a "tear" in the image at the point the monitor had reached when the frame was updated. This is something you've probably all seen at some stage. It is certainly very annoying!
Now, the effect of tearing lessens as your framerate rises - for the obvious reason that as the framerate rises the difference between two consecutive frames tends to be smaller. But the tearing is still there and can still be very noticeable in certain circumstances (strobe lighting, for example...). So we want a way of removing it at all framerates.
* Vsync:
Vsync, short for vertical synchronisation, basically holds a completed image in the frame buffer and waits for the monitor to finish drawing the current image before displaying the next one. This eliminates tearing.
'Standard' double-buffer vsync uses the pre-allocated frame buffer to store the image. 'Triple-buffer' vsync allocates an extra buffer in video memory in which to store the buffered frame.
Virtually all games use the standard double-buffer vsync option, but in my opinion this option has a lot of drawbacks compared to the triple-buffer option!
So what are the advantages and disadvantages of TB vsync compared to standard vsync?
'Standard' double-buffer vsync uses the pre-allocated frame buffer to store the image. This doesn't use any extra video memory, but the disadvantage is that *no rendering* can be done while the finished image is sitting in the buffer waiting to be displayed! The effect of this, which you may well have experienced for yourself, is sudden drops in framerate.
Suppose your monitor operates at 60Hz (like most LCDs). If your card is capable of outputting at least one frame every 1/60th of a second (i.e. it can render the game scene at over 60fps), then your framerate will appear as 60fps. However, as soon as the frame time goes slightly above 1/60th of a second (i.e. 59.9fps effective rendering or below), you will see only 30fps. Now, this performance dropoff can be annoying, but the sudden transitions are far more distracting than the actual loss of performance itself. The same thing happens when your theoretical framerate drops just below 30fps - you see a jump down to 20fps (=60/3), then 15fps (60/4), then 12fps (60/5), then 10fps (60/6), and so on.
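Just to spell out where those numbers come from (my own arithmetic, not anything measured): with double buffering every displayed frame has to stay on screen for a whole number of refresh intervals, so the visible framerate snaps down to the refresh rate divided by 2, 3, 4 and so on. A quick sketch:
[code]
import math

def double_buffer_fps(render_fps, refresh_hz=60):
    # Each frame occupies ceil(refresh / render_fps) refresh intervals,
    # so the rate you actually see is the refresh rate divided by that.
    return refresh_hz / math.ceil(refresh_hz / render_fps)

for fps in (80, 59.9, 45, 29.9, 24, 19.9):
    print(f"can render {fps:5.1f}fps -> you see {double_buffer_fps(fps):.1f}fps")
# can render  80.0fps -> you see 60.0fps
# can render  59.9fps -> you see 30.0fps
# can render  45.0fps -> you see 30.0fps
# can render  29.9fps -> you see 20.0fps
# can render  24.0fps -> you see 20.0fps
# can render  19.9fps -> you see 15.0fps
[/code]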
On the other hand, triple buffer vsync allocates an extra space for the buffered image. The GPU can carry on rendering the next frame while the finished frame waits in the buffer - so there is no sudden drop in performance! This is obviously a massive benefit compared to double-buffer vsync, but there are also two commonly identified drawbacks:
a) The buffer uses some video memory to store the extra frame
... this is true, but in my eyes it's not an issue at all these days. I mean, an uncompressed 32-bit colour image at 1920*1200 uses 8.789MB of space. Surely you would give up 9MB out of your 512MB or 1024MB of video memory rather than have your framerate halved?!
b) Input lag
... this could well be a real issue for 'twitch' online gamers. Because the frame is buffered, there is always a one-frame lag between what is being rendered and what you see. Now, at 60fps or above this is very small - <17ms - way faster than human reflexes. However, at low framerates it can begin to be noticeable. At less than 20fps you get more than 50ms of lag, which can just start to become noticeable. Combine this with any monitor-based input lag (from PVA etc panels), and any online ping, and you may start to notice the lag. The bottom line here is that if you play fast-paced online games and don't get a good framerate, then turn vsync off entirely and put up with the tearing! (I run through the numbers for both drawbacks quickly below.)
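For what it's worth, here's a quick check of the two figures above - the size of the extra buffer and the one-frame lag. Simple arithmetic on my part, nothing measured:
[code]
# Extra back buffer: one uncompressed 32-bit frame at 1920x1200.
width, height, bytes_per_pixel = 1920, 1200, 4
extra_mb = width * height * bytes_per_pixel / (1024 ** 2)
print(f"extra buffer: {extra_mb:.3f} MB")            # 8.789 MB

# One buffered frame of input lag is simply one frame time.
for fps in (60, 30, 20):
    print(f"{fps}fps -> about {1000 / fps:.0f}ms of added lag")
# 60fps -> about 17ms, 30fps -> about 33ms, 20fps -> about 50ms
[/code]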
To my eyes, these drawbacks are really small compared to those of 'regular' double buffer vsync, and it annoys me that triple-buffer is not enabled as standard these days. After all, it's been around for years!
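Putting the whole comparison together, here is a toy timeline simulation of the two schemes. This is purely my own sketch (plain Python, no real graphics API, and the fixed 25ms render time is just an illustrative assumption), but it captures the key difference: with a single back buffer the GPU stalls waiting for each swap, while with two it never stops rendering.
[code]
# Toy sketch only: a 60Hz display fed by a GPU that needs 25ms per frame
# (i.e. it could manage 40fps if nothing ever held it up).
def displayed_fps(back_buffers, render_time=0.025, refresh_hz=60, seconds=5.0):
    """back_buffers=1 is double buffering, back_buffers=2 is triple buffering."""
    vblank = 1.0 / refresh_hz
    gpu_free = 0.0   # time at which the GPU finishes its current frame
    pending = 0      # finished frames sitting in back buffers
    shown = 0
    t = vblank       # time of the next refresh
    while t <= seconds:
        # Between refreshes the GPU renders whenever a back buffer is free.
        while pending < back_buffers and gpu_free + render_time <= t:
            gpu_free += render_time
            pending += 1
        # At the refresh, the newest finished frame goes to the screen.
        if pending:
            shown += 1
            pending -= 1
            if back_buffers == 1:
                # Double buffering: the GPU sat idle waiting for this swap,
                # so it can only start the next frame now.
                gpu_free = t
        t += vblank
    return shown / seconds

print(displayed_fps(back_buffers=1))   # ~30fps: the GPU keeps stalling
print(displayed_fps(back_buffers=2))   # ~40fps: the GPU never stops
[/code]
The queueing here is a simplification - real drivers differ in the details - but the stall-versus-no-stall behaviour is the point.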
So, is there any performance penalty for enabling TB vsync?
Actually, very little in real terms. Obviously you use a small amount of video memory, but until you get over 60fps (or whatever your monitor's refresh rate is) and frames actually start being skipped rather than displayed, the penalty is very small indeed. Testing TB vsync on Crysis, for example, I see less than a 2% performance drop using TB vsync compared to using no vsync at all.
I'm sure you'll agree that this performance drop is well worth the elimination of tearing, and is far, far less than the performance drop from standard vsync (which also skips drawing any frames over 60fps, just like TB vsync).
So how do I enable TB vsync?
* For nvidia cards:
Default: In the nvidia control panel, in the game profiles section, you have an option to set vsync type. Set this to 'triple buffer' (shocker!). Then, set vsync to 'force on'.
...Better: Download Rivatuner [try here] and install it. Look in the Rivatuner install directory, in the 'tools' folder, for a program called D3D overrider. Install this program, set it to start with Windows (...or don't), and just forget about it. Now every time you enable vsync, either through a game or through the control panel, you will get triple buffering! It seems to work extremely well.
* For ATI cards:
Also through the control panel, but only OpenGL games support TB vsync that way. I'd appreciate it if you could try D3D overrider yourselves to be sure; however, final8y has reported good performance with his tri-fire CrossFire setup, so it looks promising.
What games support TB vsync?
By default, not all games support triple buffering through the control panel. In fact very few D3D games support it directly (I know that Bioshock and Crysis do, but Source games do not, for example). However, using the excellent D3D overrider program mentioned above, I have yet to find a game with compatibility issues (on my nvidia cards at least).
Anyway, what I'd like from you guys, if you don't mind, is to test a variety of games to determine whether there are any which do not support triple-buffer vsync even with D3D overrider. I'd also like to see if there is any difference in compatibility between ATI and nvidia cards. What you will need to do is this:
1) Download and run FRAPS
2) Set triple buffering to 'on'
3) Run the game
4) Check that there is no tearing, and look at the framerate
5) ... if the framerate moves around naturally, then TB is enabled! Yay! If the framerate sticks at fixed values like 60, 30 or 20fps and jumps between them depending on how much is on screen, then it is not working
6) Report back here.
Note that you guys running high-end systems might not be able to test old games. If you get over 60fps (or whatever your monitor's refresh rate is) everywhere, then there will be no way to tell if TB is enabled or not: with any form of vsync, once you go over your monitor's refresh rate your framerate will cap out there.
I hope this has been of use to some of you guys. I know a lot of people are annoyed by regular vsync, but don't know about triple buffering. But the main reason for writing this is to determine the level of support from nvidia, ATI, Microsoft and game devs for TB vsync, which is in my eyes the superior option.
[/megapost]