"Double buffer" vsync (the most common mode) renders the completed frame into a back buffer and waits until the screen refreshes before displaying it, to avoid tearing. "Triple buffer" vsync is similar, but keeps two extra frames buffered.
Double buffer vsync will only allow your framerate to be an integer division of your monitor's refresh rate: on a 60Hz monitor you can only get 60fps, 30fps, 20fps, 15fps, 12fps, 10fps (etc). If your GPU cannot sustain 60fps, your framerate drops straight to 30, and so on down the list. These sudden jumps in framerate can be a little annoying. Triple buffer vsync, on the other hand, allows frames to be output at any framerate.
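As a quick sketch of the "integer division" rule above (the function name here is just for illustration):

```python
# Under double-buffer vsync the framerate snaps to refresh_hz / n for
# integer n, so a 60Hz monitor can only show 60, 30, 20, 15, 12, 10... fps.

def double_buffer_framerates(refresh_hz, count=6):
    """First few framerates reachable under double-buffer vsync."""
    return [refresh_hz / n for n in range(1, count + 1)]

print(double_buffer_framerates(60))   # 60, 30, 20, 15, 12, 10 fps
print(double_buffer_framerates(120))  # 120, 60, 40, 30, 24, 20 fps
```

Note how the 120Hz list includes 40fps, a step that simply doesn't exist on a 60Hz monitor.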
The downside to enabling vsync is a lag between issuing a command and seeing its effect on screen (because frames sit in the buffer for a time). This can make a game feel sluggish and "muddy". Triple buffer has higher lag than double buffer, and a 120Hz monitor has much lower lag than a 60Hz screen. The exact amount of lag depends on where the screen is in its refresh cycle when the frame is completed, but the range of lag values is as follows:
* 60Hz double buffer ('regular') vsync: 16.7 to 33.3ms lag
* 60Hz triple buffer vsync: 33.3 to 50ms lag
* 120Hz double buffer vsync: 8.3 to 16.7ms lag
* 120Hz triple buffer vsync: 16.7 to 25ms lag
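The ranges above follow from a simple model, assuming a finished frame waits between one and two refresh periods with double buffering, and one period more with triple buffering (a rough sketch, not a definitive latency model):

```python
# Lag range = buffered frames * refresh period, up to one extra period
# depending on where the refresh cycle is when the frame completes.

def vsync_lag_ms(refresh_hz, buffered_frames):
    """Return (min_lag, max_lag) in ms; buffered_frames is 1 for double, 2 for triple."""
    period = 1000.0 / refresh_hz
    return (buffered_frames * period, (buffered_frames + 1) * period)

for hz in (60, 120):
    for name, frames in (("double", 1), ("triple", 2)):
        lo, hi = vsync_lag_ms(hz, frames)
        print(f"{hz}Hz {name} buffer: {lo:.1f} to {hi:.1f}ms")
```

Running this reproduces the four bullet points above.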
For this reason a lot of people consider a 120Hz screen great for gaming, even if they aren't using it for 3D: it allows vsync to be enabled with very little extra lag. You also get more "divisions" in double buffer vsync [120, 60, 40, 30 etc], and dropping from 60fps to 40fps is not nearly as annoying as dropping from 60fps to 30fps.