Besides, the tearing and quality loss that a video recording limited to 30fps can introduce are beside the point when I know first-hand what 30fps looks like without tearing, and it still looks un-smooth.
And the fact that Fraps was running in the corner showing 40-50fps.
I don't think you fully understand what this technology is actually doing and what its main objective is. It's not about the frame rate itself. Yes, 30fps is not going to look smooth either way, vsynced or not. Gsync will not make 30fps look smoother, but what it will do is make the transitions between different frame rates way smoother.
With current vsync, when a frame cannot be rendered in time for a monitor refresh, it has to be kept in a buffer until the next monitor refresh. This leads to sudden, intermittent jumps in frametimes, and is the main culprit behind the stuttering effect you see when your GPU can't maintain an fps higher than the monitor refresh rate. There are ways to alleviate this (e.g. triple buffering), but those only lead to greater input lag due to the increased buffering.
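To make those frametime jumps concrete, here is a tiny Python sketch of the idea. It's my own toy model, not anything from nvidia, and it deliberately ignores back-pressure (a GPU stalling on a full swap chain); all the names in it are made up for illustration.

```python
from fractions import Fraction
import math

REFRESH = Fraction(1, 60)  # one 60hz refresh interval, kept exact with rationals

def vsync_display_times(render_durations):
    """When each frame reaches the screen under vsync: a finished frame
    sits in the buffer until the next refresh tick."""
    shown, t = [], Fraction(0)
    for d in render_durations:
        t += d                                          # frame finishes rendering here
        shown.append(math.ceil(t / REFRESH) * REFRESH)  # ...but waits for the tick
    return shown

# A GPU steadily rendering at 50fps (20ms/frame) on a 60hz monitor:
times = vsync_display_times([Fraction(1, 50)] * 6)
print([round(float((b - a) * 1000), 1) for a, b in zip(times, times[1:])])
# -> [16.7, 16.7, 16.7, 16.7, 33.3]: mostly smooth, then a sudden 33ms frame
```

Even a perfectly steady 50fps render rate comes out of a 60hz monitor as a run of 16.7ms frames punctuated by 33.3ms ones, because 50 doesn't divide into 60 evenly.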
With gsync, the GPU doesn't have to wait for the monitor to refresh at a fixed interval; the monitor will simply refresh as soon as the frame is ready. In effect, you have a monitor with a variable refresh rate. Of course there will still have to be an upper limit on how fast the monitor can refresh, i.e. a 60hz monitor can only refresh every 1/60th of a second at most, and hence the GPU will have to cap its frame rendering to that too, which is easy to do. On 120/144hz monitors the upper limit will naturally increase.
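Under the same toy model, a variable-refresh panel simply shows each frame the moment it's done, subject only to that upper limit (the max_hz parameter is my own illustrative name, not anything from the gsync spec):

```python
from fractions import Fraction

def gsync_display_times(render_durations, max_hz=144):
    """Variable refresh: the panel scans out as soon as a frame is ready,
    but never faster than its maximum refresh rate."""
    min_interval = Fraction(1, max_hz)
    shown, t, last = [], Fraction(0), None
    for d in render_durations:
        t += d
        last = t if last is None else max(t, last + min_interval)
        shown.append(last)
    return shown

# The same steady 50fps GPU as before:
times = gsync_display_times([Fraction(1, 50)] * 6)
print([round(float((b - a) * 1000), 1) for a, b in zip(times, times[1:])])
# -> [20.0, 20.0, 20.0, 20.0, 20.0]: frametimes simply track the render times
```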
If you want to see a tiny example of how profound this technology actually is, try playing a 24p movie on your HDTV. Unlike most monitors, HDTVs can switch between a wider variety of refresh rates, such as 24hz, 50hz and 60hz. Play a movie shot at 24fps whilst running the TV at 60hz and then at 24hz. The difference in smoothness should be immediately apparent, and not because of the fps itself (which has stayed the same, remember) but because of the lack of stuttering/juddering in 24hz refresh mode. At 60hz, some frames are doubled and some are tripled in an alternating manner so that 24 frames can be fit into 60 screen refreshes neatly. This alternation means one frame is held on screen for 3/60ths of a second and the next for only 2/60ths, with the whole pattern repeating every 1/12th of a second, and it's that constant change in effective frame duration that leads to the juddering effect. At 24hz though, the frames are mapped to the screen refreshes on a 1-to-1 basis, hence the smoothness.
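That 3:2 cadence is easy to check with a couple of lines (again just a sketch of the arithmetic, not any particular TV's scaler logic):

```python
# Fitting 24 film frames into 60 fixed refreshes: alternate holding a frame
# for 3 refreshes and then 2, since 12*3 + 12*2 = 60.
holds = [3 if i % 2 == 0 else 2 for i in range(24)]
print(holds[:6])   # [3, 2, 3, 2, 3, 2]
print(sum(holds))  # 60 -> exactly one second of refreshes
# On screen a frame therefore lasts 3/60s, then 2/60s, then 3/60s... and the
# whole 3+2 pattern repeats every 5 refreshes = 1/12th of a second.
```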
Now imagine the above situation, but with a 60hz monitor (that can only be run at 60hz) and a game being rendered at a constantly varying fps, usually lower than 60fps. This is what vsync has to deal with, and while it has done a decent job so far, it has some major drawbacks, namely the aforementioned stuttering/juddering (way worse than in the movie example due to the varying fps) and the inherent input lag; not so much an issue in movies, but unacceptable in games. Of course the alternative is to disable vsync altogether, meaning no more input lag (to a certain degree of course), but you now have screen tearing, which many would argue is worse than stuttering. In fact, at the right fps, the tearing itself creates an optical stuttering effect.
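Feeding a varying, sub-60fps frame sequence through both toy models from the sketches above (the fps numbers are made up purely for illustration, and the vsync branch again ignores swap-chain back-pressure) shows the difference directly:

```python
from fractions import Fraction
import math

REFRESH = Fraction(1, 60)
# A game bouncing around 40-55fps (made-up numbers for illustration):
renders = [Fraction(1, fps) for fps in (55, 48, 40, 52, 44, 50)]

t, vsync, variable = Fraction(0), [], []
for d in renders:
    t += d
    vsync.append(math.ceil(t / REFRESH) * REFRESH)  # wait for the next 60hz tick
    variable.append(t)  # gsync-style: show at once (all frames here are slower
                        # than the panel's max refresh, so no cap is needed)

ms = lambda seq: [round(float((b - a) * 1000), 1) for a, b in zip(seq, seq[1:])]
print("vsync   :", ms(vsync))     # [16.7, 16.7, 16.7, 33.3, 16.7] -> snaps between ticks
print("variable:", ms(variable))  # [20.8, 25.0, 19.2, 22.7, 20.0] -> tracks the renders
```

Under vsync every displayed frametime gets snapped to a multiple of 16.7ms regardless of how long the frame actually took; with a variable refresh, the displayed frametimes are just the render times.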
Gsync does away with those two drawbacks. The discrepancy between the monitor refresh rate and the GPU frame rate is effectively gone, and you are now seeing the pure frame output of the GPU, much like you are seeing the pure output of your Blu-ray player when running your TV in 24hz mode. If this isn't an important development, I don't know what is. I think people have just become so accustomed to screen tearing, vsync stutter and lag that they don't know any better. Heck, I even mentioned this very idea of monitors with dynamic refresh rates on this very forum a few years ago and was laughed at and dismissed. Hopefully this will change that.
Right now I am just hoping that it will work the way nvidia are describing it, and that it is actually a true solution to the problem of fixed refresh rate monitors, i.e. dynamically variable refresh rate monitors. It's possible that nvidia are using some other mechanism to achieve the same effect, but I can't imagine what that could be. As I said in another thread, I hope that it becomes an industry standard and not something locked to nvidia hardware. It really is that important.