So in their demo they didn't (afaik) show variable framerate with v-sync against g-sync, only against no v-sync at all, and even then with a fairly small frame rate differential.
At 120hz the time between frames (and between skipped frames) becomes dramatically smaller, and v-sync would eliminate the tearing.
They compared v-sync in the absolute best case for g-sync, and in the worst case (worst for the demo, nowhere near a real-life worst case) against no v-sync at all. Neither situation happens in real gaming, so the demo is all but irrelevant. The closest to a real situation is the last part of the demo, but with v-sync still enabled on the left.
The variable framerate g-sync WAS impressive, no question, but v-sync isn't remotely as bad as they made out in the early part of the demo, and comparing the later part against no v-sync at all is a joke.
In reality they are saying: if we bung more stuff on screen and give you 30% less frame rate, this will potentially still keep it looking good. Mantle is saying: we'll give you potentially 30% more frame rate, preventing the problem in the first place.
Ultimately 120hz, a good screen and a good card will prevent 99% of tearing and "stutter".
Also worth pointing out that their variable framerate was completely unrealistic. Have you ever seen a game glide steadily between 45 and 60fps like the video? Or is it more likely to sit at a solid 60fps, then drop to 30fps for some big explosion, then jump about all over the place?
If it's steadily climbing 45-46-47..... 58-59-60 then the time between frames stays very, very close from one frame to the next. When you start jumping around 30-58-42-64-35 the swings in frame time are drastically bigger than anything they showed in the demo.
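To put rough numbers on that (a back-of-the-envelope sketch of my own, nothing from their demo), frame time is just 1000 divided by fps:

```python
# Rough sketch: a steady fps ramp gives tiny frame-to-frame differences in
# frame time, while an erratic fps gives big swings.

def frame_times_ms(fps_sequence):
    """Convert a list of instantaneous frame rates to per-frame times in ms."""
    return [1000.0 / fps for fps in fps_sequence]

steady = frame_times_ms(range(45, 61))          # 45, 46, ... 60 fps
jumpy  = frame_times_ms([30, 58, 42, 64, 35])   # the erratic case above

print([round(t, 1) for t in steady])  # 22.2 ... 16.7ms, adjacent values ~0.3-0.5ms apart
print([round(t, 1) for t in jumpy])   # 33.3, 17.2, 23.8, 15.6, 28.6ms - swings of up to ~16ms
```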
Ultimately, vs a 120hz screen (in 120hz mode) you're talking about circa 8ms maximum, absolute worst case, that the screen would be slower than g-sync, and often less (it depends on when the frame is ready on the graphics card). The steady frame pacing that demo showed (which is completely unrealistic) is the absolute best case of near-identical frame times from one frame to the next, so g-sync will look worse in real games with framerates all over the place. And as g-sync gets worse, it starts to approach a 120hz screen's maximum frame-time wait anyway: if a frame isn't ready when a refresh happens, it has to wait for the next refresh, which is only 8ms away at most; at 60hz that doubles to just over 16ms, so missing one refresh can mean 32ms between new frames.

When the framerate is that steady from one frame to the next, g-sync can give frame times of say 16.7ms at 60fps, then 16.9ms at 59fps, 17.2ms at 58fps and so on, with only a minor pacing difference between each frame all the way down to 45fps. That is what makes it look so steady while the 60hz monitor keeps dropping a whole 16ms refresh between new frames.
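Here's that worst-case arithmetic in one place (my own quick sketch, assuming a frame that misses a refresh simply sits and waits for the next boundary):

```python
# My own arithmetic for the worst-case waits mentioned above, assuming a frame
# that misses a refresh simply waits for the next one.

def refresh_interval_ms(hz):
    return 1000.0 / hz

for hz in (60, 120, 144):
    interval = refresh_interval_ms(hz)
    # Worst case, a finished frame waits one full interval; miss a refresh and
    # the gap between two *new* frames stretches to two intervals.
    print(f"{hz}hz: refresh every {interval:.1f}ms, "
          f"max extra wait ~{interval:.1f}ms, "
          f"missed-frame gap up to ~{2 * interval:.1f}ms")
```

The 144hz line is the point I come back to later: comparing g-sync against a 60hz-capped v-sync flatters g-sync enormously.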
But when the frame rate goes from 60 straight to 30 and back to 60, g-sync would give you frame times of 16ms, 32ms, 16ms. A 120hz screen would be.... the same. It would put a new image up every 16ms (refreshing each one twice, which visually looks no different; one point to make on that later), the next new frame would appear 32ms later, and then the frame after that, back at 60fps rendering speed, would be 16ms later.
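A toy timeline shows it. This is my own sketch, and it ignores scan-out time and whatever minimum refresh interval g-sync actually has; it just assumes g-sync shows a frame the moment it's finished, while a fixed-refresh screen shows it on the next refresh boundary:

```python
import math

def display_times(finish_ms, hz=None):
    """When each finished frame actually appears on screen."""
    if hz is None:                       # variable refresh: shown immediately
        return list(finish_ms)
    interval = 1000.0 / hz               # fixed refresh: wait for next boundary
    return [math.ceil(t / interval) * interval for t in finish_ms]

def gaps(times):
    """Spacing between consecutive new frames appearing on screen."""
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

# Frames rendered at 60fps, one 33ms hitch, then 60fps again,
# starting 5ms after a refresh so nothing lands exactly on a boundary.
durations = [1000 / 60, 1000 / 30, 1000 / 60, 1000 / 60]
finish, t = [], 5.0
for d in durations:
    t += d
    finish.append(t)

print("g-sync gaps:", gaps(display_times(finish)))       # 33.3, 16.7, 16.7
print("120hz gaps :", gaps(display_times(finish, 120)))  # same spacing, just shown a few ms later
```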
One thing to mention is overdrive and the various monitor speed-up modes. With a monitor set for really heavy overdrive (which generally overshoots and makes everything look a bit poo), and assuming the overdrive algorithm runs on every refresh, each refresh can add more distortion to the image; two refreshes of the same image can look different due to overdrive. So g-sync could potentially reduce ghosting/overdrive artefacts simply by refreshing less often.
Aside from overdrive, there is no graphical reason (only screen-processing reasons) that a v-sync screen showing the same frame for 32ms across two refreshes would look any different to a g-sync screen showing one frame for 32ms. There is no flicker on LCDs; without screen processing, the twice-refreshed image would look identical for 32ms, just as the g-sync screen would.
I'd actually be very interested to see g-sync compared on two monitors where, in normal mode, one is noticeably worse for artefacts and ghosting/overdrive issues (though I wonder if g-sync monitors can only run in g-sync mode?). If g-sync made a lot more difference on the screen with the worse image processing, I wouldn't be surprised at all.
Ultimately I think g-sync could be a great move forward, but the improvement will be massively smaller than people would think from that demo. It was a 100% best-case scenario for g-sync, they simply didn't use v-sync in a way that mattered, and they did it with v-sync refreshing less than half as often as that particular screen was capable of. Run that demo with v-sync at 144hz and then try telling me v-sync is terrible.
Also, in that worst-case scenario above it will offer identical lag to v-sync, so they are telling porkies about it reducing lag significantly. But I would think in general g-sync can't be worse than v-sync, and it will most often be better, just very rarely as different as that demo tried to show.