This is spot on. What G-Sync does is store the last rendered frame in the module's onboard memory, and when the game is running below the monitor's minimum refresh rate (35 fps, for instance), it scans out that stored frame again alongside each new one, so the panel effectively refreshes at 70 Hz. The same at 20 fps gives 40, 15 gives 30, and this works all the way down to 1 fps (not that 1 fps is playable).
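To make the idea concrete, here is a minimal sketch of that frame-repeat logic in Python. The 30 Hz panel minimum, the function name, and the exact multipliers are illustrative assumptions for a hypothetical panel, not NVIDIA's actual implementation or thresholds.

```python
import math

# Sketch of the frame-repeat idea: if the game delivers frames slower than the
# panel's minimum refresh, re-scan the stored frame enough times to keep the
# effective refresh rate inside the panel's supported range.
# NOTE: panel_min_hz = 30 is an assumed value for illustration only.
def repeats_needed(game_fps: float, panel_min_hz: float = 30.0) -> int:
    """Smallest number of times to show each frame so the effective
    refresh rate stays at or above the panel's minimum."""
    if game_fps >= panel_min_hz:
        return 1                                  # already inside the VRR range
    return math.ceil(panel_min_hz / game_fps)     # repeat the stored frame

for fps in (20, 15, 5, 1):
    n = repeats_needed(fps)
    print(f"{fps:>2} fps -> each frame shown {n}x -> panel refreshes at {fps * n} Hz")
```

Running that prints 20 fps -> 40 Hz, 15 fps -> 30 Hz, and even 1 fps still keeps the panel refreshing at 30 Hz, which is the behaviour described above.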
FreeSync, on the other hand, drops out of its variable refresh range once the game falls below the monitor's lowest refresh rate, and from there you just get whatever is delivered, so 20 fps IS 20 fps (unlike G-Sync refreshing at 40 Hz). I was blown away when I played Batman AO at 5K resolution with all the bells and whistles and got completely smooth gameplay (as seen in my video). I found it hard to believe I was really only running at 25 fps, because everything was ridiculously smooth. I am also very susceptible to stutter, hitching and tearing, and there was none of that at all.
While FreeSync is new and AMD could possibly do the same at some stage using the GPU's memory, this is a massive plus for nVidia that many people overlook. It needs to be seen in person to appreciate how smooth it actually is.