I'm not sure why Nvidia would support adaptive sync. The gsync solution is clearly better, it just works, and has a better range.
Except it is not... as PCM2 has said:
No.
They are both equally as effective at eliminating tearing, juddering and stuttering from the traditional refresh rate and frame rate mismatches. I've seen some fairly good pixel overdrive implementations for both FreeSync and G-SYNC models and also some not so good ones. And if there are any latency differences they're certainly beyond my sensitivity for that sort of thing.
And plenty of other tests and feedback from non-biased people/reviewers say pretty much the exact same thing.
The FreeSync range depends on the monitor used, and either way, according to PCM2, it really doesn't matter since AMD has LFC. As I have said many times before, you really don't want to be hitting anything lower than 40 fps anyway; G-Sync/FreeSync is not the silver bullet many make it out to be.
Low frame rates are low frame rates regardless. You really don't want to be getting under 56 fps or anywhere near that on a 144Hz monitor; it feels and looks extremely sluggish. And besides, now that AMD has LFC, frame rates below the hardware floor (56Hz/56fps) are suitably compensated for to remove stuttering and tearing. It's very much a non-issue really.
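To illustrate the idea behind LFC: when the frame rate drops below the panel's minimum variable refresh rate, the driver repeats each frame so the effective refresh rate stays inside the supported range. This is a rough sketch of that logic, not AMD's actual implementation; the function name and the 56-144Hz range are just illustrative values taken from the example above.

```python
def lfc_refresh(fps, vrr_min=56, vrr_max=144):
    """Sketch of Low Framerate Compensation: return (multiplier, refresh_hz)
    so that each frame is shown `multiplier` times and the resulting
    refresh rate lands inside the panel's [vrr_min, vrr_max] VRR range."""
    if fps >= vrr_min:
        return 1, fps  # inside the native VRR range: one scan-out per frame
    multiplier = 1
    # Repeat each frame until the effective refresh clears the floor,
    # without overshooting the panel's maximum refresh rate.
    while fps * (multiplier + 1) <= vrr_max and fps * multiplier < vrr_min:
        multiplier += 1
    return multiplier, fps * multiplier
```

For example, a game running at 30 fps on this hypothetical 56-144Hz panel would have each frame displayed twice, giving an effective 60Hz refresh that sits comfortably inside the VRR window, so no tearing or stuttering from the refresh/frame-rate mismatch.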
Either way, adaptive sync/FreeSync, whatever you want to call it, is not going anywhere, not when it is an open standard that Intel will also use in the future and HDMI is gaining adaptive sync too. The only ways Nvidia could ever avoid adaptive sync are to not include anything higher than DP 1.2 in their future GPUs (which means cutting off a lot of future high-end monitors, i.e. 4K at 100+Hz), to pay monitor manufacturers to disable it on the firmware side (unlikely, especially once Intel has support), or to disable it via their own drivers (no doubt some kid will write a program to enable it anyway).
G-Sync will die; it is only a matter of time. I can see it becoming a niche product used only on top-end monitors, ending up the same way Nvidia 3D Vision did.