I thought FreeSync had more features enabled than G-Sync, like everything is still enabled with FreeSync. G-Sync doesn't, but as a result can provide a higher refresh rate on some monitors.
I'm way late replying to this, and it's slightly off topic now, but figured I would anyway.
Freesync has the big advantage of being an open standard. It's a slight change to the Displayport spec and can be implemented on almost any existing display scaler chip via a firmware change. This means that Freesync monitors, like any normal monitor, tend to have multiple inputs etc., and support all the features they did before the firmware change.
The downside is that not all scaler chips are ideally suited to this, as most were designed before Freesync was a thing, so you do get some cases where the monitor maker has slapped a Freesync label on for some extra sales but it only has, say, a 48-60Hz Freesync window, making it pointless. There are 1440p 144Hz monitors that, in Freesync mode, only go to about 90Hz, making them really 1440p 90Hz monitors.
GSync, on the other hand, is proprietary. This has the advantage of being owned and overseen by nVidia. If monitor maker X wants to make a GSync monitor, nVidia have to sign off on it or they can't make it. nVidia care about their GSync brand, so if you propose a GSync monitor with a 48-60Hz range, nVidia will laugh at you and you won't be making a monitor today.
It also uses a custom-designed nVidia scaler chip in place of the original scaler chip. This is partly how nVidia control it: we won't sell you scaler chips to put into crap monitors. This has a disadvantage too: nVidia designed this chip for GSync, and the first version had no facility for secondary inputs etc., although later versions do. This is why most GSync monitors have only one display input. It also adds cost.
Until Crimson, nVidia had much better handling of the situation if you dropped below your variable refresh minimum: they duplicated frames, displaying each one twice to bring you back into the range. Crimson has fixed this for AMD cards, but only if your Freesync range is wide enough (on, say, a 48-60Hz Freesync monitor, double 40FPS and you get 80, which is still out of range).
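To make the frame-doubling idea concrete, here's a rough sketch of the arithmetic. This is purely illustrative (the function name and structure are made up, not AMD's or nVidia's actual driver logic): when the game's framerate drops below the monitor's variable refresh minimum, the driver can show each frame multiple times, so the effective refresh rate is a multiple of the framerate, and that multiple has to land inside the monitor's window.

```python
# Illustrative sketch of low-framerate frame multiplication.
# vrr_min/vrr_max are the monitor's variable refresh window in Hz.

def lfc_refresh(fps, vrr_min, vrr_max):
    """Return the effective refresh rate after frame multiplication,
    or None if no whole-number multiple of fps fits in the window."""
    if vrr_min <= fps <= vrr_max:
        return fps  # already in range, nothing to do
    multiplier = 2
    while fps * multiplier <= vrr_max:
        if fps * multiplier >= vrr_min:
            return fps * multiplier  # each frame displayed `multiplier` times
        multiplier += 1
    return None  # window too narrow for any multiple to fit

# Wide window (say 40-144Hz): 35FPS doubled is 70Hz, back in range.
print(lfc_refresh(35, 40, 144))  # 70
# Narrow window (48-60Hz): 40FPS doubled is 80Hz, over the max, so stuck.
print(lfc_refresh(40, 48, 60))   # None
```

This is also why the rough rule of thumb is that the window's maximum needs to be at least double its minimum, otherwise there's a gap of framerates where no multiple fits.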
GSync effectively costs more for a much more polished solution. Freesync can be good, but has an element of buyer beware about it, as there are crap implementations.
Still, Apple manage to sell overpriced devices because they just work, so there is room in the market for both, I think.