ULMB was something that could be hacked onto 3D Vision Lightboost monitors; it is now officially supported on gsync monitors, so I see it as a feature of having a gsync monitor/module
if freesync monitors come out and have ULMB and 3D then fair enough, but otherwise I see it as part of the package... the point being that nvidia could add additional features to the gsync module/monitors that aren't available on freesync monitors... I don't know what those would be, but they have said they have extra plans
2560x1440 @ 144Hz is also currently only supported on gsync, so that is another unique feature for now; there isn't a standard monitor scaler that supports it
Obviously people are free to have their own interpretation of what "gsync" entails, but if consumers want a feature and that feature is only available on a gsync monitor, then they are buying a gsync monitor; the end result is the same regardless of semantics. Likewise, if the only monitor available with feature X is a freesync one, then it adds to the perceived sales of freesync monitors, regardless of whether buyers want freesync or even use it.
Hold on, this has nothing to do with semantics; it's just you moving the goalposts to suit your argument. Do any of the things you have listed (apart from one) have anything to do with the Gsync module?
The only feature you listed that is specific to Gsync monitors is that resolution @ 144Hz. 3D is on other monitors, and ULMB is too; some of the latest BenQ gaming monitors have it. So again, these aren't unique to Gsync monitors, and since you can't use ULMB and Gsync at the same time, they have even less to do with it.
Also, looking back on the other posts in this thread, I think you are being overly negative about adaptive sync. Take the price: you mention that the cost of adding Gsync is negligible, but it isn't. There is the cost of the FPGA, the cost of the scaler, and lastly the cost of the custom-made PCB it all goes onto. This is then sold on to the monitor manufacturer.
Whereas with adaptive sync there is only the cost of the scaler, and even that will be cheaper because the monitor makers won't be buying it from a third party but directly from the scaler manufacturer.
I don't see how Gsync and adaptive sync monitors will ever be the same price unless Nvidia sells the module at a loss.
As for market share: while only a small number of discrete AMD cards support freesync, you have to remember that all GCN APUs do support it. So there is a big market out there.
Will Intel use adaptive sync? Their 4th-generation Intel Core and Core M GPUs all have the hardware controller (eDP 1.2) needed to connect to an adaptive sync monitor. All they would need to do is write a driver. So the question isn't "will they use it?" but "why wouldn't they?"