So, what would be preferable from now on? Having a series of Gsync HDR monitors at £2000+ while their Freesync siblings are at half the price?
Or would you rather have the prices blended, with the Gsync monitors sold at a loss (~£1500) and the Freesync ones at inflated prices to cover the difference, effectively having the Freesync users pay the cost of the Gsync modules?
Your first proposition implies G-SYNC (based on DP1.4) will never command more than a minuscule share of the market. However, without significant market penetration, G-SYNC becomes irrelevant to consumers (because cost), monitor OEMs (because revenue) and ultimately also to nVidia, which would spell the end of G-SYNC as a technology overall. It's unlikely nVidia thinks that's a good idea. Your latter proposition violates antitrust laws to a degree that seems comical, even by today's levels of corporate corruption. Obviously, neither is preferable to anything.
Answer me this: are you convinced that nVidia's latest G-SYNC module can NOT be delivered at a lower cost WITHOUT sacrificing user-facing features (HDR, refresh rates, DP1.4 compliance, etc.)? It seems to me that is what some folks here assume. Based on your propositions, it seems you're assuming the same, namely that nVidia's current implementation is as cheap as G-SYNC (based on DP1.4) gets. I think that's BS.
You've asked me "what is preferable". The answer is simple: nVidia should just stop using such a ridiculously expensive FPGA. I know enough about hardware and software development to say there is absolutely no technical reason which forces nVidia to use an FPGA. I don't know what drove nVidia to make that choice, but I suspect their reasons boil down to economics and/or risk aversion. Since I'm not privy to their thinking, there isn't much point in my speculating on what those economic or risk-related issues may be, much less on how to solve them.
If forced to speculate about a solution based on my currently incomplete knowledge, I'd say nVidia should do the following:
Throw out the current FPGA-based design and replace it with a module that functions as a complete scaler package, built as an ASIC. Sell that ASIC as a direct competitor to the scalers offered by Realtek, Novatek or MStar (these are the three companies from which monitor OEMs purchase, for about $2, the scalers that receive FreeSync signals from AMD GPUs). If it were up to me, the unique selling point of nVidia's scaler would be its ability to support both FreeSync and G-SYNC, with G-SYNC support disableable in firmware. With G-SYNC disabled, the scaler would be sold to OEMs at the same price as the competitors' scalers. This would allow monitor OEMs to sell the exact same monitor in a G-SYNC or a FreeSync variant, with all the differences limited to firmware, and it would bring the cost of the G-SYNC model down to a price much closer to that of the FreeSync model. Technically, this would even allow a single monitor to support FreeSync and G-SYNC simultaneously, which would be great for consumers, but that's probably not in line with nVidia's market strategy.
At least technically, that is realistic. I don't know if it's realistic economically.
Either way, it seems inconceivable to me that nVidia believes their G-SYNC module (based on DP1.4), at its current price, is a product that is viable in the long term. However nVidia chooses to reduce the module's cost, which we all believe they must, eliminating HDR support isn't going to help them in that regard, and assuming DP1.4 compliance is a given, reducing bandwidth isn't an option. Your propositions aren't solutions either. As far as I can tell, nVidia's only hope is to eliminate the need for the FPGA and replace it with an ASIC. There are multiple ways to go about that, the differences between them coming down to where the G-SYNC features are integrated (into the GPU, the scaler, or both). What I've suggested above is just one way.