"but I am not sure either of us have enough knowledge to say exactly what the Gsync module requires."
Yes, you can't know how much I (a random person on the internet) understand about these issues, and you should be skeptical. That's always a good policy.
Note however that I'm not the one playing armchair engineer (although I actually am one). I've not claimed to know how nVidia's DP1.4 G-SYNC module can be made any cheaper. You have, by postulating that nVidia should make it cheaper by making an alternate "version" which omits support for HDR10.
I don't know how to fix nVidia's cost problem, because I don't know why nVidia made the design decisions they did.
What I DO KNOW is what HDR10 is (a protocol, defined by a set of standards, collectively also known as ITU-R BT.2100), and how irrelevant that protocol is to the overall cost of the G-SYNC module. If you have any formal background in computing, you'll be able to read up on that and the DP1.4 standard and conclude for yourself that this isn't what drives the cost. I may be incorrectly assuming you have some background in computing (you do seem to know a thing or two), but I was hoping that's what you'd do.
As far as the DP1.4 G-SYNC module is concerned, all HDR10 requires (in comparison to DP1.4 without HDR) is the following:
- adds ~25% to the DP bandwidth requirements (two extra bits per color channel, i.e. 30 bits per pixel instead of 24). In terms of cost this is irrelevant, because DP1.4 always transfers data from the GPU to the monitor at the maximal link rate, with or without HDR (on HBR3 the payload rate is always 25.92 Gbps). For bandwidth that isn't used, the GPU just transfers zeros until the next frame is ready to be sent to the monitor.
- adds <0.1% for some HDR-related metadata (information describing what is being transferred)
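To put rough numbers on those two bullets, here's a quick sanity check. The display mode (2560x1440 @ 144 Hz) and the 32-byte metadata size are assumptions chosen purely for illustration, and blanking intervals are ignored, so these are raw pixel rates:

```python
# Back-of-the-envelope check of the two bullets above. The display mode and
# metadata size are illustrative assumptions, not any specific monitor's specs.

H_RES, V_RES, REFRESH = 2560, 1440, 144

SDR_BPP = 3 * 8      # 24 bits/pixel: 8 bits per RGB channel
HDR10_BPP = 3 * 10   # 30 bits/pixel: 10 bits per RGB channel

sdr_rate = H_RES * V_RES * REFRESH * SDR_BPP    # bits/s of raw pixel data
hdr_rate = H_RES * V_RES * REFRESH * HDR10_BPP

extra = hdr_rate / sdr_rate - 1
print(f"extra bandwidth for HDR10 pixels: {extra:.0%}")        # -> 25%

# HDR10 static metadata (mastering display primaries, MaxCLL/MaxFALL) is a
# few dozen bytes per frame at most -- call it 32 bytes as an assumption.
META_BYTES = 32
frame_bytes = H_RES * V_RES * HDR10_BPP / 8
print(f"metadata overhead: {META_BYTES / frame_bytes:.4%}")    # far below 0.1%

# Either way, the HBR3 link carries its fixed 25.92 Gbps payload rate; unused
# capacity is simply padded until the next frame.
HBR3_PAYLOAD = 25.92e9
print(f"link utilisation with HDR10: {hdr_rate / HBR3_PAYLOAD:.0%}")  # ~61%
```

The point the arithmetic makes: the 25% is a ratio of what's carried inside a link whose rate never changes, and the metadata is noise.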
That's it. All that is solved in software/firmware. In terms of cost, it just doesn't matter what is sent across the connection. Compared to DP1.2, all the extra cost is incurred by having to support higher bandwidths, but since transfer rates are fixed, that cost is incurred with or without HDR support.
HDR support impacts the cost of the monitor primarily by requiring a much more capable backlight assembly, particularly if it uses FALD, and panel technologies like QLED or nano-IPS coatings. Those things are simply independent of the DP1.4 G-SYNC module.
I'm sorry to harp on about this. I've just noticed a lot of confusion here about what HDR is and how it works, which is further evidenced by statements like:
"the DP1.4 G-SYNC module doesn't support SDR"
Nonsensical, because SDR is a subset of HDR. You can't have HDR without also having SDR supported implicitly.
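To make the "subset" point concrete: HDR10 encodes absolute luminance with the PQ curve (SMPTE ST 2084), whose scale runs from 0 to 10,000 nits. A sketch, using the ST 2084 constants and the traditional 100-nit SDR reference white as an assumption, shows SDR levels sit comfortably inside the lower half of the HDR10 signal range:

```python
# Sketch: where SDR reference white lands on the HDR10 (PQ / SMPTE ST 2084)
# signal scale. Constants are from ST 2084; 100 nits as "SDR white" is the
# traditional reference level, used here as an assumption.

def pq_encode(nits: float) -> float:
    """Inverse EOTF of SMPTE ST 2084: absolute luminance -> [0, 1] signal."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = (nits / 10000) ** m1          # normalize to PQ's 10,000-nit ceiling
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

signal = pq_encode(100)               # SDR reference white
print(f"100 nits -> PQ signal {signal:.3f} (10-bit code ~{round(signal * 1023)})")
# -> roughly 0.508, i.e. 10-bit code ~520: SDR brightness levels occupy the
#    lower part of the HDR10 range; nothing extra is needed to carry them.
```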
And: "1000 nits on a PC monitor is too much."
Nonsensical, because as others have explained, peak brightness is reserved for very small highlights. For some reason it's a widespread misconception that an HDR monitor would bombard you with 1000 nits at all times, when not even televisions do that.
etc.
etc.
etc.
I was hoping to help improve that, and thought your point was as good a place to start as any. I'll leave it at that.
Otherwise, we fully agree that the current situation is unsustainable. If nVidia can't make their G-SYNC module less expensive then nVidia might as well eliminate G-SYNC now. If $500 - $800 is a correct estimate then it's simply priced way out of the market and isn't competitive at all.