I think you're missing the actual point in the HDR we're talking about. For all intents and purposes, we're talking about maximising contrast. Which is where full-array local dimming comes into things to enable proper HDR. I feel like you're misunderstanding something, as that isn't something you can achieve with just software. The hardware has to be able to physically produce the extended range from dark to bright that comes with the HDR spec.
I agree.
Please take a breath and look at what you're saying.
1 - a game developer uses 6-bit colours, as that's compatible with every display
2 - owners of 8/10-bit panels have colours going unused as a result
3 - another developer makes software that uses those unused colours, and as a result contrast is increased, vibrancy improves, etc.
The hardware you talk of in these situations is already present, just not utilised without HDR.
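The arithmetic behind points 1 and 2 is worth spelling out. A quick sketch (illustrative only, not tied to any particular game or panel) of how many per-channel levels each bit depth gives, and how much of a 10-bit panel's range 6-bit content actually exercises:

```python
# Levels per colour channel at each panel bit depth,
# and the total colour count (three channels).
for bits in (6, 8, 10):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels/channel, {levels ** 3:,} total colours")

# 6-bit content driven into a 10-bit panel only ever hits
# 64 of the 1024 available per-channel levels.
used, available = 2 ** 6, 2 ** 10
print(f"6-bit content on a 10-bit panel: {used}/{available} levels "
      f"({used / available:.0%} of the range)")
```

So the "unused colours" in point 2 are the other 960 per-channel levels the panel can already display; software that maps content into them is what point 3 describes.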
There is no specific hardware needed for HDR processing itself; the required hardware is simply hardware that can output the extended range of colours. So basically the only requirement for 10-bit HDR is a 10-bit panel.
Of course, if you are a commercial company wanting to maximise your revenues, you will want to make HDR a profitable feature, and as such you convince people they need a new TV for it. You make deals with companies that output to TVs (such as console manufacturers) so that they will only enable HDR on HDR-marketed displays. This can be achieved by checking for the presence of a so-called HDR chip.
This is not complicated; it's happened numerous times in the past and will happen again.
It's a similar thing with 3D graphics as well.
A CPU can generate the same visuals as, say, a 1080 Ti; the difference is it does it a *lot* slower.
Early 3D games offered a software rendering mode where the game was still 3D but ran with lower performance. Nowadays no such software mode is available; instead, games just check for the presence of a GPU that can accelerate 3D graphics.
GSYNC is a very recent example as well.
Nvidia chose a path that required a chip, and display companies jumped on it as they could sell monitors at a premium. Now FreeSync has come out, which ironically also carries a premium, though a smaller one. Nvidia cannot just start supporting FreeSync, as they very likely have commercial agreements with the GSYNC display manufacturers to keep demand up for those products.
It is all about money.