Latency and the other elements don't really matter for this discussion. This isn't a thread about projectors vs monitors. I was using the new generation of projectors as an example of how low nits and the absence of local dimming don't automatically mean HDR can't work and will be bad. The new technology in the latest projectors has made HDR worthwhile on low-nit screens without local dimming, and there's no reason a new generation of monitors can't follow the same path.
It's not a good idea to just look at old monitors and assume all future ones will be rubbish at low nits just because the old ones were. It's too early to write off this monitor for HDR. 450 nits can be more than enough for good gaming HDR as long as the rest of the specs are good enough.
No, it's not a thread about projectors vs monitors, so don't mention them lol! THEY ARE A COMPLETELY DIFFERENT TECHNOLOGY lol!!
For a monitor to deliver satisfactory HDR, all it needs to do is achieve a 600 cd/m2 peak level with local dimming and you've got something that's approaching OK. This isn't going to be the best HDR experience possible of course, but it's probably satisfactory for most. At lower peak levels with no local dimming, it will not be. This is just a fact.
From the TFT Central article: "What is important though, without question, is the screens ability to improve the active contrast ratio. To do that, you NEED to have some kind of local dimming support from the backlight. Local dimming is vital to create any real improvement in the dynamic range of an image, and it is the support (or lack of in many cases) local dimming that is the main issue here."
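The quoted point about active contrast ratio is easy to see with some toy numbers (these are illustrative assumptions, not this monitor's measured specs): contrast is simply peak luminance divided by black level, so dimming the backlight in dark zones is what actually widens the dynamic range.

```python
# Illustrative sketch, not measured data: a typical edge-lit LCD with
# ~450 cd/m2 peak and a ~1000:1 native panel contrast has a black level
# around 0.45 cd/m2. Local dimming lowers the backlight per zone, which
# cuts the effective black level and raises the active contrast ratio --
# the improvement the TFT Central quote says is vital for HDR.

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Simple luminance ratio: peak brightness divided by black level."""
    return peak_nits / black_nits

# Native panel contrast, no local dimming.
no_dimming = contrast_ratio(450, 0.45)

# Hypothetical 10x backlight reduction in dark zones via local dimming.
with_dimming = contrast_ratio(450, 0.045)

print(f"without local dimming: {no_dimming:.0f}:1")
print(f"with local dimming:    {with_dimming:.0f}:1")
```

Note peak brightness is unchanged in both cases; only the black level moves, which is why local dimming (not raw nits alone) drives the contrast improvement.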
This panel is not using future tech... it's pretty standard tech that is well understood. It is not too early to write off this monitor for HDR at all, and I'll put money on it right now not offering anything special at all in this regard. There are no other specs it has (we know the specs by and large) that will change this.