HDR monitors

I've been reading reviews of the 'xbones', as I will now call it, stating its HDR support. Now, I've always thought of HDR as a photography thing: point your camera a bit towards the sky and the landscape loses detail and colour, fading to black, whilst pointing it at the ground causes the sky to blow out to brilliant white. HDR combines the best of both worlds, I assume by doing some contrast tricks, or by taking two photos and combining them or something.
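
As an aside, that "take two photos and combine them" guess is pretty much what exposure-fusion HDR photography does. Here's a rough sketch of the idea, assuming nothing beyond NumPy and two made-up arrays standing in for an under- and over-exposed shot - this is just the photographic trick, not whatever the console or TV pipeline actually does:

```python
import numpy as np

def merge_exposures(under, over):
    """under/over: the same scene shot at two exposures, as floats in [0, 1]."""
    # Pixels near mid-grey carry the most detail; pixels near 0 or 1 are crushed/clipped.
    def weight(img):
        return np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2))

    w_u, w_o = weight(under), weight(over)
    # Blend each pixel, favouring whichever exposure captured it best.
    return (w_u * under + w_o * over) / (w_u + w_o + 1e-6)

# Toy stand-ins: the underexposed shot keeps sky detail, the overexposed one keeps shadows.
under = np.clip(np.linspace(0.0, 0.6, 100), 0.0, 1.0)
over  = np.clip(np.linspace(0.3, 1.2, 100), 0.0, 1.0)
merged = merge_exposures(under, over)
print(round(float(merged.min()), 3), round(float(merged.max()), 3))
```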

I understand that some TVs / Blu-ray players and indeed the xbones are now HDR compatible. Is this simply a new standard/threshold a TV needs to meet in terms of number of colours/contrast ratio/brightness, or is there more technical stuff going on inside than that?

And ultimately are we going to see HDR monitors appear- are there already HDR monitors? And GPUs?

(I ask because the lack of contrast and dull blacks and lack of detail in dark scenes are a pet annoyance of mine on my monitor and TV)
 
I would like somebody to make an "HDR" monitor, even if there's no HDR content support yet.
Thing is, to make it HDR they will have to use a full-array LED backlight - and I've been waiting so long for somebody to do this in a monitor.

I have no idea why not even the most expensive monitors come with a full-array backlight - TVs yes, but not monitors. Full-array sets don't have any "backlight bleed", unlike all these edge-lit ones, because backlight bleed comes from the inherent difficulty of uniformly lighting a large area when all the light sources sit at the edges: you either get dark splotches on white, or light splotches on black.

Funny thing: back when we had CCFL backlights, nobody had heard of "backlight bleed" on large monitors - because they *had to* use a CCFL array to light the large screen:
[Image: LCD TV backlight made up of an array of CCFL tubes]


Nowadays they "manage" to light even 32-34" screens and larger solely with edge LED strips, which is ridiculous - no wonder there is horrendous bleed on these ultrawides.
 
Samsung is coming out with a gaming lineup (CF791) and 4K FreeSync monitors, all with quantum dot technology, so no bleed and possibly HDR. I'm assuming so, based on their HDR TVs, which are quantum dot as well.

However, "2H 2016" could drag all the way to the end of the year. :(
 

HDR is fully supported on Nvidia's 10-series GPUs, along with the Rec. 2020 standard.

There's only a handful of TVs that can do true HDR, and that's OLEDs plus some top-end sets like Panasonic's 60" TX60-902B, which has around 500 honeycomb dimming zones, each with its own backlight.
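
For anyone wondering what those dimming zones actually do: roughly speaking, the set splits each frame into a grid of zones, drives each zone's LEDs only as bright as that zone needs, and boosts the LCD pixel values to compensate. A toy sketch of the idea below - the zone grid and test frame are made up, and real sets use far fancier processing:

```python
import numpy as np

def local_dimming(frame, zones=(10, 18)):
    """frame: 2-D luminance array in [0, 1]; zones: (rows, cols) of LED zones."""
    h, w = frame.shape
    zh, zw = h // zones[0], w // zones[1]
    backlight = np.zeros(zones)
    lcd = frame.copy()
    for r in range(zones[0]):
        for c in range(zones[1]):
            block = frame[r*zh:(r+1)*zh, c*zw:(c+1)*zw]
            level = max(float(block.max()), 1e-3)   # LED level = zone's peak luminance
            backlight[r, c] = level
            lcd[r*zh:(r+1)*zh, c*zw:(c+1)*zw] = block / level   # LCD compensates for dimmer LED
    return backlight, lcd

frame = np.full((180, 324), 0.02)                   # mostly dark scene...
frame[60:80, 140:170] = 0.95                        # ...with one bright highlight
backlight, lcd = local_dimming(frame)
print("zones nearly off:", int((backlight < 0.05).sum()), "of", backlight.size)
```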

It's all very well having all this tech, but it's useless if game developers don't implement it.
 
Seems like another bit of dodgy marketing practice.

HDR is basically just an increase in contrast ratio. However, it seems they have built a standard around it, meaning they get to sell you HDR "compatible" TVs, set-top boxes and Blu-ray players, and maybe even charge you extra to stream HDR content, because it's not just better contrast - it's some other crap they've built in.

Or am I just being too cynical?
 
Rec. 2020 is definitely not just "marketing".

Yes, HDR is "basically" just an increase in contrast ratio, but that's a bit like saying an Impreza WRX is basically just a car ;)

The contrast increase is quite massive - several orders of magnitude. And it's not that "dynamic contrast", which is a well-known marketing red herring - no, it's real static contrast.
In fact, the potential contrast becomes so big that the traditional 8-bit sRGB colour space won't cut it any more - if you try to use it, you will see massive banding because there isn't enough precision. So both an expanded gamut and more bit depth are required - and this has to be supported at the transport (wire) level.

Hence the new set of Rec. 2020 standards and their support in, e.g., DisplayPort 1.4.
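
To put a rough number on the banding point, you can quantise a smooth luminance ramp at 8 and 10 bits and count how many distinct steps survive in the shadows, where the eye is most sensitive. This is a back-of-the-envelope sketch using a plain gamma 2.2 encode rather than the actual HDR transfer curve (SMPTE PQ), so treat the exact figures loosely:

```python
import numpy as np

ramp = np.linspace(0.0, 1.0, 100_000)     # ideal smooth ramp of relative luminance

def quantise(signal, bits, gamma=2.2):
    levels = 2 ** bits - 1
    encoded = np.round(signal ** (1 / gamma) * levels) / levels   # encode and quantise
    return encoded ** gamma                                       # decode back to light

for bits in (8, 10):
    q = quantise(ramp, bits)
    dark = q[ramp < 0.01]                 # bottom 1% of the luminance range
    print(f"{bits}-bit: {len(np.unique(dark))} distinct levels in the darkest 1%")
```

The wider the luminance range you try to cover, the fewer of those steps land in any given patch of shadow, which is why HDR pairs the bigger contrast with 10/12-bit signals and a new transfer curve instead of reusing 8-bit sRGB.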

But, as I said, I would be very happy using an HDR monitor mainly for LDR (sRGB) content too, until HDR achieves wider adoption - just because an HDR monitor will have vastly superior backlight tech.
 
Edge-lit is popular because you can make thin screens - no other reason.

Full-array LED screens don't sell well because they are thicker, which is dumb.
 
In fact, the potential contrast becomes so big that the traditional 8-bit sRGB colour space won't cut it any more - if you try to use it, you will see massive banding because there isn't enough precision. So both an expanded gamut and more bit depth are required - and this has to be supported at the transport (wire) level.

Thanks for clarifying. Good to know there are solid reasons for the upgrade in protocols.
 