HDR10 and DV question

Hi people who are more knowledgeable than me. :D

So I'm giving Netflix 4k a try. My TV supports HDR10 and Dolby Vision. As the app runs on the TV am I right in thinking I can playback either HDR option?

Reason I ask is that my TV is connected to my Denon X2400H AV receiver, which can only do DV. Specs don't list HDR10.

So I think that doesn't matter since I'm streaming Netflix direct from the TV? Would I be right in saying so?

Second question. I am about to buy a 4K UHD bluray player which will connect to the same receiver - the Panasonic DP-UB820EBK. As that supports both HDR10+ and DV would I be limited to DV only titles, as the AV receiver doesn't list HDR10 support?

Could I possibly buy a 4K bluray, and not be able to play it if it wasn't Dolby Vision? :confused:

Thanks for any help!
 
HDR10 is the bread of any HDR sandwich. So whether you go for the Full Monty bacon butty that is Dolby Vision, or opt for the cheese and pickle of HDR10+, you've always got HDR10 as a base on which those extra fillings sit.

Any AV receiver with an HDMI 2.0a (or better) input will be compatible with HDR10. Whether it then goes on to support HDR10+ or DV depends on the manufacturer including those features, and in the case of DV, paying the Dolby licensing fees.

HDR breaks down like this:

  • Broadcast HDR: terrestrial TV (if it ever goes UHD), Sky, satellite, Virgin Media - all are supposed to use a static metadata format for HDR called HLG.
  • Streamed video and physical media - the base standard is a static metadata format called HDR10.
    • HDR10+ is an open-source (no licence fee) dynamic metadata format based on 10-bit video, proposed by Samsung. Some manufacturers and streaming sources have taken it up; others haven't.
    • Dolby Vision is a licensed dynamic metadata format based on 12-bit video.

Static metadata means that the film's dynamic range is better than SD and HD video with its 8-bit video data (SDR), but the extra range is fixed for the whole film, from black to white. Dynamic metadata means the range is still bigger than SDR, but the effect can be applied scene by scene, even frame by frame. So where the picture content is mostly dark - like in Alien, where they're creeping around the dimly-lit bowels of the spacecraft looking for the scary alien predator - there are more shades of brightness available within that scene to make the details stand out and avoid the steps in shading that come from limited bandwidth. Later, when the ship explodes and the scene is mostly bright, the dynamic range can alter to maximise the tonal differences for a very bright scene. When the shots cut between the external explosion and Ripley's experience of it inside her fleeing shuttle craft, the metadata alters scene by scene and shot by shot to get the best from each frame.
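If a rough sketch helps, here's the idea in Python (the numbers and scene names are invented for illustration, not pulled from any real disc or spec): static metadata hands the TV one set of brightness limits for the whole film, while dynamic metadata hands it a fresh set per scene to tone-map against.

```python
# Illustrative sketch only - invented values, not a real HDR10/DV parser.

# Static metadata (HDR10 style): one set of light levels for the whole film,
# so the TV picks a single tone-mapping curve and sticks with it.
static_metadata = {
    "max_content_light_level": 4000,       # nits - brightest pixel anywhere in the film
    "max_frame_average_light_level": 400,  # nits - brightest average frame
}

# Dynamic metadata (HDR10+/DV style): fresh values per scene, so the TV can
# treat a dim corridor very differently from an explosion.
dynamic_metadata = [
    {"scene": "dark corridor", "scene_max_nits": 120},
    {"scene": "ship explodes", "scene_max_nits": 4000},
]

def usable_peak(content_peak_nits, display_peak_nits=700):
    """Very simplified: the highlight level a 700-nit TV actually targets."""
    return min(content_peak_nits, display_peak_nits)

# Static: the TV plans for 4000 nits everywhere, so the dark scene gets the
# same compressed curve as the explosion.
print(usable_peak(static_metadata["max_content_light_level"]))  # 700

# Dynamic: the dark scene only needs 120 nits, so the whole curve can be
# spent on shadow detail for that scene.
for scene in dynamic_metadata:
    print(scene["scene"], usable_peak(scene["scene_max_nits"]))  # 120, then 700
```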

HDR10 uses 10-bit video with static metadata. HDR10+ takes that 10-bit video and makes it adapt dynamically, but it's still 10-bit. DV is 'up to' 12-bit video with shot-by-shot dynamic capability.
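The bit-depth difference is just powers of two; a quick sanity check (plain Python, nothing format-specific):

```python
# Shades available per colour channel at each bit depth.
for name, bits in [("8-bit SDR", 8), ("10-bit HDR10 / HDR10+", 10), ("12-bit Dolby Vision", 12)]:
    print(f"{name}: {2 ** bits} levels per channel")
# 8-bit SDR: 256 levels per channel
# 10-bit HDR10 / HDR10+: 1024 levels per channel
# 12-bit Dolby Vision: 4096 levels per channel
```

More levels between black and peak white is what helps avoid visible banding when the range is stretched out in a dark scene.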
 

Yep understood, thanks for the summary. :)
 
HLG doesn't actually use metadata at all: the lower part of the signal is SDR and the rest is HDR, and the TV simply clips off what it can't display. So you only need to broadcast HLG, and even people with an SDR TV can still watch it, as they'll only see the SDR part.
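For anyone curious, that behaviour falls out of the shape of the HLG transfer curve itself (ITU-R BT.2100) rather than any metadata. A rough Python sketch of the OETF: scene light up to 1/12 of peak maps into the bottom half of the signal with a plain square-root curve that an SDR set renders much as normal, and only the top half of the signal uses a log curve to carry the extra highlights.

```python
import math

# HLG OETF constants from ITU-R BT.2100.
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_oetf(scene_light):
    """Map normalised scene light (0..1) to the HLG signal value (0..1)."""
    if scene_light <= 1 / 12:
        # SDR-friendly square-root region: fills signal values 0..0.5.
        return math.sqrt(3 * scene_light)
    # Logarithmic region for HDR highlights: fills signal values 0.5..1.
    return A * math.log(12 * scene_light - B) + C

print(round(hlg_oetf(1 / 12), 3))  # 0.5 - top of the SDR-compatible part
print(round(hlg_oetf(1.0), 3))     # 1.0 - peak HDR highlight
```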
 
Also, the Netflix and Disney apps etc. are clever and will automatically show you on the title's intro page which version it will play - if the app senses the connected device can do DV, it will show that there. I found this out when I got my new TV (LG CX) and Disney started giving me the DV option when my old 4K TV didn't.
 