Dolby Vision on Non-DV titles?

Hey guys, just polling an opinion.
I picked up an LG C7V back in May, and a Sony UBP-X700 on Prime Day... The first title I picked up in UHD was Infinity War, but I realised shortly after buying it that it's not actually a DV title.

I know the Player can "Force" DV as an output, but I'm wondering, is this actually a better way to watch the movie, over the regular HDR mode?

What do you do with movies in the same bracket as this?
 
Best way is to play them as native format. Anything forced to be something it isn't will look a bit off.

HDR is the core format. HDR10+ and DV sit on top of that as enhancements.
 


Dolby Vision is never carried in an actual 12-bit stream. On disc it uses a 10-bit HDR10 3840x2160 base layer and
a 10-bit DV 1920x1080 enhancement layer containing the difference data and dynamic metadata; these two layers are combined to produce the 12-bit 3840x2160 Dolby Vision output.
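To make the layer idea concrete, here's a toy sketch (my own illustration, not Dolby's actual RPU/enhancement-layer coding) of how a 10-bit base value plus a small residual can losslessly reconstruct a 12-bit sample:

```python
# Toy illustration only: Dolby's real enhancement layer is a coded 1080p
# stream plus metadata, not a simple per-sample bit-split like this.

def split_layers(sample_12bit):
    """Split one 12-bit sample into a 10-bit base value and a 2-bit residual."""
    base_10bit = sample_12bit >> 2   # drop the two least-significant bits
    residual = sample_12bit & 0b11   # the precision the base layer lost
    return base_10bit, residual

def combine_layers(base_10bit, residual):
    """Recombine the layers: shift the base back up and add the residual."""
    return (base_10bit << 2) | residual

# Round-trip every possible 12-bit code value
assert all(combine_layers(*split_layers(v)) == v for v in range(4096))
```

The point is just that the base layer alone is a complete, watchable HDR10 picture, and the extra layer carries what it couldn't represent.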
 
Which is what I said, but in far fewer words ;)
 
DV layer is not an "enhancement" layer. ;)

DV is supposed to play at 4,000 nits and up. I think the DC TVs can only do about 1,500 nits.

I think you're so busy trying to prove a point that you've missed reading (or at least misread) my post.

I never said DV was an enhancement layer on a TV.

Even in your own post, you call it an enhancement layer! This refers to the data on the disc or in the file. I don't know if you realise it or not, but we are talking about the same thing. Lol :D

As for the nit levels, that wasn't mentioned in the OP nor my replies.

Anyway, my advice stands: play it in its native format rather than asking the TV to make up an extended dynamic range that the original doesn't possess. :)
 
from the earlier HDR10 thread discussion, the DV discs are only being mastered up to a max of 4,000 nits atm, the same as HDR10 also supports,
so I don't see what the 2-bit enhancement layer carries, plus the non-commercial panels are (still?) only 10-bit anyway.
so the choice of which to view comes down to the HDR10 & DV implementation on your particular TV.

vincent's (eotf) review suggested HDR10 on the Pan/Sony OLEDs was better, and with dynamic contrast in HDR10 it can emulate the DV dynamic data

edit:
DV layer is not an "enhancement" layer. ;)
I thought he was being ironic (edit2: wrt his own post) in that it is a mandatory layer afahc
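since nit figures keep coming up: both HDR10 and DV encode brightness with the SMPTE ST 2084 "PQ" curve, whose constants are public. A quick sketch (my own, purely illustrative) of that curve shows how absolute nit levels map onto the normalised signal, e.g. how much of a 10-bit range a 4,000-nit master peak actually occupies:

```python
# PQ constants from SMPTE ST 2084
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(e):
    """ST 2084 EOTF: normalised PQ signal (0..1) -> absolute luminance in nits."""
    ep = e ** (1 / M2)
    return 10000 * (max(ep - C1, 0) / (C2 - C3 * ep)) ** (1 / M1)

def nits_to_pq(nits):
    """Inverse EOTF: absolute luminance in nits -> normalised PQ signal."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# How far up a 10-bit full-range signal a 4,000-nit master peak sits:
print(round(nits_to_pq(4000) * 1023))  # code value for the 4,000-nit peak
```

The curve is absolute, so a 4,000-nit master simply tops out part-way up the same 0-10,000-nit scale a 10,000-nit master would use in full.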
 


It is an "enhancement" coming from a 1080p layer :)

You said "HDR is the core format. HDR10+ and DV sit on top of that as enhancements"

SDR is the core layer, not HDR. Then they add a 1080p layer for the data and dynamic metadata.
I.e. there is HDR, and then there is SDR + a 1080p layer on top to make DV :D
And as for HDR10+, well, no such thing at this time. And the Samsung TVs don't have many nits... well, they do... in their support office :)

Picture quality is a serious business :D:D
 
Thanks for the explanation guys, I assume this means that leaving the DV setting "On" on my UHD player may be a detriment to PQ?
 
SDR is core layer, not HDR.
This can't be true, as they are different colour spaces? If SDR were the core, I wouldn't have issues trying to play HDR UHDs on my 1080p TV, with the colours coming out very washed out, because it would just ignore the HDR metadata and play the "SDR core"... and I can tell you that is not what happens.
 

"The dynamic metadata generated to create the SDR grade can be used to render the Dolby Vision master on displays, which may offer a wide performance range."

https://www.dolby.com/us/en/technologies/dolby-vision/dolby-vision-white-paper.pdf

I've got both DV and HDR.
 
I feel like that quote is taken out of context and doesn't actually apply to this scenario. Can you tell me, then, why there are major issues with HDR-to-SDR colourspace conversion if the "core" is SDR and the HDR is just metadata on top? Or am I misunderstanding entirely?
 
I feel like that quote is taken out of context and doesn't actually apply to this scenario.
yes, agree it is;
the quote is from the perspective of a broadcaster having a means to translate DV to SDR, via metadata, for SDR broadcast;
however, what is on a 4K disc is an HDR10 base layer, plus metadata and the 2-bit enhancement to get you to DV.

saying that - I thought I had read Netflix distributes SDR and then metadata to get to HDR, but maybe that compromises DV image quality
(because you want to keep the stream size down & multi-purpose it for SDR & HDR)
 
I think you need to read that post yourself and comprehend what is being said. There is no 'core SDR'.

"And it's worth reiterating that HDR10 is not just metadata, you can't simply strip out the data strings and miraculously be left with a watchable SDR image."
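To put some toy numbers on that (my own sketch, not a proper colour-managed conversion): the PQ curve used by HDR10/DV and a plain SDR gamma curve put the same signal value at very different light levels, which is why a PQ stream misread as SDR looks wrong rather than simply "less bright":

```python
# PQ constants from SMPTE ST 2084
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_nits(e):
    """ST 2084 EOTF: normalised PQ signal (0..1) -> luminance in nits."""
    ep = e ** (1 / M2)
    return 10000 * (max(ep - C1, 0) / (C2 - C3 * ep)) ** (1 / M1)

def sdr_nits(e, peak=100, gamma=2.4):
    """Simple BT.1886-style power gamma on a nominal 100-nit SDR display."""
    return peak * e ** gamma

# Same signal value, two very different interpretations:
for v in (0.25, 0.5, 0.75):
    print(f"signal {v}: PQ = {pq_nits(v):.1f} nits, SDR gamma = {sdr_nits(v):.1f} nits")
```

And on top of the transfer-function mismatch there's the BT.2020-vs-BT.709 gamut difference, so you get wrong tones *and* washed-out colours, exactly as described above.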
 

He might do, but it seems you're just copy and pasting without understanding.

I'll quote Geoff D directly, word-for-word
"To that end, the specific implementation of Dolby Vision on UHD Blu-ray is a dual-layer system made up of a 3840x2160 HDR10 base layer and a 1920x1080 Dolby Vision enhancement layer,"​

This goes back to what I said in the first reply of this thread.
 
What you just wrote makes no sense.

AFAICT, nobody in this thread has suggested 1080p is an enhancement to UHD, if that's what you meant.

We're very happy to have a conversation with you about the ins and outs of HDR10 and the enhancement layers that are DV and HDR10+. If you want to reference someone else's thoughts on the subject then that's fine; either post a link, or quote them and credit the source. Just don't pass off someone else's technical descriptions as your own work, particularly when your own knowledge doesn't appear to be that strong. :)
 