10 bit HDR

Associate
Joined
10 Feb 2011
Posts
181
So I have just realised I have a problem with HDR.

I updated to Windows 10 1803 a couple of weeks ago and have only just noticed that switching to HDR puts the bit depth to 8 bit with dithering instead of 10 bit like it used to.

So my question is: how can I get Windows to go to 10 bit without having to change the settings in the Nvidia control panel to 4:2:2 10 bit every time?
 
Associate
OP
Joined
10 Feb 2011
Posts
181
Could anyone with HDR test this for me?

I have the Nvidia settings on "use default colour settings", and when I toggle on HDR and go to Settings > System > Display > Advanced display settings it shows "8 bit with dithering, RGB" for bit depth and colour format.

It used to change to YCbCr 4:2:2, 10 bit before the update.

BTW is this the right category to ask this question or should I have put it elsewhere?
 
Associate
OP
Joined
10 Feb 2011
Posts
181
I have an LG B7 OLED, which as far as I'm aware is a true 10 bit TV.

Is there any way to test it? I can choose 10 bit YCbCr 4:2:2 and that works, but would that mean it's true 10 bit?
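
For what it's worth, you can also query what Windows itself thinks it is outputting, rather than digging through the Settings pages each time. Below is a rough C++ sketch against the Win32 display config API (DisplayConfigGetDeviceInfo with DISPLAYCONFIG_GET_ADVANCED_COLOR_INFO, available from the Windows 10 SDK). It reports whether HDR/advanced colour is active and the bits per colour channel for each active display; note it only shows what the OS is sending, it can't prove the panel itself is natively 10 bit.

```cpp
// Rough sketch: report each active display's HDR (advanced colour) state and
// bits per colour channel via the Win32 display config API (Windows 10 SDK).
#include <windows.h>
#include <iostream>
#include <vector>

#pragma comment(lib, "user32.lib")

int main()
{
    UINT32 pathCount = 0, modeCount = 0;
    if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &pathCount, &modeCount) != ERROR_SUCCESS)
        return 1;

    std::vector<DISPLAYCONFIG_PATH_INFO> paths(pathCount);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(modeCount);
    if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &pathCount, paths.data(),
                           &modeCount, modes.data(), nullptr) != ERROR_SUCCESS)
        return 1;

    for (const auto& path : paths)
    {
        // Ask for the advanced colour (HDR) info of this display target.
        DISPLAYCONFIG_GET_ADVANCED_COLOR_INFO info = {};
        info.header.type      = DISPLAYCONFIG_DEVICE_INFO_GET_ADVANCED_COLOR_INFO;
        info.header.size      = sizeof(info);
        info.header.adapterId = path.targetInfo.adapterId;
        info.header.id        = path.targetInfo.id;

        if (DisplayConfigGetDeviceInfo(&info.header) == ERROR_SUCCESS)
        {
            std::cout << "Target " << path.targetInfo.id
                      << ": HDR enabled = " << (info.advancedColorEnabled ? "yes" : "no")
                      << ", bits per colour channel = " << info.bitsPerColorChannel << "\n";
        }
    }
    return 0;
}
```

If 1803 really is dropping to 8 bit with dithering when HDR is toggled, this should print 8 for that display while HDR is on.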
 
Associate
OP
Joined
10 Feb 2011
Posts
181
It's a 10 bit panel.

I thought so, thanks.

So I need to find a way of making Windows change to YCbCr 4:2:2, 10 bit when it switches to HDR. I've had a look through all the Windows settings and can't find anything relating to the HDR bit depth and colour format.
 
Associate
OP
Joined
10 Feb 2011
Posts
181
I've tried a few things and still had no luck.

Could anyone else test it out for me and see what comes up under Settings > System > Display > Advanced display settings when they play an HDR source?
 
Soldato
Joined
24 Jul 2003
Posts
3,290
Location
South East Coast
Just tried on my C7. I tend to use the Nvidia colour settings instead of 'default' anyway to manually select 10/12 bit and YCbCr etc, but I quickly put it back to default for testing purposes. It defaulted to 8 bit RGB (which was noticeably dimmer and not as colourful), and when I then turned HDR on via the Display settings in Windows it went to YCbCr 422 and 10 bit colour. Using a 1080 Ti. Not sure why yours isn't, possibly the cable? Although if it was working before, not sure why it wouldn't now.

If you select 'Use Nvidia colour settings' in the NV control panel, choose YCbCr 422 at 10 or 12 bit and save, then turn HDR on, does it still change back to 8 bit RGB?
 
Associate
OP
Joined
10 Feb 2011
Posts
181

Thanks for testing it for me :)

When I manually set it to YCbCr 4:2:2 10 bit it stays at that, as does 4:4:4 8 bit when I'm on PC mode on the TV and want full chroma. It's only on the default setting, which used to work perfectly, that it puts this 8 bit with dithering on. I'm at a loss tbh. I'll just have to do it all manually from now on, which is a pain when it used to do it all automatically, but there isn't much I can do about it lol.
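
If it helps, that manual step could in principle be scripted so the control panel doesn't have to be opened each time. The sketch below is against the public NVAPI SDK (nvapi.h), which is what the control panel drives under the hood. NvAPI_Disp_ColorControl and NV_COLOR_DATA exist in that SDK, but the exact enum and field names used here (NV_COLOR_FORMAT_YUV422, NV_BPC_10, etc.) have changed between header versions, so treat them as assumptions to verify against your copy of the header rather than a drop-in answer.

```cpp
// Hedged sketch: force YCbCr 4:2:2 at 10 bpc on the primary display via NVAPI.
// Assumes the public NVAPI SDK; the enum/field names below should be verified
// against the nvapi.h version you actually build with.
#include <nvapi.h>
#include <cstdio>

int main()
{
    if (NvAPI_Initialize() != NVAPI_OK)
        return 1;

    // Use the display that holds the GDI primary (usually the TV in this setup).
    NvU32 displayId = 0;
    if (NvAPI_DISP_GetGDIPrimaryDisplayId(&displayId) != NVAPI_OK)
        return 1;

    NV_COLOR_DATA colorData = {};
    colorData.version          = NV_COLOR_DATA_VER;
    colorData.size             = sizeof(colorData);
    colorData.cmd              = NV_COLOR_CMD_SET;
    colorData.data.colorFormat = NV_COLOR_FORMAT_YUV422;  // 4:2:2 (assumed enum name)
    colorData.data.bpc         = NV_BPC_10;               // 10 bit per channel (assumed enum name)
    // Colorimetry / dynamic range members are left zero-initialised here;
    // fill them in per the defaults in your nvapi.h if the driver rejects the call.

    NvAPI_Status status = NvAPI_Disp_ColorControl(displayId, &colorData);
    std::printf("NvAPI_Disp_ColorControl returned %d\n", (int)status);

    NvAPI_Unload();
    return status == NVAPI_OK ? 0 : 1;
}
```

A compiled exe like this could then be run from a shortcut or scheduled task whenever HDR gets toggled, instead of going through the control panel by hand.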
 
Soldato
Joined
24 Jul 2003
Posts
3,290
Location
South East Coast
Np. Windows and HDR are bad enough for randomness, but to be fair, if you just manually set 10/12 bit YCbCr 4:2:2 in the Nvidia control panel it will be better quality than RGB 8 bit anyway, so just set it once and you're done! :p
 