
Enabling Full HDR via Nvidia Control Panel

I have an LG OLED TV with 10-bit colour and HDR support.

The TV's HDMI input is fed through a soundbar which supports full Dolby Vision/HDR/4K etc.

The setup works fine with an Apple TV and other devices.

For some reason, in Nvidia Control Panel it won't let me select RGB colour or 10/12-bit colour depth, and only allows limited dynamic range.

I'm running Windows 10, fully up to date, with the latest Nvidia drivers. I've tried a 1080 Ti and a Titan Xp with no luck. The option did appear once and no longer does.

Does anyone have any ideas?
 
So should I set it to 4K 60Hz in the resolution settings and then tell Nvidia Control Panel to use the default colour settings?

And should I then enable HDR in the Windows 10 display settings or not? I'm unsure how to get the best picture.

If I turn on YCbCr 4:2:2 and set the output colour depth to 12-bit (I can't select 10-bit and don't want to use 8-bit), it caps the refresh rate at 30Hz.

When I set it to 60Hz it drops back to 30Hz.
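
For context on the 30Hz cap: it's a bandwidth limit, not a settings conflict. A rough back-of-envelope check in Python (a sketch, not exact HDMI signalling: the 594 MHz pixel clock for 4K60 and the ~14.4 Gbit/s effective HDMI 2.0 data rate are the published figures, but real links pack pixels slightly differently, e.g. 4:2:2 on HDMI always travels in a 12-bit container):

    # Rough HDMI 2.0 bandwidth check for 4K @ 60Hz (simplified sketch).
    PIXEL_CLOCK_4K60 = 594e6    # Hz, 3840x2160@60 incl. blanking (CTA-861)
    HDMI20_EFFECTIVE = 14.4e9   # bit/s usable after 8b/10b encoding

    # Average components per pixel for each chroma format:
    # RGB/4:4:4 -> 3, 4:2:2 -> 2, 4:2:0 -> 1.5
    COMPONENTS = {"RGB/4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

    for fmt, depth in [("RGB/4:4:4", 8), ("RGB/4:4:4", 12),
                       ("4:2:2", 12), ("4:2:0", 12)]:
        rate = PIXEL_CLOCK_4K60 * depth * COMPONENTS[fmt]
        verdict = "fits" if rate <= HDMI20_EFFECTIVE else "exceeds"
        print(f"{fmt} {depth}-bit: {rate / 1e9:5.2f} Gbit/s -> {verdict} HDMI 2.0")

This prints roughly 14.26 Gbit/s for 8-bit RGB (just fits), 21.38 Gbit/s for 12-bit RGB (exceeds, hence the forced 30Hz), and 14.26 / 10.69 Gbit/s for 12-bit 4:2:2 / 4:2:0 (both fit at 60Hz), which matches what the control panel is allowing.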
 
There is no official 10-bit colour support on GeForce. If you want 10-bit colour output, turn to the professional products or to a Radeon graphics card.
 
Ok, so somehow I now have desktop colour depth at 32-bit, output colour depth at 12-bit, output colour format at RGB and output dynamic range at Full.

Now the only thing I can't get is 60Hz?

I think the Windows and Nvidia HDR settings conflict somehow.
 
Pretty sure 10-bit has worked for years in DirectX.

No, the default Nvidia setting is limited RGB (16-235).
That isn't even full 8-bit colour output.

Ok, so somehow I now have desktop colour depth at 32-bit, output colour depth at 12-bit, output colour format at RGB and output dynamic range at Full.

Now the only thing I can't get is 60Hz?

32-bit is 8-bit multiplied by 4 (red, green, blue and alpha). You would need 48-bit to have a 12-bit colour depth.
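
A quick illustration of that arithmetic (just Python sums, nothing Nvidia-specific):

    # "Desktop colour depth" counts bits across all four channels of a
    # pixel, not per channel, so 32-bit desktop depth is still 8-bit colour.
    CHANNELS = ("red", "green", "blue", "alpha")
    print(8 * len(CHANNELS))     # 32 -> the usual "32-bit" desktop depth
    print(12 * len(CHANNELS))    # 48 -> needed for 12-bit per channel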
 
I had to use the Mini DisplayPort input on my monitor to get 10-bit full-range HDR. It looks a lot better than the HDMI input, which seems to top out at 8-bit 4:2:2. I think I have the only screen in the world that can do this, though. I screen-mirror so I can flip quickly between the maximum possible on DP and HDMI, and DP with 10-bit uncompressed is way richer.
 
Ok thanks.

So why doesn't it offer 12-bit?

What do I need to do to get the best possible picture at 60Hz?

12-bit would be there if you had a monitor with 12-bit colour, i.e. 68.71 billion colours.
I haven't seen such a monitor in existence, let alone on sale for the mainstream.
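
For what it's worth, that 68.71 billion figure is just 2 to the power of 36 (12 bits each for red, green and blue):

    # 12 bits/channel x 3 channels = 36 bits of colour
    print(2 ** (12 * 3))    # 68719476736, i.e. ~68.71 billion colours
    print(2 ** (10 * 3))    # 1073741824, ~1.07 billion for 10-bit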

There is no consistent support from Microsoft and Windows, Nvidia, and the monitor manufacturers.
In order to push something forward, they need some type of standard, like HDR.
But HDR requires standard 10-bit colour, which Nvidia doesn't offer (or if they do, it happens through hacks and on select, rare monitors).

If I were you, I would call Microsoft's customer service and Nvidia's customer service. Explain to them exactly what devices you have and what you want, and they will tell you whether it's possible at all.
 
If I turn on YCbCr 4:2:2 and set the output colour depth to 12-bit (I can't select 10-bit and don't want to use 8-bit), it caps the refresh rate at 30Hz.

YCbCr 4:2:2 at 10 bpc works fine for me in the Nvidia control panel on my LG OLED. I've played 10-bit HDR films via my PC and it correctly outputs full HDR.

If so, the OP needs to check his HDMI cable, maybe swap it for another one, and try all the other HDMI ports.
 
HDR is broken on Nvidia cards, unfortunately, and whoever says it's working fine for them is either lying or can't see the awful colour banding.
The best option (still broken) is 12-bit 4:2:0, as this mode suffers the least banding.
 
HDR is broken on Nvidia cards, unfortunately, and whoever says it's working fine for them is either lying or can't see the awful colour banding.
The best option (still broken) is 12-bit 4:2:0, as this mode suffers the least banding.

I've tried it and the screen just randomly flickers. It's a struggle getting 4K @ 60Hz, to be honest. I had the issue with my GTX 1080 and now with a laptop GTX 1070, on a Samsung MU6400 49".
 