
Enabling Full HDR via Nvidia Control Panel

Soldato
Joined
26 Aug 2012
Posts
4,332
Location
North West
I have an LG OLED TV with 10 bit colour and HDR support.

The TV HDMI input is passed through a soundbar which supports full Dolby Vision/HDR/4K etc.

The setup works fine with Apple TV and other devices.

For some reason, the Nvidia Control Panel doesn't let me select RGB colour or 10/12-bit colour, and only allows limited dynamic range.

Running Windows 10, fully up to date, with the latest Nvidia drivers. I've tried with a 1080 Ti and a Titan Xp, with no luck. The option did appear once and no longer appears.

Does anyone have any ideas?
 
Soldato
OP
Joined
26 Aug 2012
Posts
4,332
Location
North West
So should I set the resolution to 4K 60Hz and then tell Nvidia Control Panel to use the default colour settings?

And should I enable HDR in the Windows 10 display settings or not? I'm unsure how to get the best picture.

If I turn on YCbCr422 and set the output colour depth to 12-bit (I can't set 10-bit and don't want to use 8-bit), it caps the refresh rate at 30Hz.

When I set it back to 60Hz, it drops straight back to 30Hz.
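As a rough sanity check on why that happens (a sketch, assuming the standard CTA-861 4K timing of 4400×2250 total pixels including blanking, and roughly 14.4 Gbit/s of usable HDMI 2.0 data rate after encoding overhead):

```python
# Rough HDMI 2.0 bandwidth check -- a sketch, not authoritative.
HDMI20_USABLE_GBPS = 14.4           # 18 Gbit/s raw minus 8b/10b overhead
TOTAL_PIXELS = 4400 * 2250          # active 3840x2160 plus blanking

def data_rate_gbps(refresh_hz, bits_per_pixel):
    """Required data rate in Gbit/s for a given mode."""
    return TOTAL_PIXELS * refresh_hz * bits_per_pixel / 1e9

# Bits per pixel: RGB/4:4:4 carries 3 full samples per pixel;
# 4:2:2 halves the chroma samples, 4:2:0 quarters them.
modes = {
    "4K60 RGB 8-bit":    (60, 8 * 3),     # 24 bpp
    "4K60 RGB 12-bit":   (60, 12 * 3),    # 36 bpp
    "4K30 RGB 12-bit":   (30, 12 * 3),    # halved pixel clock
    "4K60 4:2:2 12-bit": (60, 12 * 2),    # 24 bpp, same as 8-bit RGB
    "4K60 4:2:0 12-bit": (60, 12 * 1.5),  # 18 bpp
}

for name, (hz, bpp) in modes.items():
    rate = data_rate_gbps(hz, bpp)
    verdict = "fits" if rate <= HDMI20_USABLE_GBPS else "exceeds HDMI 2.0"
    print(f"{name}: {rate:.1f} Gbit/s ({verdict})")
```

4K60 12-bit RGB comes out at roughly 21 Gbit/s, which doesn't fit over HDMI 2.0, so the driver falls back to 30Hz; 12-bit 4:2:2 fits in the same envelope as 8-bit RGB, which is why the TV will accept it at 60Hz.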
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
There is no official 10-bit colour support with GeForce. If you want 10-bit colour output, you need the professional (Quadro) products or a Radeon graphics card.
 
Soldato
OP
Joined
26 Aug 2012
Posts
4,332
Location
North West
OK, so somehow I now have desktop colour depth at 32-bit, output colour depth at 12-bit, output colour format at RGB and output dynamic range at Full.

Now the only thing I can't get is 60Hz?

I think the Windows and Nvidia HDR settings conflict somehow.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
Pretty sure 10-bit has worked for years in DirectX.

No, the default nVidia setting is limited RGB (16-235).
That isn't even full 8-bit colour output.

OK, so somehow I now have desktop colour depth at 32-bit, output colour depth at 12-bit, output colour format at RGB and output dynamic range at Full.

Now the only thing I can't get is 60Hz?

32-bit desktop colour depth is 8 bits per channel multiplied by 4 channels (red, green, blue and alpha). You would need 48-bit for a 12-bit-per-channel desktop.
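That channel arithmetic is easy to verify with a trivial sketch:

```python
# Desktop colour depth = bits per channel x number of channels.
# A "32-bit" desktop is 8 bpc x 4 channels (R, G, B, alpha),
# so a 12 bpc desktop would need 12 x 4 = 48 bits per pixel.
for bpc in (8, 10, 12):
    print(f"{bpc} bits/channel x 4 channels = {bpc * 4}-bit desktop depth")
```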
 
Soldato
Joined
30 Jul 2006
Posts
3,076
Location
4090 on 850w = BOOM
I had to use the Mini DisplayPort input on my monitor to get 10-bit full-range HDR. It looks a load better than the HDMI input, which seems to top out at 8-bit 4:2:2. I think I have the only screen in the world that can do this, though. I screen-mirror so I can flip quickly between the maximum possible on DP and on HDMI, and DP with the 10-bit uncompressed signal is way richer.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
Ok thanks.

So why doesn't it present 12-bit?

What do I need to do to get the best possible picture at 60Hz?

12-bit would be there if you had a monitor with 12-bit colour, i.e. about 68.7 billion colours.
I haven't seen such a monitor in existence, let alone on sale for the mainstream.
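That 68.7 billion figure is just 2 raised to (12 bits × 3 colour channels); a quick check:

```python
# Displayable colours = 2 ** (bits per channel * 3 colour channels);
# the alpha channel adds no extra colours.
for bpc in (8, 10, 12):
    print(f"{bpc}-bit: {2 ** (bpc * 3):,} colours")
# 8-bit:  16,777,216     (~16.8 million)
# 10-bit: 1,073,741,824  (~1.07 billion)
# 12-bit: 68,719,476,736 (~68.7 billion)
```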

There is no consistent support between Microsoft and Windows, nVidia, and the monitor manufacturers.
In order to push something forward, they need some type of standard, like HDR.
But HDR requires standard 10-bit colour, which nVidia doesn't offer (or if they do, it happens through hacks and only on select, rare monitors).

If I were you, I would call Microsoft's customer service and nVidia's customer service. Explain to them exactly what devices you have and what you want, and they will tell you whether it's possible at all.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
If I turn on YCbCr422 and set the output colour depth to 12-bit (I can't set 10-bit and don't want to use 8-bit), it caps the refresh rate at 30Hz.

YCbCr422 10 bpc works fine for me in the NV control panel on my LG OLED. I've played 10-bit HDR films via my PC and it's correctly outputting in full HDR.

If so, the OP needs to check his HDMI cable, maybe swap it for another one, and try all the other HDMI ports.
 
Associate
Joined
31 Dec 2008
Posts
2,284
HDR is broken on Nvidia cards, unfortunately, and whoever says it's working fine for them is either lying or can't see the awful colour banding.
The best option (still broken) is 12-bit 4:2:0, as this mode suffers the least banding.
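For what it's worth, the banding argument is simple quantisation arithmetic (a sketch, not a measurement of any particular card):

```python
# Fewer levels per channel means larger brightness steps, and
# larger steps mean more visible banding on a slow gradient.
for bpc in (8, 10, 12):
    levels = 2 ** bpc
    print(f"{bpc}-bit: {levels} levels/channel, "
          f"step = {100 / (levels - 1):.3f}% of full range")
```

A signal quantised to 8 bits has steps roughly four times coarser than at 10 bits, which is where visible banding on smooth gradients comes from.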
 
Soldato
Joined
3 Jan 2006
Posts
24,953
Location
Chadderton, Oldham
HDR is broken on Nvidia cards, unfortunately, and whoever says it's working fine for them is either lying or can't see the awful colour banding.
The best option (still broken) is 12-bit 4:2:0, as this mode suffers the least banding.

I've tried it and the screen just randomly flickers. It's a struggle getting 4K @ 60Hz, to be honest. I had the issue with my GTX 1080 and now with a laptop GTX 1070, on a Samsung MU6400 49".
 