10bpc vs 8bpc?

I just replaced my old screen with this shiny new one and found a difference in the settings: the new one has what I think is 10-bit colour depth versus 8-bit on the old one.

Switching between 8-bit and 10-bit in the control panel, I can't say the difference is obvious at all; maybe I'm blind :D

Can someone explain what benefits, if any, I should be getting?

Thanks :)
 
It would only (maybe) make a difference if you had 10-bit content to send to the screen, along with a 10-bit-capable application, graphics card, driver, and operating system. Unless you've got a professional-grade graphics card, 10-bit output isn't offered for pro applications like Photoshop, so you wouldn't be able to use it there.

If you've got a general consumer GeForce or AMD card, it almost certainly supports 10-bit output for gaming, but you'd still need an actual 10-bit game to make any use of it. Even then, it's often hard to spot any real difference compared with 8-bit :)
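To put some numbers on the difference being discussed: the jump from 8-bit to 10-bit is about how finely each colour channel is subdivided, which mostly shows up as reduced banding in smooth gradients. A small sketch (the function names here are just illustrative):

```python
# Per-channel shade counts for 8-bit vs 10-bit colour depth.
def shades(bits_per_channel):
    """Number of distinct levels one colour channel can represent."""
    return 2 ** bits_per_channel

levels_8 = shades(8)     # 256 levels per channel
levels_10 = shades(10)   # 1024 levels per channel

# Total colours across the three R, G, B channels.
total_8 = levels_8 ** 3     # ~16.7 million colours
total_10 = levels_10 ** 3   # ~1.07 billion colours

# Step size across a 0.0-1.0 gradient: smaller steps mean subtler
# transitions, hence less visible banding in skies, fog, shadows etc.
print(f"8-bit:  {levels_8} levels, step {1 / (levels_8 - 1):.6f}")
print(f"10-bit: {levels_10} levels, step {1 / (levels_10 - 1):.6f}")
```

The fourfold increase in levels per channel is real, but as noted above, without a 10-bit source the extra precision goes unused, which is one reason the difference can be hard to see.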

Ah, I see. Thank you for that :)
 
It's also often the case that many "10-bit" panels are actually 8-bit + FRC (frame rate control), a dithering technique that alternates slightly different shades on successive frames to create the illusion of more colours. This seems to be quite permissible, i.e. calling a panel 10-bit when technically it isn't... well, not true 10-bit anyway.
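The FRC idea described above can be sketched roughly like this: an 8-bit panel approximates an in-between 10-bit shade by flickering between two adjacent 8-bit levels so the time-averaged brightness lands near the target. This is a toy illustration, not any real panel's algorithm; the function name and 4-frame cycle are assumptions:

```python
# Toy sketch of FRC (frame rate control) temporal dithering.
def frc_frames(target_10bit, n_frames=4):
    """Map a 10-bit level (0-1023) to a sequence of 8-bit levels (0-255)
    whose time-average approximates the target shade."""
    lo = target_10bit // 4        # nearest 8-bit level at or below target
    remainder = target_10bit % 4  # how far the target sits above `lo`
    hi = min(lo + 1, 255)         # adjacent brighter level, clamped to 8-bit
    # Show the brighter level on `remainder` of every 4 frames.
    return [hi if i < remainder else lo for i in range(n_frames)]

# 10-bit level 514 sits halfway between 8-bit levels 128 and 129,
# so the panel alternates them: [129, 129, 128, 128] averages to 128.5.
frames = frc_frames(514)
average = sum(frames) / len(frames)
```

Because the eye integrates the rapid alternation, the result looks like an intermediate shade, which is how an 8-bit panel gets marketed as 10-bit.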

Right, a bit like contrast ratios.
 