10-bit colour... Yay or Nay?

Associate
Joined
7 Dec 2010
Posts
49
Location
Cornwall
I'll be getting a 10-bit Eizo CG223W monitor in the coming weeks to replace a dead NEC Spectraview, but don't currently have a setup that will output 10-bit colour. Does anyone use or has anyone used a 10-bit display (with a 10-bit graphics card such as the ATI FireGL or Nvidia Quadro)? Is the difference worth the cost of upgrading the graphics cards in our computers? The monitor is for a large format printing business so colour quality is of paramount importance. We are also using the 8-bit Eizo CG222W and the quality is excellent. We are hoping the new one will follow suit.

Thanks,

Ben.
 
Man of Honour
Joined
12 Jan 2003
Posts
20,568
Location
UK
Well, a true 10-bit workflow is still very rare. Even if you have a supporting graphics card, monitor and interface, you also have the limitations of the operating system and software to think about. Apart from at the very high end, I'm not sure it's really practical or even attainable at the moment.

Also, although the end result is very similar, I don't believe the S-PVA panel in the CG223W is a true 10-bit module, or in fact even an 8-bit+FRC module like most of the "10-bit" models use (the NEC PA series etc.). There are very few "true" full 10-bit modules anyway. I expect the panel itself is an 8-bit module, and rather than the additional FRC being done at the panel level, it's being done on the electronics side of the monitor instead. Like some of EIZO's other models, the screen probably supports 10-bit input over DisplayPort, which is then processed by an ASIC (developed by EIZO NANAO) containing the FRC circuit for displaying 10-bit input data on the 8-bit LCD panel.
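To give a rough idea of what FRC (frame rate control) is doing, here's a minimal sketch of temporal dithering: a 10-bit level is approximated on an 8-bit panel by alternating between two adjacent 8-bit levels over successive frames, so the time-averaged output lands between them. The function name and frame count are illustrative only, not how EIZO's ASIC actually works.

```python
def frc_frames(value_10bit, num_frames=4):
    """Map a 10-bit level (0-1023) to a sequence of 8-bit levels (0-255)
    whose temporal average approximates value_10bit / 4."""
    # The two extra bits become a fraction in quarters of an 8-bit step.
    base, frac = divmod(value_10bit, 4)
    base = min(base, 255)
    frames = []
    for i in range(num_frames):
        # Show the higher adjacent level on `frac` out of every 4 frames.
        frames.append(min(base + 1, 255) if i < frac else base)
    return frames

# Example: 10-bit level 514 sits between 8-bit levels 128 and 129.
print(frc_frames(514))  # -> [129, 129, 128, 128], averaging 128.5
```

The eye integrates the alternation into an intermediate shade, which is why an 8-bit panel with FRC can look practically identical to a native 10-bit one.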

From a technical point of view the result is practically the same on an 8-bit display, regardless of whether the FRC sits on the electronics side of the monitor or on the LCD-panel side. More important for the quality of the displayed data is the LUT-preparation process (quality of the algorithm, depth of the input LUT and output LUT, etc.). In this case there is at least a 12-bit LUT and 16-bit internal processing, so I'm sure it's of a high standard. Just some technical points to be aware of, I guess :)
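As a small illustration of why LUT depth matters (this is a generic sketch, not EIZO's actual pipeline): pushing 8-bit input through a tone curve quantised to only 8 bits collapses some input levels together (visible as banding), whereas a deeper output LUT keeps every input level distinct for the downstream processing.

```python
def gamma_lut(in_levels, out_bits, gamma=2.2):
    """Build a LUT mapping `in_levels` input codes through a gamma curve
    to output codes of `out_bits` depth."""
    out_max = (1 << out_bits) - 1
    return [round(((v / (in_levels - 1)) ** (1 / gamma)) * out_max)
            for v in range(in_levels)]

# Count how many of the 256 input levels survive as distinct outputs.
for bits in (8, 12):
    lut = gamma_lut(256, bits)
    print(f"{bits}-bit output LUT: {len(set(lut))} of 256 levels distinct")
```

With an 8-bit output LUT the shallow top of the curve maps neighbouring inputs to the same code, while the 12-bit LUT preserves all 256, which is exactly what the high-bit internal processing is buying you during calibration.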
 
Associate
OP
Joined
7 Dec 2010
Posts
49
Location
Cornwall
Thanks very much for the detailed reply. This helped a lot. I don't think I'll concern myself too much with 10-bit colour just yet. From what I understand now, the expense incurred would not justify the improvement in quality. I'll still be getting the monitor as it has many other desirable features and my experience thus far with the previous model, the CG222W, has been very positive. When the time comes to build our next computer, I'll probably incorporate an ATI FireGL or Nvidia Quadro anyway, but we definitely won't get a new computer just because we have a new monitor.

Thanks again,

Ben.
 
Associate
Joined
8 Oct 2004
Posts
2,283
In your case it may be worth it one day down the line, but perhaps not right now.

One thing artists should consider is that if you switch over to 10-bit, your end clients will probably still be on 8-bit (most likely on shoddy screens), so even if your work looks fantastic at your end, it's still going to look worse at theirs. This applies to people like me who work purely digitally and don't print very often. The same goes for colour calibration, although that's crucial, obviously.
 