
10-bit colour support

Associate · Joined: 15 Mar 2010 · Posts: 967
Okay, so I'll soon be getting the U2711, a monitor that supports 10-bit colour (over a billion colours). However, I'm assuming not everything will support the full range of colours this monitor offers, and since I want to get the most out of it, I need to know: what video cards and cables should I be using to do so?
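For what it's worth, the "over a billion colours" figure follows directly from the bit depth - three channels at 10 bits each versus the usual 8:

```python
# Distinct colours for 8-bit vs 10-bit per channel (R, G, B)
colours_8bit = (2 ** 8) ** 3    # 16,777,216  (~16.7 million)
colours_10bit = (2 ** 10) ** 3  # 1,073,741,824 (~1.07 billion)

print(colours_8bit)   # 16777216
print(colours_10bit)  # 1073741824
```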
 
I really wouldn't worry about this. Software support is needed as well as hardware support, and it seems to be severely lacking. That's not to say things won't change at some point in the future, though. 10-bit-per-channel colour (or 'Deep Color') is only supported over HDMI 1.3 and above, DisplayPort and dual-link DVI. Currently I believe only the Nvidia Quadro range and the ATI Radeon 5xxx series and above support this colour depth. I could be wrong, but what I do know is that the software side is a long way behind. First off, 'they' need to sort out wide-gamut support more broadly, and that is still some way off. By the time software catches up, your U2711 will be pretty much obsolete technology, I'd imagine - I reckon OLED will be a driving force behind these changes in colour handling.
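To make "10 bits per channel" concrete: framebuffers commonly pack the three 10-bit channels plus 2 bits of alpha into one 32-bit word (the RGB10A2-style layouts). The exact channel order varies between APIs, so treat this as an illustrative sketch of the depths involved rather than any particular format's definition:

```python
# Sketch: pack 10-bit R/G/B channels (0..1023) and 2-bit alpha (0..3)
# into a single 32-bit word, RGB10A2-style. Channel order is illustrative.
def pack_rgb10a2(r, g, b, a=3):
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
    return (a << 30) | (b << 20) | (g << 10) | r

pixel = pack_rgb10a2(1023, 512, 0)  # full red, half green, no blue
print(hex(pixel))
```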
 
But say I did go out of my way to get the necessary hardware - all it would take is either HDMI 1.3, DVI-D or DisplayPort, plus a 5xxx-or-higher card or a Quadro? Just confirming.
 
All of Nvidia's GT200-and-higher-based cards support 10-bit colour (note the GTS 250 isn't included in this, as it's G92b-based). However, the 65nm ones definitely don't support it on anything other than DisplayPort (which I don't think is present on any of the GeForce models), and I believe on later cards it's left to the AIB to support it on the available outputs. AFAIK none have added support for it over DVI-D or HDMI, but there's little documentation - it's not something most people seem to know anything about.
 

So DisplayPort is the most reliable connection for 10-bit on later cards?
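A rough back-of-the-envelope on why the fatter links keep coming up here: at the U2711's native resolution, 10-bit pixel data alone is around 6.6 Gbit/s, which is more than single-link DVI can carry (roughly 3.96 Gbit/s of pixel data at its 165 MHz limit) - hence dual-link DVI or DisplayPort. Figures below ignore blanking intervals, so the real link requirement is a bit higher:

```python
# Back-of-the-envelope bandwidth for 10-bit colour at 2560x1440 @ 60 Hz.
# Ignores blanking intervals, so actual link bandwidth needed is higher.
width, height, refresh = 2560, 1440, 60   # U2711-class panel
bits_per_pixel_10 = 3 * 10                # 10 bits per R/G/B channel

gbit_per_s = width * height * refresh * bits_per_pixel_10 / 1e9
print(round(gbit_per_s, 2))  # 6.64 (Gbit/s of active pixel data)
```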
 