U2711 1.07 billion colours?

Try emailing Dell. Will you actually notice a difference, though? Even if it could only use 500,000,000 colours it would most likely be good enough. Ask yourself whether it is worth the extra money upgrading the monitor/GPU.

I did not buy a U2711 or GTX 570 because I am desperate for 1.07 billion colours.

The 'extra money' you are talking about paid for more performance and a better-quality panel with a higher resolution. The monitor was 35% off Dell's RRP: £570 delivered.

U3011 (2560*1600) £1200

HP ZR30w (2560*1600) £1000

Hazro HZ30w (2560*1600) £800 < I really wanted this, but it was slightly over my budget.

Apple 27" (2560*1440) £850

I think the monitor was a pretty good deal for anyone in the market for a monitor of that spec.

I am looking to see if I can take advantage of some feature that came with the monitor, not to see if this feature was a good basis for my purchase.
 
To answer your question Cythx, you need DisplayPort to get 10-bit colour processing.

Thank you (: I thought this would be the case.

Is there any perceivable difference between 16.7 mil and 1.07 bil colours for most people? I don't want to be missing out on much if I can just get a 6970 instead.
 
I've heard that it's noticeable and quite nice, but I haven't seen it myself.
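For anyone wondering where those two figures come from, they're just the per-channel bit depth cubed: 8 bits per channel gives 256^3 ≈ 16.7 million colours, while 10 bits gives 1024^3 ≈ 1.07 billion. A quick Python sketch of the arithmetic:

```python
# Total colour counts for different per-channel bit depths.
for bits in (6, 8, 10):
    levels = 2 ** bits          # levels per channel (R, G, B)
    total = levels ** 3         # every possible R/G/B combination
    print(f"{bits}-bit: {levels} levels/channel -> {total:,} colours")
# 6-bit: 64 levels/channel -> 262,144 colours
# 8-bit: 256 levels/channel -> 16,777,216 colours
# 10-bit: 1024 levels/channel -> 1,073,741,824 colours
```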
 
HDMI 1.3 onwards does indeed support it :)
but the monitor doesn't support its native resolution over HDMI :(

You'll need a professional video card, as far as I can remember. I've had a 5870 hooked up via DisplayPort and it wouldn't display 10-bit colours; at least I couldn't find an option to enable it.

Shame that dual-link DVI does not support 10-bit colour.
 

I'm pretty sure the 5870 does not support 10-bit colour. This is a limitation of the card.
 
"Support for HDMI 1.4a including GPU accelerated Blu-ray 3D support, x.v.Color, HDMI Deep Color, and 7.1 digital surround sound."

Is written under 'features' for all the GTX 5xx cards, so I'm pretty sure I am safe.
 
Thank you (: I thought this would be the case.

Is there any perceivable difference between 16.7 mil and 1.07 bil colours for most people? I don't want to be missing out on much if I can just get a 6970 instead.

You're not. Your monitor can't even display 1.07 billion colours; it's an 8-bit panel with dithering. Marketing strikes again, I'm afraid. Just stick with 16.7 million, it's nothing like the jump from 6-bit to 8-bit.
 
Where did you find that this monitor cannot display 1 billion colours? The product page says it does 1 billion colours, and I don't see how that could be read any other way, unlike the ambiguity of saying a monitor is 'capable of 10-bit colour'.

'Incredible performance & clarity: Enjoy up to 2560 x 1440 (WQHD) resolution, 1.07 billion colors'. Are they lying about both the resolution and the colour, or just the colour?

'Besides having a higher color gamut and different backlighting technology, Dell uses 12-bit internal color processing with the ability to output 10-bit color. That means you can get 1024 levels of grey instead of just 256, reducing the amount of banding present in certain situations. 24-bit vs. 30-bit color also means you get a color palette of 1.07 billion instead of 16.7 million' - Anandtech
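The banding point in that quote is easy to demonstrate: quantising a smooth gradient to fewer grey levels produces visible steps. A rough Python sketch (pure standard library, just counting the distinct output levels rather than rendering anything; the gradient and sample count are made up for illustration):

```python
# Quantise a smooth 0.0-1.0 grey ramp at 8-bit and 10-bit depth
# and count how many distinct grey levels survive.
def quantise(value, bits):
    levels = 2 ** bits - 1           # max code value (255 or 1023)
    return round(value * levels)     # nearest representable level

ramp = [i / 9999 for i in range(10000)]   # smooth synthetic gradient
for bits in (8, 10):
    codes = {quantise(v, bits) for v in ramp}
    print(f"{bits}-bit: {len(codes)} distinct grey levels")
# 8-bit: 256 distinct grey levels
# 10-bit: 1024 distinct grey levels
```

Four times as many grey steps across the same brightness range is exactly the 1024-vs-256 figure Anandtech quotes, which is why banding in subtle gradients is less visible at 10 bits.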
 
2 x no.
The first no is because the GTX 570 doesn't support 30-bit output; see the Quadro (FX/CX) series.
The second no is because you would need special software to render a 30-bit colour picture. It's not a case of just connecting it up and seeing the difference.
So if you haven't heard of 30-bit software, you simply don't need it.

BTW, the human eye can distinguish about 10 million colours (of course, some eyes are better than others):

http://en.wikipedia.org/wiki/Human_eye
 