Help me choose a new 27" IPS monitor

Thinking about getting a new monitor. Got 24" and 22" TN panels at the moment but I do a lot of graphics work and the viewing angles are getting annoying.

Was thinking about going up to a 27", which basically means the U2711 or the HZ27WA.

Is there anything about the Dell that would justify the extra £200? Or is there anything else I should be looking at?
 
The main reason to get the Dell is that it's a Dell (great customer service, a nice pixel fault policy, pretty reliable products).

The newly reformed Hazro has much less of a track record and their pixel policy is much less generous (they will tolerate quite a few pixel faults before accepting a monitor as faulty for repair/replacement). So it's a bit more of a risk.

If you do go for a Hazro then I would suggest the even cheaper (£375) HZ27WC. It uses the same panel as the WA but doesn't use a scaler or have as many inputs - and if you look at this review it still calibrates very well and has even less input lag than the WA, so it will feel a bit more responsive.
 
The lack of inputs is the only issue with the lower end model. I would be using the monitor for my PS3 too.

As far as the dead pixel thing goes, I suppose I can always send them back under DSR? Although the postage can't be cheap D:

What does the scaler do? Upscale images?
 

Yea, the scaler scales the images - if you are using the PS3 (which outputs at a maximum res of 1920x1080) then you certainly need your 2560x1440 screen to have a scaler.
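
To put rough numbers on that, here's a tiny Python sketch of the mapping a scaler has to do when a 1080p signal lands on a 1440p panel (just the idea, nothing to do with the monitor's actual hardware):

# Why a 1920x1080 signal needs a scaler on a 2560x1440 panel: every panel
# pixel has to be mapped back to a source pixel, and the ratio isn't 1:1.
SRC_W, SRC_H = 1920, 1080      # PS3 output
DST_W, DST_H = 2560, 1440      # panel's native resolution

scale_x = DST_W / SRC_W        # 1.333...
scale_y = DST_H / SRC_H        # 1.333... (same aspect ratio, so x and y match)

def nearest_source_pixel(dst_x, dst_y):
    # Map a panel pixel back to the source pixel it should copy (nearest-neighbour).
    return int(dst_x / scale_x), int(dst_y / scale_y)

# e.g. panel pixel (2559, 1439) comes from source pixel (1919, 1079)
print(nearest_source_pixel(2559, 1439))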

As you say, you can always send it back via DSR, even a few returns at £20 a pop (guessing) would still be a fair bit cheaper than the U2711 - although quite a bit more hassle.
 
I could just continue using the PS3 with my 24" TN

Decisions are hard D:

TBH, that wouldn't be a bad idea. It would save you £105, and you would be playing on a monitor whose native resolution (presumably) matches the 1080p input signal (albeit with console upscaling in some games), instead of having the image scaled to fit a very different resolution panel.
 
I'm in the same position, trying to decide on a 27".

I was trying to work out the difference in image quality between the cheaper Hazro and the more expensive one (8-bit vs 10-bit).

Would it be noticeable?
 
I think the main difference is in gradients: you shouldn't see as much (or any) banding with 10-bit.

On the other hand, I don't think these are "true" 10-bit panels - they use some kind of filtering or something? Don't know how much difference that makes though.

Edit: Just read this, thought you might find it useful:

To start, 8-bit means that for red (R), green (G), and blue (B), the values 0 to 255 can be represented. For 10-bit, the RGB values can be from 0 to 1023. This means that per component, 10-bit is 4 times as detailed as 8-bit. Therefore, if you had a raw image with 10-bit depth, it would have a color palette 64 times as large (4x4x4=64) to represent the image on your screen. In the case of high definition video, with the exception of footage from EXTREMELY high end cameras (starting with the RedOne cinema camera and upward), you will never come across media of this scale. The reason is that properly transmitting such a signal would require 3.125 gigabits per second. TV networks with half-million-dollar cameras broadcast sports from the arena to the network at less than a third of this speed, with quality loss.

The piddly 50mbit/sec that you get from high definition formats would almost definitely not benefit from higher bit depths as it already is stretching itself quite far by employing 150:1 compression to begin with.

The case where 10 bits on a consumer screen makes a big difference is in upscaling video from a lower resolution. Before scaling, each color channel (red, green, blue) is multiplied by 4 to make it a 10-bit value to begin with. Then the image is scaled up by finding values in between "neighboring" pixels and inserting them between the existing pixels.

If you work in 8-bit and you have a pixel with the value 1 and the pixel next to it has the value 2, then if you were to double the size of the image, the pixel inserted in between is calculated by adding the two values together and dividing by two. So (1 + 2) / 2 = 1.5.

1.5 is not a valid pixel value, so it would become either 1 or 2 (unless the scaling system is smart enough to use a more complex calculation that takes other pixels into account as well).

Using the same values in 10-bit, and therefore multiplying the 1 and 2 each by 4, we get the values 4 and 8 to start with instead. So (4 + 8) / 2 = 12 / 2 = 6. 6 is obviously a valid value, so instead of the 8-bit version, which would be either 1, 1, 2 or 1, 2, 2, we have a higher quality scaling of 4, 6, 8.

The result is that the "sub-pixel sampling" - the pixels in between the encoded pixels - is of a higher precision. The visible result, in special circumstances (generally you saw it more during the earlier jumps from 5 to 6 bits per channel), is that there is much less color banding in the picture.

The quality is even further improved when linear and temporal color scaling is taken into consideration. This is when the previous pictures and pixels around each pixel are used to help scale the current picture. So the scaler has as much data as possible to help it guess the new value of each pixel when scaling.

I know this is a bit too detailed for a forum like this, but I felt like jumping in.

To summarize: depending on the quality of the processor being used for scaling in the TV, it is possible to greatly improve the quality of an SD, 720p, or even a 1080i (during the deinterlacing phase) picture on a 1080p screen using 10-bit color channel resolution, since detail is filled in by guessing values for pixels that were not represented in the source media.

That being said, going from 16.7 million colors per pixel to a little over 1 billion colors per pixel is not as earth-shaking as it may sound. Thanks to motion in pictures, it's not likely to make a big enough difference to matter, especially in the case of back-lit screens, but that's an entirely separate discussion.

http://www.avsforum.com/avs-vb/showthread.php?t=967859
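
The arithmetic in that quote is easy to check for yourself. Here's a quick Python sketch of both the 8-bit vs 10-bit value counts and the interpolation example (purely illustrative - nothing to do with how any real scaler is implemented):

# How many values each bit depth gives per channel, and the overall palette size.
levels_8bit = 2 ** 8        # 256 values per channel (0-255)
levels_10bit = 2 ** 10      # 1024 values per channel (0-1023)

per_channel_ratio = levels_10bit // levels_8bit   # 4x per component
palette_ratio = per_channel_ratio ** 3            # 4 x 4 x 4 = 64x overall

print(levels_8bit ** 3)     # 16,777,216  ("16.7 million colors")
print(levels_10bit ** 3)    # 1,073,741,824  ("a little over 1 billion")

# The interpolation example: inserting a pixel halfway between values 1 and 2.
def midpoint_8bit(a, b):
    # 8-bit: the true midpoint 1.5 can't be stored, so it gets rounded
    return round((a + b) / 2)

def midpoint_10bit(a, b):
    # 10-bit: pad each 8-bit value to the 10-bit range first (multiply by 4);
    # the midpoint then lands on a valid value with no rounding needed
    return (a * 4 + b * 4) // 2

print(midpoint_8bit(1, 2))    # 2 - so the scaled row becomes 1, 2, 2
print(midpoint_10bit(1, 2))   # 6 - so the scaled row becomes 4, 6, 8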
 
Last edited:
Thank you for taking the time to reply.

I'm still a little confused as to whether I would notice the difference (or not),

i.e. for photo editing.
 
I don't really know anything about photography - are the source images usually in a 10-bit format? If not, then obviously there shouldn't be any difference :P
 
The HZ27WA does indeed use frame-rate control (FRC) dithering to achieve its 10 bits per subpixel output. It isn't something that really matters, considering the conditions that would have to be met to get a true 10-bit output. Firstly, you would need significant bandwidth, which at the Hazro's native resolution could only be delivered over DisplayPort - an input the monitor doesn't have. Secondly, the GPU itself would have to support 10-bit colour, which rules out most current Nvidia GPUs. Thirdly, the software itself would have to support such a colour depth, and currently this support is very limited. There are far more important aspects of image quality to concern yourself with, and the Hazro delivers well on a lot of them.
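
For anyone curious what FRC dithering actually means, here's a very rough Python sketch of the idea: a 10-bit level is approximated by flickering between the two nearest 8-bit levels over successive frames, so the time-average lands in between. Purely illustrative - it says nothing about how the HZ27WA's electronics actually implement it.

def frc_frames(target_10bit, n_frames=4):
    # Return the 8-bit level shown on each of n_frames frames so that the
    # time-average approximates target_10bit / 4.
    low = target_10bit // 4                 # nearest 8-bit level below
    frac = target_10bit - low * 4           # remainder in quarters (0-3)
    # show the higher level on `frac` of every 4 frames, the lower on the rest
    return [low + 1 if frame < frac else low for frame in range(n_frames)]

# e.g. 10-bit level 514 sits between 8-bit levels 128 and 129:
frames = frc_frames(514)
print(frames)                         # [129, 129, 128, 128]
print(sum(frames) / len(frames))      # 128.5 - the eye averages this to ~514/4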
 
Anyone know how good the scaler actually is? It would be nice to run my PS3 on the larger screen. TBH I'll probably just get the cheaper one, I hardly ever use my PS3 anyway.
 