Benq GW2450HM - too good to be true?

Associate · Joined 12 Jul 2010 · Posts: 30
Hi guys, has anyone tried the new Benq GW2450HM screen just released? I am tempted to get it, as it has:

8-bit true colour (unlike e-IPS panels with 6-bit + FRC)
1080p
24 inch (unusual, instead of the more common 23 inch)
AMVA panel instead of IPS
Low response time/input lag
Wide viewing angles

See for yourself:

http://www.tftcentral.co.uk/reviews/benq_gw2450hm.htm

Huh? Is this right? A true-colour screen with low response time for games? I was beginning to think there weren't any after all the reviews I've read. I was focussing so much on IPS panels, I didn't consider an AMVA. But I'm thinking there must be a downside, like the awful LED backlight issues (clouding/buzzing/flicker etc.). Anyone try this monitor and what did you think? Thx for any feedback.

P.S. I currently have a Dell U2311H, and it is very good in most aspects (especially response time, no lag/ghosting at all). Just a shame it's 6-bit + FRC colour, but the colour still looks good to me. I can't help feeling I'm missing out on something without 8-bit though (colours in games sometimes lack a tiny bit of realism/pizzazz). I will definitely keep this as a second monitor if I get the Benq.
 
Well I knew there had to be something, and I'm betting it's the PWM flicker! I haven't tried an LED backlit screen so I don't know how bad this is. Does it bother people @ 255Hz when brightness is turned down? I'm wondering if the flicker is noticeable when browsing the web/playing games? I've been reading about people getting eye-strain with LED monitors. Or could you leave brightness on maximum (on the monitor), and turn down brightness via the graphics card menu?

These LED backlights are annoying, why can't they just stick with CCFL? It's like a huge backward step to the old CRT flicker again! I wish they did the exact same Benq with CCFL backlight. :(
 
The uniformity of the backlight is very good; it doesn't buzz and it produces little heat. It runs relatively efficiently, and few people have problems with PWM flickering. Some people found the frequencies of CCFLs, or the interaction of their phosphors, nasty as well, and that is something you will need to judge for yourself. This review of the 27" model is worth a read as well, to help focus your thoughts.
 


As the review author, I'll add my 2p if I may.


PWM flicker is not the same as old CRT flicker, and to say LED is a huge step backwards would be unfair. LED backlights offer slimmer screen profiles, lower energy consumption, lower heat output and cheaper manufacturing costs for a start. It would be very rare to actually "see" a flicker of the backlight on a screen using PWM; the effects are more likely to manifest themselves as eye strain in some people. Don't forget though that many, many people don't find any issue with it, and PWM has been around for years as a reliable backlight dimming technique. Unless you've had problems with it before on other screens, I wouldn't worry about it too much. If you really wanted to, you could run the screen at a higher brightness setting to increase the duty cycle and reduce the potential issues. Running the screen at 100% brightness removes the need for PWM completely on that model, and you could in theory reduce digital brightness via the graphics card instead. Keep in mind that doing so would reduce the contrast ratio.
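For anyone curious about the numbers behind the duty-cycle point above, here's a toy Python sketch of how PWM dimming trades brightness for backlight on-time. The nit figure is an illustrative assumption, not a measurement of this model, and the 255Hz frequency is just the figure mentioned in this thread:

```python
# Toy model of PWM backlight dimming: the backlight switches fully on/off
# very quickly, and perceived brightness is proportional to the duty cycle
# (the fraction of each period the backlight is lit).
# PWM_FREQ_HZ and FULL_OUTPUT_NITS are illustrative assumptions.

PWM_FREQ_HZ = 255          # frequency mentioned in the thread
FULL_OUTPUT_NITS = 250     # hypothetical output at 100% duty cycle

def perceived_brightness(duty_cycle: float) -> float:
    """Average luminance over one PWM period (the eye integrates the pulses)."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return FULL_OUTPUT_NITS * duty_cycle

def on_time_ms(duty_cycle: float) -> float:
    """How long the backlight is actually lit within each PWM period."""
    period_ms = 1000.0 / PWM_FREQ_HZ
    return period_ms * duty_cycle

# At 100% brightness the backlight is on continuously -> no flicker at all.
# At 25% it is dark for ~75% of every ~3.9 ms period, which is what
# PWM-sensitive users report noticing as eye strain.
for duty in (1.0, 0.5, 0.25):
    print(f"duty {duty:.0%}: {perceived_brightness(duty):.0f} nits, "
          f"lit {on_time_ms(duty):.2f} ms per {1000 / PWM_FREQ_HZ:.2f} ms period")
```

This is also why running at 100% brightness and dimming digitally avoids flicker: the duty cycle stays at 1.0 and the panel simply maps pixel values to a smaller output range (at the cost of contrast).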

I also wouldn't get at all hung up on the 6-bit + FRC vs 8-bit colour depth question here, as in reality you are probably never going to see a difference in practice. Modern FRC techniques are very efficient, and unless you were using the screen for high-end graphics work, working with a lot of gradients for instance (in which case you would probably be looking at a higher-end screen anyway), you just aren't going to see an issue with its use. It's a bit of an irrelevant point really that some people get hung up on unnecessarily.
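To illustrate why FRC is so hard to spot, here's a toy Python sketch of the basic idea behind 6-bit + FRC temporal dithering. Real panel algorithms are far more sophisticated (and spatially dithered too), so treat this purely as an illustration:

```python
# Toy sketch of 6-bit + FRC (temporal dithering): a panel that can only
# show 64 levels per channel approximates an 8-bit level by alternating
# between its two nearest 6-bit levels over successive frames.
# Purely illustrative; real FRC implementations are more sophisticated.

def frc_frames(level_8bit: int, n_frames: int = 4) -> list[int]:
    """Return the 6-bit levels shown over n_frames for one 8-bit target."""
    lo = level_8bit // 4            # nearest 6-bit level below (0-63)
    hi = min(lo + 1, 63)            # next 6-bit level up
    frac = (level_8bit % 4) / 4     # how far the target sits between them
    n_hi = round(frac * n_frames)   # frames spent on the brighter level
    return [hi] * n_hi + [lo] * (n_frames - n_hi)

def perceived_level(frames: list[int]) -> float:
    """The eye averages the frames: map the 6-bit levels back to 8-bit scale."""
    return sum(f * 4 for f in frames) / len(frames)

# 8-bit level 130 is not a multiple of 4, so a plain 6-bit panel cannot
# show it; FRC alternates 33,33,32,32 and the time-average lands on 130.
print(frc_frames(130))                    # -> [33, 33, 32, 32]
print(perceived_level(frc_frames(130)))   # -> 130.0
```

At 60+ fps the alternation is far too fast to see on a decent panel, which is why the 6-bit + FRC discovery on modern IPS screens surprised so many people.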

If you're looking for drawbacks to the GW2450HM, there are a couple worth mentioning. While they have finally improved the responsiveness of the AMVA panel, the viewing angles unfortunately remain more restrictive than IPS, which could present an issue for some. That's inherent to the technology, and there are fairly noticeable gamma and contrast shifts across the screen; you also have to live with the off-centre VA gamma shift issue described in the review. The stand is also pretty limited on that model (unlike your U2311H).

Personally I can't see any reason to swap out your U2311H unless you have a problem with it or you need a second screen. If you're looking only for improvements from moving from 6-bit + FRC to 8-bit colour depth, I think you will be disappointed, as you probably won't see any.
 

...If you're looking only to see improvements from moving from 6-bit+FRC to 8-bit colour depth i think you will be disappointed as you probably won't see any

I would like to second this.

xdantespardax -

The difference in practice, unless you are doing fine colour manipulation, is very limited and difficult to really see. IPS panels such as the U2311H's have a very fine dithering algorithm and visibly produce an excellent range of colours, and do so accurately. If you are after more 'realism' or 'pizazz', this would come from things such as an extended colour gamut in applications where this can be used correctly, or a glossy screen surface for some extra 'punch' to the image. The whole '6-bit' vs. '8-bit' debate has been totally blown out of proportion, and as monitor reviewers, Baddass and I have seen first hand how little difference it actually makes. When it was first discovered that some modern IPS panels were 6-bit with an FRC stage on top, people were shocked and couldn't believe what they were reading. That is testament to how effective modern dithering can be.
 
Hey guys, thx for the replies. That's very interesting and I feel a bit better now about my Dell! :) Yes, I too was totally shocked when I read about the 6-bit colour on this screen, because I thought the colour was so good. What annoyed me though was that it was not revealed by either LG or Dell at the time, only a year or so after its release, which is very misleading. Fair enough, the quality is very good, but it shows how worried they were about hurting sales. But I must admit the colours are pretty rich in games like Mass Effect etc.

But the thing is, I have seen games running on the Xbox on a large LG 720p LCD TV (an LS2000 I think, one of the older pre-LED models), and the colours/lighting look just a bit more realistic/vibrant than on my Dell. Bright daylight scenes and sunsets etc. seem better somehow, but could that be because of a higher brightness level, 500 cd/m² versus 300 cd/m² on my Dell? Just curious. And I wondered whether the older LCD TVs are 6-bit or 8-bit? I can't seem to find info on it. I think the LG has an H-IPS panel, which presumably is full 8-bit colour? If so, they obviously managed to get 8-bit colour on an IPS screen, with no lag or slowdown, albeit at 720p. So why can't they for PC monitors?

I know they aren't available now, so what about the newer LED TVs; what colour depth are they? You say there's virtually no difference, but I have seen (or imagined) a slight but noticeable difference.

The only other thing I can think of that could cause a difference in the appearance of colours is the anti-glare coating. Maybe the LG screen has a slightly less obscuring/dulling effect?

Regarding the PWM, I suppose if it was that bad they wouldn't be able to sell any at all, but I have read on forums about people getting eyestrain and having to return their screens, so it sounds worse for computer use as opposed to watching TV or playing games. Perhaps because you are closer and reading on a PC, you need a steadier image.

I suppose at the end of the day I have to try these out for myself to weigh up the pros and cons. I just need to prepare for my bank balance to suffer accordingly lol!
 
Well, it's very hard to compare a desktop monitor to an LCD TV really; you could be seeing several different things there. It's very unlikely to be related to the colour depth though, as I really don't think you'd spot a difference between a decent 6-bit + FRC panel and an 8-bit panel.

More likely, and as PCM2 alluded to earlier, you might be noticing differences in colour gamut through differences in the backlight. I'm not sure what backlighting your TV uses, but there's a chance it's a wider-gamut unit. Wide gamut would offer you the most noticeable difference in reality compared with your standard-gamut Dell screen. The difference in the brightness setting will certainly impact the perceived image as well, and you're more likely to have a TV set brighter than a desktop monitor which is only a few inches from your eyes!


Remember also that TVs are designed to offer bright, colourful and vivid images, but not necessarily realistic colours. There are often loads of enhancements and tweaks done in the TV to improve the image for TV and movies, but those are left off desktop monitors as they aren't appropriate and people don't want them for normal office use and the like. Colours may look more vivid and brighter on the TV, but that doesn't mean they are accurate, reliable or realistic. If you wanted to make your desktop look more like the TV, you could simply play with the digital vibrance setting of your graphics card to boost the colour appearance artificially. Again, getting a wider-gamut display would also change the appearance of colours and make them look more saturated, but that's not necessarily realistic. It's all about personal taste though :)
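As a rough illustration of the kind of thing a 'digital vibrance' style control does, here's a toy Python sketch that pushes each pixel's channels away from their grey average. The real graphics-card implementations aren't public, so this is only an assumption about the general approach:

```python
# Toy "digital vibrance" style saturation boost: push each channel away
# from the pixel's grey average, then clamp to the valid 0-255 range.
# Illustrative only; actual driver implementations are not public.

def boost_saturation(rgb: tuple[int, int, int], amount: float) -> tuple[int, int, int]:
    """amount=0.0 leaves the colour alone; 1.0 doubles the distance from grey."""
    grey = sum(rgb) / 3
    def push(c: int) -> int:
        return max(0, min(255, round(grey + (c - grey) * (1 + amount))))
    return tuple(push(c) for c in rgb)

muted_orange = (180, 120, 80)
print(boost_saturation(muted_orange, 0.0))   # unchanged
print(boost_saturation(muted_orange, 0.5))   # more saturated orange
```

Note that a pure grey pixel is unchanged at any setting, which is why a boost like this makes colours 'pop' without shifting the overall white point; it still doesn't make them any more accurate.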
 