Still a QC lottery in Jan '17?

Acer can go and do one. I bought my ex-girlfriend one of those lovely ultrabooks when they first came out, and later the ribbon cable from the mobo to the screen broke. So I went to get a spare: model discontinued, just out of warranty, no spares to be had. Tried eBay for ages, googled the part, the works. So she was left with a lump of useless electronics which was a present from me. Yay, go jump off a cliff please, Acer.

In hindsight I'd probably have searched for a busted one for spares, but I've left her now. Still makes me wary of buying anything from them ever again.

Jesus, you broke up over it! That is bad!
 
Just a sign of modern times; quality and repairability are out. Shifting as much crap as possible and telling people to buy a new one when it breaks is the new system. There must be at least one brand that still takes pride in shipping well-made monitors.
 
This was one of mine that went back. The light is on in this room, the camera's just a bit insensitive. There are no reflections in the screen.

I do a lot of graphics work, so having ~20% of the panel discoloured like that was taking the mick. Reading a couple of reviews after the fact, I found it's a model known for backlight bleed (BLB). A £550 "professional" screen that is KNOWN TO HAVE BACKLIGHT BLEED... They couldn't even find a sample without it to send to the review sites!

Basically that right there is what has gotten me so fed up with the market. Why was it ever on sale if that is the typical state of them? :(

Does it look worse with the camera though? If it looks like that with the eye then I can see why you are annoyed.
 
Slightly, but not much. The green tint and loss of contrast in the brick textures was very clear when sitting an arm's length in front of the screen - i.e. normal operating position. Not what I would have called professional grade colour accuracy!

My biggest annoyance was their inability to even send review sites a sample that didn't have BLB. Why is it even on sale if they're all like this? The fact that manufacturers (Iiyama in this instance) are happy to put out sub-par bits of tat like that and think the world will pay for them anyway is... not good. It's tarnished their name in my eyes; I'm unlikely to risk them again.
 
Tbh, I don't mind paying £800 for a good 30" or larger panel - but it has to have good quality control. You can't sell me something at that price, then say the glows in the corners are normal and "how the technology works". It isn't, because my 24" monitors say otherwise. If the technology doesn't scale, then the technology sucks and changes are required.

I'll keep an eye on the VA screens. Maybe they'll actually be prepared to address the BLB issues if it's a whole new panel anyway! Surely there must be enough returns going on for the manufacturers to actually want to solve the QC issues? :/

That is how the tech works though: the bigger the screen, the worse the bleed.

Unless the screen has FALD (local arrays of LEDs which can be dimmed individually). However, TVs with FALD start at around £2K; they aren't cheap at all.

VA panels are regarded as better than IPS within the TV market.

I opted for an IPS TV due to the viewing angles, though with a monitor viewing angles shouldn't be an issue. I suggest looking at a 4K TV from the likes of Samsung, Sony or LG. They start at 40" and you can send them back for another if the bleed is bad.
 
It's not really how the tech works, it's an effect of how they're being made, surely?

CCFL backlighting ran hot, but it wasn't all condensed around the edges. With LEDs around the edge, though, suddenly any imperfections at the edges show up. So why are we still using edge lighting when it's conspicuously pants? :/

IMHO if the technology sucks, then it should be replaced, not excused and perpetuated.
 
That is where FALD comes in. But it's very, very expensive.

Full Array Local Dimming (FALD) refers to an LED TV's backlighting system: a FALD display contains an array of LEDs spread out in a grid behind the LCD panel, rather than just at the edges of the TV. The colour palette is wider, blacks are deeper, and whites are brighter.

So the bigger you go, the more sense FALD makes, but it ain't cheap, probably due to increased manufacturing costs and the extra quality checks needed. There's a little sketch of the dimming logic below.
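To make the "grid of dimmable zones" idea concrete, here's a minimal toy sketch (the zone grid size, leak factor and test frame are all made-up illustration values, not from any real panel): each LED zone is driven only as hard as its brightest pixel needs, and the LCD is opened up to compensate. An edge-lit panel effectively has one long "zone" per strip, which is why a single bright patch lights up the whole edge.

```python
import numpy as np

def local_dimming(frame, zones=(8, 16), leak=0.05):
    """Toy FALD controller: one backlight level per LED zone.

    frame -- 2D array of pixel luminance in [0, 1]
    zones -- (rows, cols) of the hypothetical LED grid
    leak  -- fraction of light still bleeding into a fully-off zone
    """
    h, w = frame.shape
    zh, zw = h // zones[0], w // zones[1]
    backlight = np.zeros(zones)
    for r in range(zones[0]):
        for c in range(zones[1]):
            block = frame[r * zh:(r + 1) * zh, c * zw:(c + 1) * zw]
            # Drive each zone only as hard as its brightest pixel needs.
            backlight[r, c] = block.max()
    # Upscale the zone levels back to pixel resolution.
    bl_full = np.kron(backlight, np.ones((zh, zw)))
    bl_full = np.clip(bl_full, leak, 1.0)  # zones never go fully dark
    # Open the LCD pixels further to compensate for the dimmer backlight.
    lcd = np.clip(frame / bl_full, 0.0, 1.0)
    return lcd, backlight

# A dark frame with one bright patch: an edge-lit panel would have to run
# the whole backlight; FALD only drives the handful of zones under the patch.
frame = np.zeros((480, 640))
frame[200:280, 300:400] = 1.0
lcd, backlight = local_dimming(frame)
print("zones at full power:", int((backlight == 1.0).sum()), "of", backlight.size)
```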
 
Actually, LEDs are dirt cheap nowadays, and making FALD is not much more expensive than LED strips.

It's just that the penny-pinching in modern manufacturing has reached an extreme, mainly because everybody tries to beat the competition on price (even if it's just slightly less).
 
This is the sort of thing that gives rise to flickering (PWM) backlights to reduce brightness, because it's 10p cheaper per unit :/

But if FALD isn't that much more expensive, it seems a real shame that the premium makers - Dell, NEC and the like - aren't using it. If they were prepared to promise BLB-free screens, I'm sure that they'd increase sales...
 
I heard the difference was 50p, so let's not EXAGGERATE! :D
(actually, even the 50p difference might have been between 150Hz PWM and 3kHz PWM -- though that doesn't imply that a PWM-free solution would be that much more expensive, either)

But in all seriousness, from the business perspective:
When economies of scale come into play, there's actually a big profit to be made if you can squeeze even 10p from one area, 50p from another, and so forth (*) (toy numbers below). But the benefit only stands if all of the manufacturers are on board with the skimping, which effectively forms a cartel (which is illegal). And it takes only one manufacturer to use the better solution and market it correctly; they end up taking the customers for themselves, while the cartel either tries to drive the non-cartel competitor out, or follows their example once they notice the cartel has crumbled.

(*): In economics courses, one famous example of skimping they teach is the big airline that noticed that by taking just the peas out of all their long-haul flight dinners, they saved millions per year. But if you skimp too much (take away the steak?), that risks leaving a bad taste (;)) with the customers, and losing them to the competitors.
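Just to put toy numbers on the pennies argument (the volumes here are completely made up, purely for illustration):

```python
# Hypothetical figures, just to show how pennies scale up with volume.
savings_per_unit = 0.10 + 0.50    # £0.60 shaved off each monitor
units_per_year = 5_000_000        # made-up annual production run
print(f"£{savings_per_unit * units_per_year:,.0f} saved per year")  # £3,000,000
```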

Furthermore:
In the case of LED PWM, it might indeed be that the manufacturers genuinely thought it wouldn't cause any problems, because CRTs mostly operated at 60Hz while the LED PWM was around 150Hz. They just didn't factor in the "cooling time", or rather the lack of it: a CRT's phosphor keeps glowing briefly after each pass, while an LED cuts off instantly. And it's not like PWM is totally useless. For one, it enables a wider control range for brightness, and from what I've understood, it allows a lower power draw as well.
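To see why the frequency matters so much, here's a quick back-of-the-envelope sketch (the 25% duty cycle and the frequency list are just illustrative numbers): the average brightness is identical in every case, but the dark gap the eye has to bridge shrinks from milliseconds towards microseconds.

```python
# Same 25% brightness via PWM at three frequencies; only the gap length changes.
duty = 0.25  # fraction of each cycle the backlight is actually lit

for freq_hz in (150, 3_000, 20_000):
    period_ms = 1000 / freq_hz
    on_ms = duty * period_ms
    print(f"{freq_hz:>6} Hz: lit {on_ms:.4f} ms, dark {period_ms - on_ms:.4f} ms per cycle")

# At 150 Hz the LED sits dark for ~5 ms at a stretch -- long enough for some
# eyes to register, since there's no CRT-style afterglow to paper over the gap.
```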

It's just a shame that manufacturers took so long to accept that the compromises were detrimental enough that people were even prepared to pay extra for a PWM-free solution. But fortunately, since 2015 it's been more common for new monitors to have a flicker-free backlight than not.

So next in line (my own wishlist):
- scrap the integrated speakers, save some costs
- scrap the TN panels (at least the 60Hz models!)
- include FreeSync/Adaptive-Sync in all monitors
- 100Hz+ for all monitors (this won't happen for a long time)

As for FALD, the extra cost might not be the only reason manufacturers are keeping their distance. From what I've understood, FALD also adds some chassis thickness, and some manufacturers seem to have a fixation on avoiding that. I wonder how long it will take them to figure out that most people don't actually care that much about an even slimmer chassis, or that they certainly won't pay much extra for it. IMO, the transition from CRT to LCD already brought enough slimness; anything after that is just fluff. (OTOH, I do like slim bezels... though not enough to compromise on the important features.)
 
I believe the new Asus HDR monitor is supposed to finally use FALD; not sure whose panel it's going to use.

That would be nice to see, might kick the whole lot of them into gear if it works out well :)


Good points - let us hope the unofficial cartel comes to an end soon ;)

Hopefully edge lighting vs FALD can become the new PWM that people are prepared to pay £1 more for, and they'll switch to it in a year or two. And yeah, I have no particular interest in the thickness of the screen when it's a desktop unit. It'll sit with its back to the wall for the next 3 years and I'll never think about it again. My current Dells are pretty chunky, and that's fine.

Bezels I would notice - but I feel like FALD is better suited to a thin bezel anyway. Maybe *that* will actually be what drives them onto it in the end...
 
My old Sony 46-inch TV (X4500) had FALD, and that was years ago! I only replaced it because I bought a 55-inch OLED.

Surely it can't be that hard to put FALD in a PC monitor?

Shame OLED isn't really ideal for desktop PC use :(
 