I can only speak from my personal experience, and on the quality front my experience matched user reports (the exact issues people reported are the ones I got) far more closely than the "professional reviews".
This "mostly the negative experiences get reported" line sounds like the same excuse mantra as the others - "light bleed isn't noticeable under normal conditions", "a glossy screen is better for contrast (in a dark room, and cheaper for us to make)", etc.
I do appreciate the job tftcentral is doing, though - it's one of my favourite sites. If not as an indication of build quality, then at least for quantitative measures like response times, colour accuracy etc., which absolutely can be trusted since they don't vary that much from unit to unit.
But I have yet to see online reviews reporting LCD uniformity issues as bad as they actually turn out to be in practice (and yes, they tend to be really bad, and under-reported rather than over-reported).
I'm also not saying that reviewers are deliberately skewed in the manufacturer's favour - but if the manufacturer (or sometimes an affiliated shop) *knows* a sample is going to a reviewer, they will absolutely make sure it's a good sample.
If a reviewer sometimes gets a random sample they can be sure is the same as a random retail sample, and sometimes not, that should be clearly noted in each review. Otherwise it still damages credibility overall.
This "only negative reviews gets mostly reported" looks just the same excuse mantras as others - "light bleed not noticeable under normal condition", "having glossy screen is better for contrast (in dark room, and cheaper for us to make)", etc
I do appreciate a lot the job tftcentral is doing though, this is one of my favourite sites. If not for quality indication for but at least for quantitative measures like response times, colour accuracy etc which absolutely can be trusted since they not vary that much per unit.
But I still have to see online reviews reporting LCD uniformity issues as bad as it they actually happen to occur (and yes, they tend to be really bad and actually under-reported instead of over-reported).
Also not saying that reviewers are deliberately skewed in manufacturers favour - but if manufacturer (or sometimes affiliated shop) *knows* sample goes to reviewer, they will absolutely make sure its good sample.
If sometimes reviewer gets random sample and can be sure that its same as random shop sample - but sometimes not, these should be appropriately noted. Otherwise it still damages credibility overall.