No idea to be honest, but the problem is real - it's not just "horror stories", and I'm quite saddened & puzzled by it myself.
I have a Dell U2412M on my desk which was made 5 years ago but has picture quality miles better than today's "cutting edge" displays, which cost twice as much (even if you account for inflation). Back when I was picking it, I tried 3 different models and they were all very good, so picking one was a hard choice.
Recently I thought about upgrading and again went through several already - but the old Dell is still on my desk.
And worst of all, there appears to be no way to actually buy a proper "premium quality" display - you know, one which actually *guarantees* premium quality - even if you're willing to pay more. The only choice is mediocrity, or rather "randomness"... Even "ProArt" displays have issues nowadays, just slightly fewer - but they're generally unsuitable for gaming since they have high input lag, so no go even if you're willing to shell out $1K+ for features you don't usually need (like wide gamut).
I understand that it may be hard to sell enough monitors if they just throw away panels with defects that some people are willing to accept.
But why not give people a choice? Even for the same model they could just honestly bin them: "A+ grade" which would cost a big premium, and "B grade" which would be affordable - and people wouldn't have to waste time & nerves (nor the retailers' either).
It has definitely worked for chip manufacturers for ages. For example, Intel makes the same dies and then bins them - the best ones are clocked high and cost a premium, the lower-quality ones are clocked lower and are cheaper. But if you buy a 4GHz CPU, you can bet it will work perfectly at 4GHz. It's not like "4GHz *but maybe down to 3.5GHz if you're unlucky*".
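Just to make the binning idea concrete, here's a toy sketch - the defect metrics, thresholds, and grade names are all made up for illustration, not real QC criteria from any manufacturer:

```python
# Toy sketch of panel binning: grade each unit of the same model by its
# measured test results, then sell it honestly at the matching tier.
# All thresholds here are invented for illustration.

def bin_panel(dead_pixels: int, backlight_bleed: float) -> str:
    """Return a quality grade for a panel given hypothetical test results."""
    if dead_pixels == 0 and backlight_bleed < 0.05:
        return "A+"      # flawless: charge a premium, guarantee quality
    if dead_pixels <= 3 and backlight_bleed < 0.15:
        return "B"       # minor defects some buyers accept: sell cheaper
    return "reject"      # too flawed to ship at all

if __name__ == "__main__":
    print(bin_panel(0, 0.01))   # A+
    print(bin_panel(2, 0.10))   # B
    print(bin_panel(9, 0.30))   # reject
```

The point is that the grade is decided by measurement at the factory, not discovered by the customer after three RMAs.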