As hard as it is to believe, most people - regular viewers and such - are detail blind. True story. Most people don't see obvious things like wrong aspect ratios, low-bandwidth artefacts and pixelation on terrestrial digital channels, let alone more "subtle" problems like bad standards conversions, ghosting, dropouts etc, which drive a lot of us nuts. Most of those people watch Freeview ITV Pixelotto most of the day, and when they finally switch channels to a very average, rather badly converted Sky movie channel, they think it's as good as it gets. And in all honesty - on a rerun channel, going through the cheapest Sky box to a budget-line, few-years-old, laggy Korean LCD set that incorrectly letterboxes and stretches 16:9 content on an "HD ready" screen - it probably is. As good as it gets, for them.
I personally hate the old, interlaced world of SD TV and DVD. It took about three years for DVD standards to drop like a brick through the surface of the water, and as prices plummeted, good-quality film-to-DVD transfers simply became too expensive. Quality dropped so much that you can sometimes still find VHS tape transfers that look better than the DVD. Especially the "repackaged" kind: n-th generation "masters" on Betacam, converted from film to NTSC, to PAL, to NTSC again, then mastered in China for a 3-in-1 Lidl release.
That said, the HD standard was flawed from the very start, and I think everyone knew this was going to end badly. For starters, the colour space in HD and the mastering codecs that became standard were just wrong, and there are plenty of issues there: ProRes and DNxHD are prone to colour banding/posterisation; the majority of material in my industry comes from SLRs and cheap cameras recording in MPEG-2; colour grading and black levels are generally just wrong in most cases; and the rest of the intermediate codecs in use tend to be just glorified 30-50 Mbps intra MPEG-2. At the consumer level, H.264 doesn't react very well to fast scene changes and grain - which is kind of ironic for a codec specifically chosen for fast footage with added grain.
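To put a number on the banding/posterisation complaint: it's mostly a bit-depth problem. Here's a minimal, self-contained sketch (plain Python, no real codec involved; the gradient range is a made-up illustration) counting how many distinct code values a subtle 5% luminance ramp survives as at 8-bit versus 10-bit quantisation:

```python
def quantise(value, bits):
    """Quantise a 0.0-1.0 luminance value to an integer code at the given bit depth."""
    levels = (1 << bits) - 1
    return round(value * levels)

def distinct_codes(bits, lo=0.40, hi=0.45, samples=1000):
    """Count distinct codes a subtle gradient from lo to hi collapses into."""
    codes = {quantise(lo + (hi - lo) * i / samples, bits)
             for i in range(samples + 1)}
    return len(codes)

# A gentle 5% luminance ramp across the frame:
print(distinct_codes(8))   # 8-bit: only ~14 steps -> visible bands
print(distinct_codes(10))  # 10-bit: ~52 steps -> far smoother
```

Roughly four times the steps at 10-bit over the same ramp, which is why 8-bit intermediates fall apart on sky gradients and dark scenes.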
But I suppose the worst of it all was the disconnect between broadcast HD and consumer HD. Every error in the book was made at that point. While most production houses and film footage were being converted to HD at 23.98 and 24 fps for cinematic feel and compliance, the broadcast world just found it too hard to drop old habits and went completely off-script: the yanks picked MPEG-2-based HD, so they found the interlaced 59.94-field standard the most suitable. We in Europe picked H.264-based HD, but while progressive 25 fps was initially favoured, with time more and more broadcasters on the continent started requesting 50-field interlaced HD material, which is just heartbreaking. In the end we all returned to the old, facepalm-worthy world where directors shoot in frame rates that aren't native to anything and everyone else just converts it, screwing up quality in the process.
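The arithmetic behind that mismatch is worth seeing once. A rough sketch of the two standard workarounds (just the numbers, no real broadcast gear implied): 24 fps film reaches American 59.94i via a 1000/1001 slow-down plus 2:3 pulldown, and reaches European 50i by simply running 4% fast:

```python
from fractions import Fraction

FILM = Fraction(24)                    # native film frame rate
NTSC_FIELDS = Fraction(60000, 1001)    # 59.94... fields/s
PAL_FIELDS = Fraction(50)              # 50 fields/s

# American route: slow film by 1000/1001 to 23.976 fps, then spread
# each group of 4 frames over 10 fields (the 2-3-2-3 pulldown cadence).
slowed = FILM * Fraction(1000, 1001)        # 23.976... fps
fields_per_frame = NTSC_FIELDS / slowed     # works out to exactly 5/2
print(fields_per_frame)                      # -> 5/2, i.e. the 2:3 cadence

# European route: no cadence, just run the film 4% fast so 24 frames
# fill 25 frame slots per second (and the audio goes up in pitch).
speedup = (PAL_FIELDS / 2) / FILM
print(float(speedup))                        # -> 1.0416..., about 4% fast
```

Neither source rate is native to either delivery rate, so something always gets mangled: cadence judder on one side of the Atlantic, pitch-shifted audio on the other.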
And once again, as more and more amateurs get contracts for HD content, the more corners are cut and the worse the source material available to you - the viewer - becomes. On TV it's already everywhere: bad upscaling, badly deinterlaced then upscaled footage, repeated frames, judder, aliasing; a lot of standards-converted material has crazy morphing in fast scenes (as if no one bothers to customise the default settings on a Snell & Wilcox Alchemist anymore). Blu-rays, for the time being, are still relatively immune to that. There is still high production value in them. Standards to keep. Quality control still being done. But with prices going down and the tempo rising, it's slowly falling into the speed-versus-quality trap. It starts, as always, with music videos and concerts. More and more releases are juddery, slightly less sharp. Overblown colours. Crazy amounts of grain added to cover for poor-quality footage. Then older movies. Conversions not as well done as the first big titles, colour grading shot, blue hair, olive skin tones, etc. In a few years it will slip right back into the usual messy, ugly malarkey.