The state of the monitor and TV market

We're at a point where it feels pointless to buy a new TV that isn't an OLED, unless you have a specific use case: you'll use it in a particularly bright room most of the time, or you'll spend a lot of time on a certain channel or game that would cause image retention/burn-in issues.

Yet not everyone wants a 55 inch OLED TV. Some want something smaller for a bedroom or wherever. Obviously cost is still an issue, as manufacturers want to drip-feed the technology to make money for many years.

Do you see OLED sizes coming down to where they can be used on desks as monitors, or is OLED flawed for desktop use due to retention and burn-in issues from static apps, the Windows taskbar, etc.?

It seems that for the last decade plus, we've had to live with really poor quality control in panel technology. Dead pixels are less of a thing these days, but it now seems almost expected to get bleed and glow issues with IPS. Every technology we have is a compromise between the main types of TN, VA and IPS, and even OLED has its potential issues. Buying a 4K monitor now means spending a large amount on a graphics card if you want to game at native res. Ultrawide monitors are ridiculously expensive since they are more niche and economies of scale don't apply as much.

The above is why I haven't replaced the monitor I use at home, which I bought 10 years ago: a basic Dell TN panel, 16:10, 1920 x 1200. I just don't see the point, nor do I have the motivation to spend hundreds on what will be a compromise. This is a sad state of affairs in 2019.

I am also using a 46-inch TV, which I consider one of the better LCDs from, again, about 10 years ago! It feels like we've gone backwards in image quality since the plasma days. Response times and refresh rates are another matter, though.

Do you agree? What is the future? Will 21:9 ultrawide monitors start to take off more for productivity-biased users? Will 4K become as widely adopted as 1080p has been?
 
The point of upgrading from 1080p monitors to 4K, for me, was coding work; my 32" 4K screen is amazing for that and I wouldn't ever want to go back. It's beautiful, and I'll easily get a few more years out of it.

Talk to me, Goose...

I've been weighing up my options as I work from home a lot and also code. I've generally been putting off a full 4K screen because I do still want to game occasionally and just don't have the money for a card that can push full 4K right now. Even if I did, I couldn't justify it given the state of the GPU market... could make that thread as well, actually ;)
So I looked at ultrawides, and I love the idea of them, but a couple of things made me question whether I would be doing the right thing.

1: The curve - some people have said the curve makes work in Photoshop/editing worse when working with lines, rotating, cropping, etc. I know you can get them without a curve, but I think the curve helps a lot with making them more usable.
2: The cost.

I probably want the screen's suitability bias to be something like 20% gaming, 20% photo/video editing, 60% productivity/coding. Obviously IPS is good for working with colours, and IPS panels seem to have good response times these days.
I'd also love a built-in KVM so I can simplify my setup.

Then I looked at the possibility of rotating my current TN panel to portrait for coding (I did that at one job before and liked it) and just buying another "main" monitor, which wouldn't have to be widescreen and would therefore be the cheaper option.

Do you not find 4K too small on the old eyes? I have a full HD (1920 x 1080) laptop screen, which is about my limit, but that's a very small 13.3 inch screen, so the pixel density is high.
I also sit very close to my screen due to the limitations of my desk size - I'm about 2.5 feet away. I can't imagine having much bigger than 27 inches without having to physically move my head too much.
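
For a rough comparison, pixel density is just the diagonal resolution divided by the diagonal size in inches. A quick, purely illustrative sketch using the screen sizes mentioned in this thread:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution in pixels divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Screens mentioned in this thread
screens = {
    '13.3" 1080p laptop': (1920, 1080, 13.3),
    '27" 4K':             (3840, 2160, 27.0),
    '32" 4K':             (3840, 2160, 32.0),
}

for name, (w, h, d) in screens.items():
    print(f'{name}: ~{ppi(w, h, d):.0f} PPI')
```

That works out to roughly 166 PPI for the 13.3" laptop, 163 PPI for a 27" 4K and 138 PPI for a 32" 4K, so a 27" 4K panel is about as dense as that laptop screen, which is part of why scaling tends to come into the discussion.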
 