Why have we jumped to 4K?

Still to this day I cannot see a visual difference between a high-end 1080p screen and a 4K one.
The only differences I found came from panel quality and better colour reproduction, not resolution.

Why are people so hell-bent on chasing more pixels instead of a superior panel?

I sit about 10 feet from my TV, which is by no means a long distance. In stores I have to get really close to the TV before the resolution looks like a visual jump, whilst I can instantly tell a quality panel from a crap one purely by colour reproduction and response time.

Why are we so accepting of technology that only really benefits professional workloads, not the end consumer, especially us casuals?

Would it not make sense to push for the best panel quality, and raise prices based on that, rather than trying to push higher resolutions?

This came into my head after putting down money for a 32-inch QHD monitor, which will be used with games consoles for media purposes and the odd game or two (the Xbox One X can output 1440p downsampled from 4K).

After looking around...

https://www.avforums.com/threads/is-there-any-point-in-4k.2251217/
 
Reading through this...

https://www.avsforum.com/threads/is-netflix-ultra-hd-on-non-4k-tv-pointless.3094766/

You will most definitely benefit from a UHD stream versus an HD one, even on a 1080p screen: the far higher bit rate delivers more visual data, and you also gain audio improvements and other features.
For video, the resolution is just a container and the data rate is the amount of data filling that container. Most streams are still pretty crap, with a lot of quality loss; a UHD stream on a 1080p screen will look closer to Blu-ray than the standard HD stream will.

The bigger the container, the more data rate is needed for a quality output, which makes 4K streaming more of a bandwidth hog than a perceivable quality upgrade.

I am not sure there are any HD HDR panels, but if you do have one.. HDR too.
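A rough back-of-the-envelope sketch of that container argument; the bitrates below are assumed, illustrative figures, not any service's actual numbers:

Code:
# Bits available per *displayed* pixel per frame on a 1080p panel.
# Bitrates are illustrative assumptions, not measured values.
DISPLAY_PIXELS = 1920 * 1080   # the panel we are actually watching on
FPS = 24

streams = {
    "HD stream  (~5 Mbps)":  5_000_000,
    "UHD stream (~16 Mbps)": 16_000_000,
    "Blu-ray    (~30 Mbps)": 30_000_000,
}

for name, bitrate in streams.items():
    bpp = bitrate / (DISPLAY_PIXELS * FPS)
    print(f"{name}: {bpp:.2f} bits per displayed pixel per frame")

# Under these assumptions the UHD stream carries roughly 3x the data per
# displayed pixel of the HD stream, which is why it can look closer to
# Blu-ray even at 1080p.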
 
Both are important. If you don't see a difference in the resolution from 1080p to 4K then you're either sitting too far away for the screen size or you need glasses.
 
I would need to sit so close that my knees would be barely a foot from the TV stand, and the sofa is literally hogging all the floor space.

The gains you see in video games are due to the way games are rendered; the resolution impact is a side product of this.
At higher resolutions with some post-process AA, the jaggies are for the most part fixed.

I have 20/20 vision.
 
You will get mixed replies on this subject, as some people are very invested in the format and won't take kindly to you challenging it.

Personally I can tell the difference only when the picture has been paused and you compare side by side; then, yeah, you can see it. During an actual movie, with the image in motion, I can't really notice anything of significance. Some types of films show it more than others. HDR, however, makes a big difference in certain types of movies and scenes. Once your screen size gets massive (above 65"), you can spot it more easily, but it's not something that would pull me out of a movie. I'm sure some 4K presentations are more impressive than others and show off the format.

Once you switch to games, however, there is a big jump due to the way things are rendered; it's 100% worth it on the gaming side of things.
 

I agree with you. I have personally taken up 1440p as my go-to from now on; for my PC usage it is perfect and a big upgrade from 1080p, because I sit right up to the monitor like, I think, the majority of PC users do. It happens to be 170 Hz and 0.5 ms, and it also manages well above average contrast for an IPS panel. Yes, resolution makes a difference here, but the accompanying benefits of a higher quality panel make this a bigger upgrade than the resolution can deliver alone.

Now if I were to use this from around 5 feet away, at 27 inches I would not see a difference other than smaller stuff on screen when using 100% scaling rather than 125% or higher.

The TV makes no sense, which is why I never moved past 1080p with it; plus you get older standards such as component, which means hooking up that bad boy PS2 or OG Xbox, though you can now buy HDMI converters.
I now have a PC that can emulate most of what I play, and it is slowly taking over the PS2-era stuff, though I still need the console for some games.

Watch this video full screen.. even at 1440p on the 4K setting there is massive visual degradation, yet people on 1080p think it is a massive upgrade. Yes, because their container resolution is now getting a far higher data rate.. the visual clarity and benefit here is not from resolution.

The live footage in the 2nd sample has over-sharpening applied to compensate for the low bit rate, which is obvious too.
 
Still to this day I cannot see a visual difference between a high-end 1080p screen and a 4K one.

If that is true then you need glasses, and I mean that genuinely. You should absolutely be able to resolve the finer detail of 4K vs 1080p on a normal TV at a normal viewing distance (e.g. 55" at ~1.6 m away for mixed content).
Ideally the best way to test this is with video games, because you can show the same content scaled to different resolutions, though it would also require TVs of different resolutions at the same size, which is rare to find today. But then the difference should be especially obvious, because the pixels-per-inch disparity is monstrous (at 55" it's 40 PPI for 1080p vs 80 PPI for 4K; to put that into monitor context, it would take a 36" 720p monitor, or a 25" 800x600 one, to have such low PPI).
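For reference, those PPI figures are just standard diagonal geometry; a quick sketch, assuming nothing beyond the sizes and resolutions already quoted:

Code:
import math

def ppi(diagonal_in, horiz_px, vert_px):
    # PPI = pixels along the diagonal / diagonal length in inches.
    return math.hypot(horiz_px, vert_px) / diagonal_in

print(f'55" 1080p:   {ppi(55, 1920, 1080):.0f} PPI')  # ~40 PPI
print(f'55" 4K:      {ppi(55, 3840, 2160):.0f} PPI')  # ~80 PPI
print(f'36" 720p:    {ppi(36, 1280, 720):.0f} PPI')   # ~41 PPI
print(f'25" 800x600: {ppi(25, 800, 600):.0f} PPI')    # ~40 PPI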

I sit at a distance from my TV of 10 feet and this is by no means a long distance.

It is a very long distance, in fact, for TVs <75". And depending on the content even for 75" (i.e. not a big deal for close-ups of a person's face taking up most of the screen, but much more important for small details; compare how easy it is to resolve detail for the face vs the rocks in the image).

[Image: pRnXZsS.jpg — detail comparison of a face vs rocks]
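As a rough sanity check on "how far is too far", here is the common 1-arcminute (20/20 acuity) rule of thumb; the rule itself is a simplification, and the sizes below are just examples:

Code:
import math

def acuity_distance_ft(diagonal_in, horiz_px, aspect=16 / 9):
    # Distance at which one pixel subtends 1 arcminute of visual angle,
    # roughly the resolving limit of 20/20 vision; beyond this distance,
    # extra resolution becomes hard to perceive.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / horiz_px
    return pixel_pitch_in / math.tan(math.radians(1 / 60)) / 12

for diag in (55, 65, 75):
    print(f'{diag}": 1080p resolvable within ~{acuity_distance_ft(diag, 1920):.1f} ft, '
          f'4K within ~{acuity_distance_ft(diag, 3840):.1f} ft')

# By this rule a 55" 1080p panel stops resolving at ~7 ft and a 75" one
# at ~10 ft, which is more or less the disagreement playing out here.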


Would it not make sense to push for the best panel quality? Raise prices based on this? rather than trying to push higher res?
Who says both aren't being done? In reality the jump in resolution incurs a very minor production cost; that's why it's being done, and also why you see 8K starting to be pushed. Easy to do, and easy to market. The other things you mention are much more difficult to achieve and sit in the realm of diminishing returns, absent genuinely new technologies such as the QD-OLED push this year - but guess what, how happy will you be to pay $5000 for a 55"? Never mind the tech that's really being chased, microLED, which actually goes into six figures.

The reality is that every improvement that's noticeable and cheap to do has been done; for bigger leaps it's all up to the R&D gods now, unless you want to pay many times more for a panel that's only slightly better than one a tier or two below it sold to the mass market - in which case look up Sony's Master Series of TVs or Panasonic's highest-end OLED model (usually ending in Z2000).
 
1.6 meters is just a little over 5 feet; I sit at a 10-foot distance. Sitting 5 feet from the TV leaves more like 2-3 feet of space between the sofa and your TV stand or wall, which is no room at all; that's narrower than a tight corridor. Your ass is not on the edge of your sofa, it is sat back in it.. so add the sofa's girth...

Your math is incredibly bad.

Spatial awareness makes math work; practical application.

 
Having a 4K OLED, and mostly playing 1080p, I agree with the OP. 4K really isn't much better once you simply watch the movie. Granted, my eyes aren't great, but even when I go up closer it's not bad at all, and if you've got a good scaler, 1080p looks fantastic at 4K (and on a 1080p Pioneer Kuro).

Most people with 4K TVs have smaller TVs anyway, so it's pointless; I'd say 4K is worth it at projector size.

Still, if you're streaming 4K and they use a low bitrate, it's going to look bad.
 
I have 20/20 vision.
Maybe you used to…

The downside to 4K is that there's more detail, so you can see the rough edges around what would have been considered good-quality VFX in 1080p content.

Good vs bad quality screens have always been a thing, even in the 720p days.

I remember playing with Pioneer Kuro screens.

Now content is a limiting factor… I can easily tell the difference between a Blu-ray and a decent quality encode… let alone the poor quality of Netflix streams etc.
 
I love 4K. The colours for me just pop better and the darks are on another level. If I watch anything lower than 1080p I see what I can only describe as "patches" in blacks, and it looks horrible (no setting change fixes it), but with 1080p Blu-ray and 4K content that issue is completely gone.
I am a bit of a snob when it comes to picture quality; I only buy Blu-rays as a minimum now, unless what I want to watch is old and you cannot get it in that quality.

I have a Samsung QE55Q6FAM, which is probably getting on a bit now, but it's HDR10, 4K, Quantum Dot... still very happy with it (except for the slow UI and the mentioned "patches" in blacks on lower quality videos). :)
 
Try watching, say, some of https://www.youtube.com/c/EnesYilmazer/videos at 4K and 1080p and tell me you don't see a massive difference.

There is a lot of media, especially streaming, for which even calling it 720p would be generous, but there is a fair bit of decent 4K media out there as well.
If I lower it from 4K to anything lower, I just see less data being displayed on my 1440p monitor.

At 4K there is colour banding and there are many blocking artifacts. I clicked his latest video, "Touring a $48,000,000 LA Hillside Mansion with the BEST VIEWS OF LOS ANGELES".

And let's get real here: if this is the content we have to watch to get even close to good 4K, then.. it's a joke. This barely fills out 1440p; it must look much worse at native 4K.

Fast-moving content will be even worse.

[Image: Untitled.png — screenshot showing the colour banding and blocking artifacts]

The colour banding may not be noticeable to all; not everyone has a high colour gamut display.
 
On my 43" QD-LED 4K display the difference is noticeable vs 1440p not earth shattering but the image is sharper and has more punch - vs 1080p it is quite significant.

I have a 1440p gaming monitor beside my 4K monitor and yes 4K on YT does have more detail than the 1440p setting on it - but the 4K monitor shows more detail in areas of the picture again.
 
4K doesn't divide evenly into 1440p, so it will have to do "rounding" when it downscales to 1440p, and I'm pretty sure YouTube's video player won't be using a great downscaling algorithm.
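A tiny illustration of that divisibility point; the "rounding" is really fractional resampling:

Code:
# Scale factor from a 3840-wide (4K) source to common display widths.
source_w = 3840
for name, target_w in (("1080p", 1920), ("1440p", 2560)):
    print(f"4K -> {name}: {source_w / target_w}x")

# 4K -> 1080p is an exact 2.0x, so each output pixel can be a clean
# average of a 2x2 block of source pixels; 4K -> 1440p is 1.5x, so
# output pixels straddle source pixels and the scaler must interpolate,
# which can soften or alias the image depending on the filter used.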
 
There is no downscaling happening; the video just plays at a certain data rate. It has no handshake, which HDCP-compliant devices require and which is proprietary to Sony and Blu-ray content.
It just sets a container resolution and then applies more data per resolution container on YouTube; you are not inherently getting any different picture in any shape or form.

The processing that goes on is merely the data being fed into that container.
Inherently, a larger container requires more data to fill out and present its true capability.

A 1440p display will be better filled out natively than a 4K one, and this trickles down to 1080p, where you get superior image quality, and it is noticeable regardless of the container size.

This is not the same as a video game being downsampled, as there you are rendering the output regardless and then squashing it down or not.
 
If the selected quality doesn't match your monitor's resolution then it will be scaled. Even if you select the 1440p quality option there will still be downscaling, since the video was shot natively in 4K; only it will be done by YouTube's encoder instead of your computer. If you get better quality from selecting the higher resolution, it's because YouTube's encoding uses stronger compression for the lower resolutions.
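To put rough numbers on that, here's an illustrative comparison using assumed, ballpark per-tier bitrates (YouTube's actual figures vary by codec and upload) of the data each quality tier delivers per displayed pixel on a 1440p monitor:

Code:
# Assumed, illustrative average bitrates per quality tier; not YouTube's
# real numbers, which vary by codec and video.
tiers = {
    "1080p": (1920 * 1080, 4_000_000),
    "1440p": (2560 * 1440, 9_000_000),
    "2160p": (3840 * 2160, 18_000_000),
}
DISPLAY_PIXELS = 2560 * 1440  # watching on a 1440p monitor
FPS = 30

for name, (source_pixels, bitrate) in tiers.items():
    native_bpp = bitrate / (source_pixels * FPS)
    displayed_bpp = bitrate / (DISPLAY_PIXELS * FPS)
    print(f"{name}: {native_bpp:.3f} bpp at source resolution, "
          f"{displayed_bpp:.3f} bpp per displayed pixel")

# Under these assumptions the 2160p tier delivers roughly twice the data
# per displayed pixel of the 1440p tier, which is why picking the higher
# tier can look better even on a sub-4K monitor.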
 