Why have we jumped to 4K?

If the selected quality doesn't match your monitor's resolution, then it will be scaled. Even if you select the 1440p quality option there will still be downscaling, since the video was shot natively in 4K; it will just be done by YouTube's encoder instead of your computer. If you get better quality from selecting the higher resolution, it's because YouTube's encoding uses heavier compression for the lower resolutions.
"Native" for video is constrained by data rate; it is impossible to truly output RAW 4K video, and we still don't really have the hardware, at least cheaply, to even watch RAW 4K.
YouTube's 8K setting is closer to what you would see on a Blu-ray, but still off by a country mile.

Even Blu-rays are encoded from RAW; it just isn't feasible in any case to see RAW output, and even HD 1080p in RAW is still a lot of data. It's why we never had over-the-air content that looked good at 1080p until we hit the 4K era, and now 4K is being filled out the way 1080p was: we are back to poor quality, only now it's at 4K, which most of us do not benefit from. Most being 99.9% of us.

There is no downsampling happening.
My monitor is 1440p native. If I set the video to 4K, I just get a higher data rate, which is the limit of what YouTube's encoder allows for 4K video; the final output is limited by your monitor. You could have an 8K screen and you would just see poorly encoded 4K YouTube video in a larger 8K container. It is technically upscaling, but it doesn't look like it at all, because the data rate of the video is nowhere near high enough to look good at 8K; 8K needs so much more data.
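To put rough numbers on that, here is a quick sketch (the 20 Mbps figure and the snippet are purely illustrative assumptions, not YouTube's actual encoder settings) of how the same data rate gets spread thinner as the pixel count grows:

```python
# Rough bits-per-pixel arithmetic: the same data rate spread over more pixels.
# The 20 Mbps / 60 fps figures are illustrative assumptions only.

RESOLUTIONS = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float = 60.0) -> float:
    """Average bits available per pixel per frame."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {bits_per_pixel(20, w, h):.3f} bits per pixel at 20 Mbps")

# 1440p: 0.090, 4K: 0.040, 8K: 0.010 - an 8K container needs vastly more
# data to reach the same per-pixel quality.
```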
 
You're confusing compression with scaling. There's no way to view the whole 4K image on your 1440p screen without downscaling, and since 4K doesn't divide evenly into 1440p (the factor is 1.5×), that scaling will likely reduce the image quality.
 
Your answer is an answer for video gaming, not video.
Yes, 1080p fits nicely into 4K!
But video is completely limited by how it is mastered; a video game is not. A game is given a set resolution (on a console) and told to render at that res, so for a console, scaling 1080p by a clean 2× in each direction (4× the pixels) makes it look better!
Video does not do this; video is far more fluid in how it works.
Data rate is like water filling a container: the more water, the fuller it is.
Video encoding technology is also highly intelligent and will adjust to the scene, using less data to get a similar image in scenes that really don't require it. Very efficient.
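That "adjust to the scene" behaviour is basically what constant-quality (CRF) encoding does. A minimal sketch, assuming ffmpeg with libx264 is installed and on PATH, and using a made-up input filename, of constant quality versus a pinned data rate:

```python
# Sketch: constant-quality vs. fixed-bitrate encoding with ffmpeg/libx264.
# Assumes ffmpeg is installed and on PATH; "input.mp4" is a placeholder.
import subprocess

SRC = "input.mp4"  # hypothetical source file

# Constant quality (CRF): the encoder varies the data rate per scene,
# spending less on simple scenes and more on complex ones.
subprocess.run(
    ["ffmpeg", "-i", SRC, "-c:v", "libx264", "-crf", "20",
     "-preset", "slow", "crf_output.mp4"],
    check=True,
)

# Capped bitrate: the "container of water" approach - the data rate is
# pinned regardless of how complex each scene actually is.
subprocess.run(
    ["ffmpeg", "-i", SRC, "-c:v", "libx264", "-b:v", "8M",
     "-maxrate", "8M", "-bufsize", "16M", "cbr_output.mp4"],
    check=True,
)
```

With CRF the file size varies with the content; with the capped bitrate the size is predictable, but simple scenes waste bits and complex scenes starve.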
 
Why are we so accepting of crap technology
Sadly, the average non-techie consumer wouldn't recognise a good TV picture even if it jumped up and bit them on the backside. What they do understand is bigger numbers. 100W is bigger than 50W, and 500W is bigger than 100W. When it comes to TVs it's screen size and pixel count that are easier concepts to grasp than dynamic range and colour depth.

The industry could have gone for HDR and WCG in 1080p displays except for one problem. 1080p HD in the consumer TV world is established based on 8-bit colour. There's not enough bit depth to support the extra colours, and without that there's limited benefit in applying the gamma curves to try to make 8-bit do HDR.

UHD resolution brought with it support for 10-bit and 12-bit colour, and without that there's no HDR and no WCG.
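For anyone wondering what the extra bit depth actually buys, the arithmetic is trivial (a quick sketch; the numbers follow directly from the bit depths):

```python
# Shades per channel and total colour combinations at each bit depth.
for bits in (8, 10, 12):
    shades = 2 ** bits      # levels per colour channel
    colours = shades ** 3   # combined R*G*B combinations
    print(f"{bits}-bit: {shades} shades/channel, ~{colours:,} colours")

# 8-bit:  256 shades/channel,  ~16,777,216 colours
# 10-bit: 1024 shades/channel, ~1,073,741,824 colours
# 12-bit: 4096 shades/channel, ~68,719,476,736 colours
```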
 
I don't particularly care for 4K either; although I can 'see' the resolution increase, 1080p is still plenty good enough.

If I could get every source on my OLED at 1080p Dolby Vision with good bitrates, I would take that in a heartbeat over some of the shoddy 4K-quality content.
 
Sadly, the average non-techie consumer wouldn't recognise a good TV picture even if it jumped up and bit them on the backside.
Which is sadly why most people who see an accurate or near-accurate TV calibration will complain that the image is dull, the whites look too yellow, and the colours don't pop.
 
"Native" for video is constrained by data rate; it is impossible to truly output RAW 4K video, and we still don't really have the hardware, at least cheaply, to even watch RAW 4K.

4K RAW isn't quite that bad - for editing you need fairly substantial power, and you won't be playing 4K RAW on a tablet or media stick, etc., but for playback any reasonably specced current-gen PC can handle it (depending a bit on colour depth and frame rate), and even my ageing 2013 Xeon-based system can (albeit it is built for that kind of stuff).

But there are also "lossless" or very close to lossless codecs which can reproduce 1:1 or very close to the RAW image - not all compression algorithms/profiles are lossy.

While you are right that the video stream is closer to a description of the final image - using key-frames and so on - than a plain store of RGB data, there is still an element of downscaling going on. One method video encoders use is to sacrifice detail (often colour resolution) in areas that "should" be less noticeable in order to preserve detail in areas that need it. So even when the bitrate is way below what full detail at a given resolution would require, some elements of the scene can still be kept at a much higher level of detail than that bitrate would support uniformly at native resolution.
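To put numbers on why nobody ships RAW to the living room, here's a quick back-of-the-envelope sketch (assuming 8 bits per sample and 60 fps; the 4:2:0 line is the "sacrifice colour resolution" trick mentioned above, averaged out, not any specific codec):

```python
# Uncompressed 4K data rate, and the saving from 4:2:0 chroma subsampling.
width, height, fps = 3840, 2160, 60
bits_per_sample = 8

# Full RGB / 4:4:4 - three full-resolution samples per pixel.
full_bits = width * height * 3 * bits_per_sample * fps
# 4:2:0 - full-resolution luma, colour stored at quarter resolution,
# which averages out to 1.5 samples per pixel.
sub_bits = width * height * 1.5 * bits_per_sample * fps

print(f"4:4:4  ~{full_bits / 1e9:.1f} Gbit/s (~{full_bits / 8e9:.2f} GB/s)")
print(f"4:2:0  ~{sub_bits / 1e9:.1f} Gbit/s (~{sub_bits / 8e9:.2f} GB/s)")
# ~11.9 Gbit/s vs ~6.0 Gbit/s - still orders of magnitude above a
# typical 4K streaming bitrate of a few tens of Mbit/s.
```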
 
If you won't take it from me, take it from a pro.
https://filmora.wondershare.com/video-editing-tips/what-is-video-bitrate.html

8K video... look how stunning it looks!

Sexy ASF on 1440p!

Buys an 8K screen, sits 50mm from the screen... macroblocking everywhere!

RIP all content you are watching... you literally gain nothing from it at any normal viewing distance.
As I said before, you're confusing compression with scaling (resizing). https://en.wikipedia.org/wiki/Image_scaling

Video content on YouTube is really not a great way to compare 4K and 1440p resolutions. If you perceive better quality by selecting a resolution higher than your 1440p monitor, it's because YouTube encoded the higher-resolution video with better quality settings. It's also possible that the downscaling from 4K to 1440p creates artifacts that give it an artificial sharpness, which could be perceived as better quality at first glance but is actually worse.

I've owned a 27" 1440p monitor and a 28" 4K monitor; the difference is significant for video content and gaming, and it's night and day for general desktop use, especially text clarity. You would get an even greater benefit upgrading from a 32" 1440p monitor with its 92 PPI (pixels per inch).
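For reference, the pixel-density figures are easy to check yourself; a small sketch using the sizes mentioned in this thread:

```python
# Pixels-per-inch for the monitors discussed in this thread.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
print(f'28" 4K:    {ppi(3840, 2160, 28):.0f} PPI')  # ~157 PPI
print(f'32" 1440p: {ppi(2560, 1440, 32):.0f} PPI')  # ~92 PPI
```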
 
I run a 27" 1440p 170 Hz monitor, and no, it would not be a significant upgrade at all, as I do no real critical work; I just enjoy the panel technology because I enjoy visual quality.
An RTX 3070 is also not a great card for 4K.

https://www.rtings.com/monitor/reviews/gigabyte/m27q

The other 32" monitor is VA and is for use at a distance, not sat right at it, hence why I chose VA for its deeper blacks; it won't be great at pixel response.

The U32E2N isn’t a model I’ve received much direct feedback on, but it essentially uses the flat version of the panel used in the Philips 328E1CA we’ve reviewed and recommend. I can’t argue with the price of the U32E2N and I’ve actually been keen to review it, but unfortunately it isn’t currently available in the US where the bulk of users who support our work are located. I rate that panel above most other competing VA models of the resolution which use an Innolux panel that I consider inferior. AOC tends to calibrate their monitors pretty well and give important adjustments including gamma settings. So I’m sure it’s a good product. The screen surface is less grainy than the BOE WQHD panel as well.

If you consider your work to be colour-critical or this is a key focus or concern for you, I’d generally recommend sticking to IPS-type panels. But the U32E2N will be a good choice as far as VA models go and if it’s mainly for mixed usage and you don’t really consider colour consistency of vital importance then you may find it absolutely fine. And as a final point, please avoid using the term ‘2K’ to describe the WQHD or 1440p resolution. Especially when used in the context of ‘4K’ (which is the intended context), it undersells the resolution and is misleading and inaccurate. Nothing personal, I point this out to everyone who uses the term on the forum.

https://forum.pcmonitors.info/topic/would-you-recommend-the-aoc-u32e2n-or-the-q32e2n/
 
I can notice a difference between 1080p and 4K on my OLED, sitting around 2m away. It's noticeable in movies and sport. Hugely noticeable between BT Sport and BT Sport Ultimate with the football.
BT Sport isn't really giving you 1080p and Ultimate isn't really giving you 4K; you just get the container, not the data rate. If you are used to poor visuals then anything is an upgrade, and Ultimate is "4K".

Almost forgot to say, there is zero harm in this either, as it doesn't affect me and you are getting nice visuals for your money. Be happy is what I say ;)
 
Still to this day I cannot see a visual difference between a high-end 1080p screen and a 4K one.
The only differences I found were from panel quality and the benefits of colour reproduction, not resolution.

Why are people so hell-bent on not accepting a superior panel instead of more pixels?

I sit about 10 feet from my TV, which is by no means a long distance. In stores I have to get really close to the TV for the resolution to seem like a visual jump, whilst I can instantly tell a quality panel from a poor one purely from colour reproduction and response time.

Why are we so accepting of crap technology that only really benefits professional workloads, not the end consumer, especially us casuals?

Would it not make sense to push for the best panel quality, and raise prices based on that, rather than trying to push higher res?

This came into my head after putting down money for a 32-inch QHD monitor, which will be used with games consoles for media purposes and the odd game or two (the Xbox One X can push 1440p downsampled from its 4K output).

After looking around...

https://www.avforums.com/threads/is-there-any-point-in-4k.2251217/
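The 10-feet point above is worth quantifying. A rough sketch, assuming roughly 1 arc-minute of visual acuity (20/20 vision) and 16:9 panels; the screen sizes are just examples:

```python
# Approximate distance beyond which individual pixels can no longer be
# resolved, assuming ~1 arc-minute of visual acuity (a simplification).
import math

ARC_MINUTE = math.radians(1 / 60)

def pixel_blend_distance_ft(diagonal_in: float, horizontal_px: int) -> float:
    """Distance (feet) at which one pixel subtends ~1 arc-minute on a 16:9 panel."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # horizontal width of the panel
    pixel_pitch_in = width_in / horizontal_px
    return pixel_pitch_in / math.tan(ARC_MINUTE) / 12

for size in (55, 65):
    print(f'{size}": 1080p pixels blend beyond ~{pixel_blend_distance_ft(size, 1920):.1f} ft, '
          f'4K beyond ~{pixel_blend_distance_ft(size, 3840):.1f} ft')

# 55": ~7.2 ft vs ~3.6 ft;  65": ~8.5 ft vs ~4.2 ft
```

By this simplified model, at a 10-foot distance even a 65" 1080p panel's pixels are already too small to resolve, which is exactly the point being made about normal viewing distances.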
UHD footy vs HD footy. It is no contest. It is completely apparent IMO.
 
Correct, because the data rate on 4K is higher, but you also gain HDR.

I mean, it's not because it is UHD at all, but purely because you are being given a better data rate for the image, with HDR to boot, and HDR is a panel feature, not resolution.
It is apparent, but when we break it down, you are not gaining from the 4K itself.

The HD and UHD formats both have a huge amount of macroblocking, but viewed at a normal distance, because the HD stream's data rate is so sub-par, the 4K looks like a night-and-day difference.

The difference between a 1080p Blu-ray and a 4K Blu-ray is mostly colour grading, and depends on how the films were mastered to the discs.

A top-quality 1080p Blu-ray next to a top-quality 4K Blu-ray will show no major advantage other than far better colours and HDR, plus the fact that if you sit close enough you won't see pixels on the 4K screen - though you may want to ask your 4-year-old if they want square eyes by the time they are 30. ;)
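To illustrate the disc comparison with some very rough numbers (these bitrates are ballpark assumptions for the sake of the arithmetic, not measurements from any specific disc, and UHD discs use the more efficient HEVC codec):

```python
# Illustrative bits-per-pixel comparison of a 1080p vs UHD Blu-ray stream.
# The bitrates below are ballpark assumptions, not measured values.

def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float = 24.0) -> float:
    return bitrate_mbps * 1_000_000 / (width * height * fps)

print(f"1080p Blu-ray @ ~30 Mbps: {bits_per_pixel(30, 1920, 1080):.2f} bits/pixel")
print(f"UHD Blu-ray  @ ~60 Mbps: {bits_per_pixel(60, 3840, 2160):.2f} bits/pixel")
# Doubling the bitrate while quadrupling the pixel count roughly halves the
# bits available per pixel - which is the point being made above about the
# visible gains coming mostly from HDR, colour and mastering.
```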
 

The question is how much of that is apparent on a 4K display vs 1440p. With the **** quality of a lot of media it isn't as huge as it should be especially at longer viewing distances - but there are definitely plenty of advantages to 4K and the difference is easily apparent in many cases.
 
Correct, because the data rate on 4K is higher, but you also gain HDR.
UHD F1 vs HD F1 then :) both SDR :)

The question is how much of that is apparent on a 4K display vs 1440p. With the **** quality of a lot of media it isn't as huge as it should be especially at longer viewing distances - but there are definitely plenty of advantages to 4K and the difference is easily apparent in many cases.
That might be difficult to distinguish. But yeah I get your point.
 
I've owned a 27" 1440p monitor and a 28" 4K monitor; the difference is significant for video content and gaming, and it's night and day for general desktop use, especially text clarity. You would get an even greater benefit upgrading from a 32" 1440p monitor with its 92 PPI (pixels per inch).

Agree. I have experienced a few monitors/TVs for both work and home, and my 4K monitor, albeit not high spec, makes the quality difference noticeable when watching regular 1080p. Being further away makes it less of a problem, so if you sit metres away from a display and you rank value highly, there is no point splashing out the extra.

The question is how much of that is apparent on a 4K display vs 1440p. With the **** quality of a lot of media it isn't as huge as it should be especially at longer viewing distances - but there are definitely plenty of advantages to 4K and the difference is easily apparent in many cases.

Agree on this - it has definitely been my take on it for a while now.

I run a 27" 1440p 170 Hz monitor, and no, it would not be a significant upgrade at all, as I do no real critical work; I just enjoy the panel technology because I enjoy visual quality.
An RTX 3070 is also not a great card for 4K.

*Cough* @TNA :p

Why is this thread full of YouTube videos complaining about 4K? YouTube is a known provider of utter trash compression across all resolution ranges.

Agree, I thought this was a given.
 
Right...

Go find any evidence for the reverse of what I said and we shall all wait.

*Calmly chewing intensifies*.

Maybe this guy can help you... oh, it's on my side... damn.

In the space of a short time, with you speaking from your butt about things you have no clue about... I found videos.
I can find articles in my favour, with real evidence, far more easily than you can find in favour of 4K.
 