4K TV - 10ft away?

Soldato
Joined
29 May 2006
Posts
5,353
Wow, just wow. Picking different things on Netflix doesn't only change the pixel count.

Why would they lie? Lying only damages sales, and such TVs just didn't exist, especially as that chart was out at least 5 years ago, if not 10.
It's not perfect, but you can watch the same film on Netflix, change the resolution it streams at, and see a large difference. The same applies to taking a film and encoding it with all the same settings, bitrate and everything, with the only difference being the resolution. It's not hard to encode a film at 720p and then again at the same settings at 1080p.
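
For anyone who wants to try it, here is a rough sketch of the sort of encode I mean (file names are just examples, and it assumes ffmpeg with x264 is available; only the scale filter differs between the two runs):

```python
import subprocess

SOURCE = "bluray_rip.mkv"  # hypothetical source file

# Encode the same source twice: identical codec, preset and bitrate,
# with the output height as the only variable.
for height in (720, 1080):
    subprocess.run([
        "ffmpeg", "-i", SOURCE,
        "-vf", f"scale=-2:{height}",                         # only this changes
        "-c:v", "libx264", "-preset", "slow", "-b:v", "8M",  # held constant
        "-c:a", "copy",
        f"test_{height}p.mkv",
    ], check=True)
```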

I have no idea why the chart is wrong, but as you said it's at least 5 years old, and it clearly doesn't match up with reality. Even in a blind test I can see the difference between resolutions of the film, and it's not just me; in lots of threads many people are saying they see a difference.
 
Soldato
Joined
29 May 2006
Posts
5,353
I wonder why people would say that; I wonder why people find new TVs so much better.
It's got nothing to do with new setups; even old setups and blind tests show a difference. My 1440p screen is over 7 years old, and even though it's not full 4K, there is a noticeable difference between 4K content and 1080p content.

A few months back my wife put Jack Reacher on my 100" screen without me knowing she had picked a 720p version, and I could instantly tell it was low resolution. At work I create a lot of short films, normally at 1080p, but it's a simple thing to create 720p versions with all other settings the same, and they too show a massive difference. It's not hard to download a 4K video and encode three versions with the same settings apart from resolution: 720p, 1080p and 4K.
 
Man of Honour
Joined
11 Mar 2004
Posts
76,634
Keep thinking that, but all you have is biased anecdotal evidence. It's gone from Netflix, which you can't control, to creating your own stuff. Basically, in all your examples you aren't just changing pixels.

There's a reason that wherever you look, the data, although it changes marginally between manufacturers and companies, is all within the ballpark of that graph.

People can disagree as much as they want, but it's little different to the expensive digital cable brigade: see and hear what you want.
 
Man of Honour
Joined
9 Jan 2007
Posts
164,580
Location
Metropolis
I can see the difference on Netflix between HD and 4K at 13 feet with a 60 incher. I showed that graph to my wife before we bought it, but she didn't want me getting a 100 inch screen, I'm not sure why :p.

It's amazing how you can use that graph to show your other half that your choice of 4K screen is the correct one for the living room. They will believe anything. :p:D
 
Soldato
Joined
29 May 2006
Posts
5,353
Keep thinking that, but all you have is biased anecdotal evidence. It's gone from Netflix, which you can't control, to creating your own stuff. Basically, in all your examples you aren't just changing pixels.

There's a reason that wherever you look, the data, although it changes marginally between manufacturers and companies, is all within the ballpark of that graph.

People can disagree as much as they want, but it's little different to the expensive digital cable brigade: see and hear what you want.
You are very wrong here. How is my own example not just changing pixels, when the only difference between the versions of my own film was the pixel count? I used identical films encoded with identical settings apart from resolution. How else would you test out the resolution difference?

How is it biased or anecdotal? I did extensive testing before setting up my cinema room and found resolution was one of the most important factors for image quality, and it was very noticeable outside the ranges that graph gives. I borrowed lots of equipment and did lots of blind tests, both on myself and on other people.

Netflix you can partially control: there is an advanced options menu which lets you choose the streaming resolution and bitrate, so you can watch the same film at 720p, 1080p or 4K where supported. As for my other testing, I had identical films encoded at the same bitrate and settings with the only difference being resolution, and there was a clear difference between them. I didn't just use one film; some of the tested films were ones I created, and others were films like The Hunger Games: Mockingjay Part 1 encoded at different resolutions.

For my 3D graphics screensaver I tested the same one at different resolutions via a PC and HDMI cable, and it was a night and day difference between 720p, 1080p and 4K, both on the same display and on different displays. I tried a native 720p panel, a native 1080p panel running at both 720p and 1080p, then 1440p and so on. It was a very clear difference.

I tried screen sizes from 27" all the way up to 92" and 100" at 720p, 1080p and beyond. I admit a few people couldn't tell the difference, but in a blind test the majority of people I tested picked out the higher resolution film and PC-rendered graphics with ease, up to 6 metres away.

This is nothing like the difference between expensive digital cables; there is a clear, noticeable difference between 720p, 1080p and 4K on large screens. For myself, I could tell the difference 100% of the time in a blind test, and it was very noticeable.

Have you ever done a test where you downloaded or created a 4K video and then encoded the content at the same settings apart from resolution, at 720p, 1080p and 4K? A few people cannot tell the difference, but most can, by a large amount.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
It's not perfect, but you can watch the same film on Netflix, change the resolution it streams at, and see a large difference. The same applies to taking a film and encoding it with all the same settings, bitrate and everything, with the only difference being the resolution. It's not hard to encode a film at 720p and then again at the same settings at 1080p.


Well, that's obvious. If you do anything other than watch a native-res encode, then you're doing a lot more than just watching the same thing at a different resolution.

Those charts aren't rubbish; they have at least some truth based on the limitations of human visual acuity, but I have never checked them for accuracy. I think I'll have a look at the maths behind it all and see if it all adds up.

pottsey said:
How is my own example not just changing pixels, when the only difference between the versions of my own film was the pixel count? I used identical films encoded with identical settings apart from resolution. How else would you test out the resolution difference?
One word: scaling.
 
Permabanned
Joined
24 Apr 2014
Posts
5,258
Location
Caledonia
The difference between sitting back on my couch and leaning forward is significant in how much detail I can see. Unfortunately the mrs won't let me move the couch forward a foot because it would look weird sitting on the edge of the rug. Women :mad:
 
Soldato
Joined
1 Mar 2010
Posts
21,916
(This needs to be a sticky thread to end the groundhog-day discussion.)
If you want to be objective, download the Bale spreadsheet and you can play (a quick check of the numbers is sketched at the end of this post):
- if you have 20/20 vision, on a 55" you can only resolve 4K pixels at less than 3.6' distance, or 7.2' with 20/10 (20/20 and 20/10 are the 4th and 2nd lines from the bottom of the eye chart)
- with 20/20 eyesight and a 1080p 55", you can see the pixels at less than 7.2'
- THX's longest recommended distance for a 55" TV is 6.1' (a 36-degree viewing field)
Reposting this too:

...but moreover, looking up at a TV screen from bed, I would be concerned by the contrast drop when viewing off-axis (if it is not an OLED) :D
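
A minimal sketch of the acuity maths behind those figures, for anyone who wants to play without the spreadsheet (my own approximation, assuming a 16:9 panel and the standard 1/60-degree resolving power):

```python
import math

def resolve_distance_ft(diagonal_in, horizontal_px, acuity=1.0):
    """Farthest distance (feet) at which an eye resolving 1/60 of a
    degree (20/20; use acuity=2.0 for 20/10) can still separate
    adjacent pixels on a 16:9 screen."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # panel width
    pixel_in = width_in / horizontal_px              # pixel pitch
    angle = math.radians(1 / 60) / acuity            # smallest resolvable angle
    return pixel_in / math.tan(angle) / 12           # inches -> feet

print(round(resolve_distance_ft(55, 3840), 1))       # ~3.6 (55" 4K, 20/20)
print(round(resolve_distance_ft(55, 3840, 2.0), 1))  # ~7.2 (55" 4K, 20/10)
print(round(resolve_distance_ft(55, 1920), 1))       # ~7.2 (55" 1080p, 20/20)
```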
 
Soldato
Joined
29 May 2006
Posts
5,353
Well, that's obvious. If you do anything other than watch a native-res encode, then you're doing a lot more than just watching the same thing at a different resolution.

Those charts aren't rubbish; they have at least some truth based on the limitations of human visual acuity, but I have never checked them for accuracy. I think I'll have a look at the maths behind it all and see if it all adds up.

One word: scaling.
But I tested both by scaling and by changing equipment with native and non-native resolutions, along with using 3D content produced by the graphics card, so there was no scaling for that. In every combination I could test, there was a noticeable difference between 720p, 1080p and higher. In all the combinations the higher resolution looked better, and even in a blind test it was chosen as the better one. Even at a distance where that chart says we cannot see a difference, almost everyone chose the higher resolution correctly and said it looked better. A few people couldn't tell the difference between films at distance, but everyone could tell the difference in 3D rendered graphics like http://www.serenescreen.com/images/product/maquarium3/Screenshot2.png at a distance.

Personally I don't find those charts accurate at all.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
You had everybody in the room testing different panels back to back?

I don't know if I made it obvious, but if you're scaling then forget it. Scaling and running non-native resolutions have a massive effect on picture quality. For example, 1080p on a 50" 4K panel is almost always worse than 1080p on a 50" 1080p panel; non-native resolutions ruin IQ.

Those charts are supposed to give you a baseline where all other things are equal. You can't then introduce scaling and oddball resolutions and wonder why the charts don't stack up.
 
Soldato
Joined
29 May 2006
Posts
5,353
You had everybody in the room testing different panels back to back?

I don't know if I made it obvious, but if you're scaling then forget it. Scaling and running non-native resolutions have a massive effect on picture quality. For example, 1080p on a 50" 4K panel is almost always worse than 1080p on a 50" 1080p panel; non-native resolutions ruin IQ.

Those charts are supposed to give you a baseline where all other things are equal. You can't then introduce scaling and oddball resolutions and wonder why the charts don't stack up.
Although it was mostly projectors, we did test a few panels, like a 1440p screen, and yes, at times back to back, sometimes side by side, and other times separately. Sometimes without blind tests, sometimes with. In 2013 I did about 8 months of extensive testing, not just looking at resolution, although that was one key factor. We also looked at screen material, screen sizes from 27" to 100", black levels, and different lighting conditions like summer and winter light. Some tests were repeated on different days. Yes, we did test with scaling and without. We did non-native and native, sitting from 6 feet to 6 metres away most of the time, although one hall was 10+ metres.

To cut a long story short, the conclusion we came to is that there is a massive difference between 720p, 1080p, 1440p and higher. I even found cheap 1080p setups massively outperformed high-end expensive 720p setups due to resolution. More recently we even tested different screen materials, from 0.8 gain to 1.5 gain, with grey, white and black screen material.

Despite what Glaucus says, he is wrong in saying there isn't any benefit and wrong in saying it's all other factors. Although other factors can have an impact, it's not all down to them; a large amount comes down to resolution. One area where some people are going wrong is that even when you cannot see the full benefit, you can often still see a noticeable benefit. You might not see the full benefit of 4K at certain distances, but those distances can still be noticeably better at 4K than 1080p.
 
Man of Honour
Joined
11 Mar 2004
Posts
76,634
Haha, so in your first post you were testing Netflix; now it's gone all the way to that, step by step, after several people have pointed stuff out.
Despite the lies you keep making up, you are the one in the wrong.
 
Soldato
Joined
1 Mar 2010
Posts
21,916
How did you mitigate against the source material biasing the results?
If the source material was exclusively 4K that you re-encoded/scaled to 1080p or 720p, then the scaling algorithms themselves are going to prejudice the 1080p/720p IQ (in the same manner as a non-native resolution display), versus, say, having an 8K source from which to derive the 1080p/720p encodes; this is transcoding at the end of the day.
(For a proper test you would need something like The Martian, shot at 8K was it?, then mastered at both 4K and 1080p.)
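
It is also worth saying that the downscaler itself is a variable you can hold constant. Here is a hypothetical variation on the earlier encode sketch, pinning the scaling algorithm so the scaler choice cannot bias the comparison (file names are examples only):

```python
import subprocess

# Downscale with the algorithm pinned to lanczos, so the scaler
# choice is explicit and repeatable across test encodes.
subprocess.run([
    "ffmpeg", "-i", "master_4k.mkv",       # example 4K (or 8K) master
    "-vf", "scale=-2:1080:flags=lanczos",  # pinned scaling algorithm
    "-c:v", "libx264", "-preset", "slow", "-b:v", "8M",
    "-c:a", "copy", "downscale_1080p.mkv",
], check=True)
```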

Did you establish at what distance/screen size you did not see a benefit from 4K, and hence how the chart should be adjusted?

Also, in the projector domain, presumably the results may be very different to emissive screens: with reflections off the screen at different angles from the point-source projector, individual pixels experience a lot more blurring, no? (I had read previously, with respect to cinema screens, that you do not see strong pixellation close up as you do on an emissive display.)
 
Soldato
Joined
29 May 2006
Posts
5,353
Haha, so in your first post you were testing Netflix; now it's gone all the way to that, step by step, after several people have pointed stuff out.
Despite the lies you keep making up, you are the one in the wrong.
Go back and reread, as my first post with Netflix was also using a Blu-ray encoded down to lower resolutions. Encoding a film at the same settings to different resolutions was not something I added after people pointed stuff out. So you are wrong again.

I brought up Netflix as it's the easiest way for others to relate to and do their own tests. It's not perfect, but it's the easiest way to watch the same film at different resolutions without having to spend hours encoding: just pick the resolution to stream at in the advanced options. Switching streaming resolutions matches up with the same results I see when using a Blu-ray encoded at the same settings and bitrate but at different resolutions, when watching different native resolution content, or when running a 3D program without scaling.

You skipped my question. Have you ever done any testing yourself? Have you ever tested the difference between 720p, 1080p and 4K? Before you call what I say lies, you should really give it a go, and not just with films but also with programs like http://www.serenescreen.com/product/maquarium3/info/screenshot.php which show a massive difference between 720p, 1080p and beyond.

As for lying, I have been talking about projectors and my job role as a technician for years. Not often, granted, but do you really think I spent years doing that just in case I needed a background to make stuff up in a thread like this? I have many photos on my phone from over the years of the different test setups and final setups, both at home and at work.

Everything in my experience says you are in the wrong, and I see no decent evidence from you to prove otherwise. I have installed and tested a large number of setups across multiple sites, from small home setups to setups suitable for large halls. I even borrowed equipment to test at home before choosing my own setup. What experience do you have?
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
(This needs to be a sticky thread to end the groundhog-day discussion.)
If you want to be objective, download the Bale spreadsheet and you can play:
- if you have 20/20 vision, on a 55" you can only resolve 4K pixels at less than 3.6' distance, or 7.2' with 20/10 (20/20 and 20/10 are the 4th and 2nd lines from the bottom of the eye chart)
- with 20/20 eyesight and a 1080p 55", you can see the pixels at less than 7.2'
- THX's longest recommended distance for a 55" TV is 6.1' (a 36-degree viewing field)
Reposting this too:

...but moreover, looking up at a TV screen from bed, I would be concerned by the contrast drop when viewing off-axis (if it is not an OLED) :D

Exactly this. There is a reason I said the chart is based on visual acuity, and that's because it is. Pottsey, read it and absorb it all; it's good information. The only thing worth adding is that it's all based on 20/20 vision, but people can of course exceed that. 20/20 is the fourth line from the bottom on the Snellen chart (those letters you read when you have a vision test). It's not uncommon for people to be able to read every line (which would equate to 20/10 or 20/5, not sure which), be that without glasses or after correction.
 
Man of Honour
Joined
11 Mar 2004
Posts
76,634
Wow, more BS. I don't need to do tests, and I can't (just like you) perform tests without many errors, which you keep ignoring; then when they're pointed out you keep coming up with bigger lies.

Go read anything about visual acuity, what your eye can resolve, field of view etc. This is what you need to know and what the charts are based on; it is scientific and doesn't include a shedload of other factors. You cannot beat the science and maths; you are just plainly and simply wrong.

For a 65" 4K TV the ideal viewing distance is 4.3 ft, which is based on the following:
The Ideal Viewing Distance based on Visual Acuity. This distance is calculated based on the reference resolving power of the eyes. The human eye with 20/20 vision can detect or resolve details as small as 1/60th of a degree of arc. This distance represents the point beyond which some details in the picture are no longer able to be resolved, so pixels begin to blend together. Closer to the screen than this may result in the need for a higher resolution display. This value should be lowered if visual acuity is worse than 20/20, and raised if visual acuity is better.

which, wait for it, marries up with the chart.
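
As a rough sanity check of that figure (my own arithmetic, assuming a 16:9 panel): a 65" 16:9 screen is about 56.7" wide, so the 4K pixel pitch is 56.7 / 3840 ≈ 0.0148". The distance at which one pixel subtends 1/60 of a degree is 0.0148 / tan(1/60°) ≈ 50.7", i.e. about 4.2 ft, within rounding of the 4.3 ft quoted.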
 
Soldato
Joined
29 May 2006
Posts
5,353
“Wow, more BS. I don't need to do tests, and I can't (just like you) perform tests without many errors, which you keep ignoring; then when they're pointed out you keep coming up with bigger lies.”
What errors and what lies? I have not lied once.

Notice how it says some details in the picture are no longer able to be resolved. That doesn't mean we suddenly lose all the extra detail and are no longer able to see any difference. As I said before, I was not talking about seeing every single pixel and the full benefit. I was talking about a noticeable difference and being able to tell which is which between 720p, 1080p and beyond.

You keep missing that the ideal viewing distance doesn't mean people cannot see a difference outside the ideal range. When doing my blind tests, 100% of people could tell the difference in resolution with that 3D screensaver, so no scaling, and close to 100% when watching films. It helped that everyone I tested was under 50, but still, it was a clear, fair test that shows there is a difference with 4K.



“For a 65" 4K TV the ideal viewing distance is 4.3 ft, which is based on the following”

4.3 ft is the minimum distance to view 100% of the extra detail. Based on normal eyesight of about 20/14 at age 20, which slowly deteriorates to about 20/19 by age 75, your average healthy 20 to 50 year old will be able to see a 4K difference out to 10 feet away and beyond on a 65" 4K TV. They might not see all the extra detail, but enough to make a difference even out to 10 feet. If you can read any of the bottom 3 rows of the Snellen chart, which is common for the under-50s, then you should see a difference.
 
Soldato
Joined
20 Jun 2004
Posts
5,903
Location
Essex
Everything in this thread suggests a massive waste of time.

The chart is correct.

Most aren't even getting the full benefit of 1080p in UK living rooms, and 4K on a 55" TV is pretty unimpressive as it's too small a screen size.

4K projectors need to come down in price!
 
Man of Honour
Joined
11 Mar 2004
Posts
76,634
I don't know, perhaps all the errors pointed out in this thread that you haven't done anything about.

Well done, I even said it was a minimum, but guess what, it tallies up with the chart, yet you say the chart is wrong. You can do the calculation for the other end if you wish.

The only person who hasn't given any evidence is you. There is plenty of information out there on acuity, field of view etc; go read it. No one else is agreeing with you. I wonder why.
 