A HD Question....

Just came across the following statement:

We should point out that all of these sets display a native 1366 x 768 pixels, so "1080 HD" content (such as 1080i HD TV channels or 1080p Blu-ray or HD DVD movies) has to be scaled down to suit the screen, which robs the content of some of its richness.

I have a 37" Samsung (1080i HD) which has a native resolution of 1366 x 768. Does this mean that because it will scale down the Blu-ray content, the picture will suffer? And can it be improved?
 
Depending on how close you are sitting, the difference between a 1080p picture and a 720p picture can be very hard to tell. I think most people say that beyond about 3 m, unless you are using a very, very large screen, they look more or less alike.
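That viewing-distance claim can be sanity-checked with some rough geometry. The sketch below assumes the common 1-arcminute visual acuity rule of thumb and a 16:9 screen; the exact figures depend on eyesight and content, so treat the numbers as ballpark only:

```python
import math

def max_useful_distance_m(screen_diag_inches, horiz_pixels, aspect=16 / 9):
    """Rough distance beyond which individual pixels can no longer be
    resolved, assuming ~1 arcminute of visual acuity (a common rule of
    thumb, not a hard limit)."""
    diag_m = screen_diag_inches * 0.0254
    # Screen width from the diagonal and aspect ratio.
    width_m = diag_m * aspect / math.hypot(aspect, 1)
    pixel_m = width_m / horiz_pixels
    # A pixel subtends 1 arcminute at distance d when pixel_m = d * tan(1/60 deg).
    return pixel_m / math.tan(math.radians(1 / 60))

# On a 37" screen, pixel-level detail stops being resolvable at roughly:
print(round(max_useful_distance_m(37, 1920), 1))  # ~1.5 m for 1080p
print(round(max_useful_distance_m(37, 1280), 1))  # ~2.2 m for 720p
```

So at typical sofa distances of 2-3 m from a 37" set, even 720p detail is near the limit of what the eye can resolve, which fits the "more or less alike" observation above.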

IMO you are better off sending something at the native resolution of your screen rather than resampling it, so in your case I would just use a 720p signal (given that progressive scan is supposedly also better quality than an interlaced signal).
 
Yes, it will downscale 1080i and upscale 720p.

Not really much you can do about it.

It is still beyond me quite why they ever chose to use that resolution, it makes absolutely no sense at all.
 
sending the native res is always preferable, but in the case of a '720p' panel, more often than not that's actually 1366x768, aka 768p. devine is quite right, it's a ridiculous resolution to have chosen for 720p content, as it means virtually everything apart from a 768p signal sent from a pc will have to be resampled, and as there is no 768p content that i know of... well, you get the idea. i've no idea why they chose that either.


the next best thing in theory is to send a signal larger than the resolution of the panel so the panel can then reduce it, i.e. send 1080i or 1080p and let it scale down. this is preferable to sending a lower resolution and scaling up. you can make something sharper and 'clearer' by scaling down, but you can't do the same scaling up, so the picture can suffer.

however... most (if not all) screens that aren't 1080p panels won't accept a 1080p signal; the maximum they'll accept is 1080i. this is fine, but it's common to find panels that don't cope with interlaced signals very well (fast action and panning camera scenes are affected, amongst other things), so you may very well find that the panel actually does better overall when upscaling a 720p signal. this issue is further complicated when playing blu-rays or hd-dvds, as, for a '720p' panel, the player would first have to scale down to 720p, and then the tv itself would scale it back up to 768p. crazy.
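the loss from that double-scaling chain can be seen with a toy model. this tracks which source pixels survive nearest-neighbour resampling (real scalers interpolate and are far more sophisticated, so this only sketches the principle, not actual picture quality):

```python
def resample_indices(n_old, n_new):
    """Which source sample each output sample is taken from,
    using nearest-neighbour resampling of a 1-D scanline."""
    return [i * n_old // n_new for i in range(n_new)]

direct = resample_indices(1080, 768)            # tv scales 1080 -> 768 in one step
player = resample_indices(1080, 720)            # player scales 1080 -> 720 first...
chain = [player[j] for j in resample_indices(720, 768)]  # ...then tv scales 720 -> 768

print(len(set(direct)))  # 768 distinct source pixels survive the single step
print(len(set(chain)))   # 720 distinct source pixels; 48 outputs are duplicates
```

scaling down once keeps a unique source pixel behind every output pixel, but the 1080 -> 720 -> 768 chain can never recover more than 720 of them, so the final 768 lines contain duplicated information.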

your mileage may vary, but try it all and see what works best.
 
you can't possibly quantify it, there are too many variables. it's dependent on the player's scaler and the tv's scaler, the tv's ability to handle interlaced signals... it really is down to try it and see, unless somebody with the same tv and setup can give you their findings.
 
Just output whatever; it will still be vastly superior to a 576p signal, whether on an SD set or upscaled onto an HD set.

And I would think the reason they went for 1366x768 is that back then most panels were made for monitor use, and it is the widescreen version of 1024x768. Thus it was too costly to make some panels 1366x768 and others 1280x720, when the aspect ratio is about the same.
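The arithmetic behind that explanation checks out, assuming the standard 16:9 widescreen ratio:

```python
# 1024x768 monitor panels are 4:3; keeping the 768 lines and
# widening to 16:9 gives the odd-looking 1366 figure:
widescreen_width = 768 * 16 / 9
print(widescreen_width)        # 1365.33..., rounded up to an even 1366

# The 768-line panel actually has more pixels than a true 720p one:
print(1280 * 720, 1366 * 768)  # 921600 vs 1049088
```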
 
It would just have been nice to have some consistency.

Panels easier to make in 768p? Then make 768p the HD resolution rather than 720p; or, if 720p panels are a lot easier to make than 768p ones, make the panels 720p.

There is no good reason they shouldn't have matched up really.
 
It is still beyond me quite why they ever chose to use that resolution, it makes absolutely no sense at all.


well, broadcast high-def will be interlaced, and 1080 content which is delivered over the air will arrive as 1080i.

for someone who only watches stuff via an hd player, there will be resampling.

it sounds like it is an early model, as high-def players were too expensive for consumers a few years back, and tv manufacturers made tvs just for broadcast.
 
That doesn't make sense, 1080i is still 1920x1080 and will still need to be downscaled to 1366x768, broadcast or not :confused:
 
That doesn't make sense, 1080i is still 1920x1080 and will still need to be downscaled to 1366x768, broadcast or not :confused:

i was wrong. when the poster i quoted said he didn't know why they chose that resolution, i thought he meant 1080i, but i see now he meant 1366x768.

i thought his question was: why interlaced high-def tvs at all?
 
On a 37" screen you just won't see a difference.

I don't know why they make TVs with 768 lines as opposed to 720, but there are no negatives to more resolution, only advantages. It means less downsampling if the set accepts a 1080p signal, and if it only accepts 1080i/720p then upscaling only improves the image.
 
On a 37" screen you just won't see a difference.

I don't know why they make TVs with 768 lines as opposed to 720, but there are no negatives to more resolution, only advantages. It means less downsampling if the set accepts a 1080p signal, and if it only accepts 1080i/720p then upscaling only improves the image.

this is where you are going wrong. 768p means a 1080i/p source will either be scaled down once, or scaled down and then back up again. scaling twice doesn't improve an image at all. you can use fancy trickery to scale up, such as ffdshow on the pc, which does offer improvements over the scalers built into display panels, but you can only go so far. scaling a signal up will never match the quality of a signal at a higher native resolution. in the world of HD displays and video, 768p doesn't have any positive factors that i can see.
 
this is where you are going wrong. 768p means a 1080i/p source will either be scaled down once, or scaled down and then back up again. scaling twice doesn't improve an image at all. you can use fancy trickery to scale up, such as ffdshow on the pc, which does offer improvements over the scalers built into display panels, but you can only go so far. scaling a signal up will never match the quality of a signal at a higher native resolution. in the world of HD displays and video, 768p doesn't have any positive factors that i can see.

That's exactly what I said, except you're mistaken about 720>768 not offering an improvement, and I didn't say anything about it matching the quality of a signal at a higher resolution.
Unless a display has really disastrously awful scaling (some rather ancient displays do), upscaling is always advantageous and will provide a better image than if the panel had 720 lines.
 