720p better than 1080i

Won't this depend on whether your TV or STB is better at deinterlacing?

To an extent, but if it's a 1080p TV and you choose 720p on the STB you're going to be deinterlacing, scaling down to 720p, sending the signal out and then scaling it back up to 1080p once it hits the set. More efficient to send it out at 1080i to the set and do any postprocessing there.
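
Just to illustrate the cost of that extra scaling round trip, here's a toy sketch in Python (a random test pattern, not a real broadcast chain):

```python
# Toy illustration of the extra 1080 -> 720 -> 1080 scaling round trip.
# Uses a random test pattern, purely to show that fine detail is lost.
import numpy as np
from PIL import Image

rng = np.random.default_rng(0)
frame_1080 = Image.fromarray(rng.integers(0, 256, (1080, 1920), dtype=np.uint8))

# STB set to 720p: the box downscales, then the 1080p panel upscales it again.
via_720 = frame_1080.resize((1280, 720), Image.LANCZOS).resize((1920, 1080), Image.LANCZOS)

diff = np.abs(np.asarray(frame_1080, dtype=np.int16) - np.asarray(via_720, dtype=np.int16))
print("mean absolute error after the round trip:", diff.mean())
```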
 
Only if you're doing it wrong. :)

I'd say a snooker ball is very fast moving, so why didn't I see a single comb artifact during the whole World Championship on Eurosport HD?

I'd say Top Gear is very fast moving, so why haven't I seen a single comb artifact on BBC HD?

The answer is, the TV is recognising that it's an interlaced signal coming from the V+HD box, and is displaying it properly.

If you play an interlaced file over a progressive signal (e.g. a monitor connected to a PC) then you will need to enable software de-interlacing in the video player. This is because the monitor/TV won't know the file is interlaced, since the signal it is receiving is progressive. That's when you will get artifacts.
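
For example, on a PC you could pre-deinterlace the file yourself rather than rely on the player. A minimal sketch using ffmpeg's yadif filter from Python, assuming ffmpeg is installed and on PATH (the file names are just placeholders):

```python
# Minimal sketch: pre-deinterlace a recording with ffmpeg's yadif filter so a
# progressive display (e.g. a PC monitor) never sees raw fields.
# Assumes ffmpeg is installed; file names are placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg", "-i", "interlaced_recording.ts",
        "-vf", "yadif=mode=1",  # mode=1 outputs one frame per field (50i -> 50p)
        "-c:a", "copy",         # leave the audio untouched
        "deinterlaced_output.mp4",
    ],
    check=True,
)
```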

That's because your TV or STB is de-interlacing it, like you said. Only CRT displays can show interlaced content natively and get the full benefit of it.

Depending on the method it uses, the fields are blended in some way to make a progressive full frame. The perceived frame rate will also depend on how it is de-interlaced (50fps or 25fps).

Technically, there is more data in a 1080i field than a 720p frame, but the 1080i field is only half a frame, so half the data is missing. When it is displayed it will be de-interlaced, and the quality of the de-interlacing depends on the equipment. Motion-adaptive methods can estimate the missing lines, but they are processor intensive, so the fields are most likely just bobbed or blended in some way. You most likely won't see any combing because it will have been blurred away when de-interlaced. Ultimately, this means 1080i will just appear more blurry than 720p vertically, but will have more horizontal detail.
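
To make that concrete, here's a rough numpy sketch of the two cheap approaches (blend vs. bob) on synthetic fields, plus the raw sample counts being compared; the data is made up purely for illustration:

```python
# Rough sketch of cheap de-interlacing on synthetic luma fields.
# The field contents are random noise, purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
top_field = rng.integers(0, 256, (540, 1920)).astype(np.float32)     # odd lines of a 1080i frame
bottom_field = rng.integers(0, 256, (540, 1920)).astype(np.float32)  # even lines, 1/50 s later

# Blend: average the fields, then line-double -> one 25 fps frame.
# Motion between the two fields becomes blur/ghosting instead of combing.
blended_frame = np.repeat((top_field + bottom_field) / 2, 2, axis=0)

# Bob: line-double each field on its own -> 50 fps, but half the vertical detail.
bob_frames = [np.repeat(f, 2, axis=0) for f in (top_field, bottom_field)]

print("samples per 1080i field:", 1920 * 540)  # 1,036,800
print("samples per 720p frame: ", 1280 * 720)  #   921,600
```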

Whether you should use 720p over 1080i varies depending on your equipment. :) If the signals were displayed at their native resolution with no scaling or post-processing then 720p would appear a bit sharper. Personally, if the video signal is sent as 1080i then I would display it as such, and vice versa, to avoid any scaling of the image. They should probably use 1080p anyway, because 1080i doesn't offer any real bandwidth benefit once the video is compressed. Maybe they're holding 1080p back to extend product longevity, even though it could already be used.

Another thing to consider is that some stations use a reduced horizontal resolution to save bandwidth. Whether they do it with the HD channels I don't know, but it's quite prevalent on SD channels: PAL is 720x576 but most of the channels only use 544x576.
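
The rough saving from that trick, just as back-of-the-envelope arithmetic:

```python
# Back-of-the-envelope arithmetic for the horizontal-resolution trick above.
full = 720 * 576      # nominal PAL luma samples per frame
reduced = 544 * 576   # what many SD channels actually transmit
print(f"{reduced / full:.0%} of the nominal samples")  # ~76%, i.e. roughly a quarter fewer samples to encode
```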
 

What you are describing is technically correct when comparing old-school 1080i50 vs 720p50, but my understanding these days is that content is now encoded before transmission as 25 full progressive frames, sent as 50 interlaced fields. I believe the term is PsF (progressive segmented frames); this was to allow higher efficiency with modern codecs, but I may have been misled. Most of my information comes from STB engineers and from work done when I was an engineer for Toshiba.

If they are using classic interlacing with 50 temporally distinct fields then modern TVs are superb at deinterlacing, and they seem to give a good impression of retaining a full frame of detail with no perceptible temporal artefacting!
 

Yeah, I do believe that pretty much everything uses PsF now. :) This is probably why companies like Microsoft want interlacing gone: it's redundant apart from CRT legacy support. I think it was introduced because CRT displays had to run at a high refresh rate to prevent flicker and there wasn't sufficient bandwidth to broadcast 50 full frames a second. Since that isn't needed anymore there's no reason to keep using it. They may have introduced 1080i mainly because there are some HD CRT displays out there, but I dunno.

Some stuff on TV is still properly interlaced material, like old shows and live feeds, but it'll probably all go soon, especially with HD content. In the case of 1080i/50, when the content is PsF it'll be equal in resolution to 1080p/25. So 1080i/50 basically IS 1080p/25 most of the time. :p
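
To illustrate why PsF carried in 1080i/50 is effectively 1080p/25, here's a small numpy sketch with made-up data: splitting a progressive frame into two fields and weaving them back recovers it exactly, because both fields come from the same instant.

```python
# Sketch of PsF: one progressive 1080p/25 frame split into two fields for
# transmission as 1080i/50, then woven back at the display (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
progressive_frame = rng.integers(0, 256, (1080, 1920), dtype=np.uint8)

# "Segment" the frame into a top and bottom field.
top_field = progressive_frame[0::2]
bottom_field = progressive_frame[1::2]

# Weaving the field pair back is lossless, because there is no motion
# between the two fields: they come from the same moment in time.
rebuilt = np.empty_like(progressive_frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field

print(np.array_equal(rebuilt, progressive_frame))  # True
```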
 