That's because your TV or STB is de-interlacing it, as you said. Only CRT displays can show interlaced content natively and take full advantage of it.
Depending on the method used, the fields are combined in some way to make a full progressive frame. The perceived frame rate also depends on how it is de-interlaced (50fps or 25fps).
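As a rough illustration (not what any particular TV or STB actually does), here's a minimal NumPy sketch of two common simple approaches: bob, which stretches each field into its own frame and so gives 50fps, and weave-plus-blur blending, which merges both fields into a single 25fps frame. The field sizes and the `bob`/`blend` helpers are assumptions for the example, not a real device's pipeline:

```python
import numpy as np

def bob(top, bottom):
    """Bob de-interlacing: stretch each 540-line field into a full frame,
    producing one frame per field (50fps)."""
    frame_a = np.repeat(top, 2, axis=0)      # frame built from the top field
    frame_b = np.repeat(bottom, 2, axis=0)   # frame built from the bottom field
    return frame_a, frame_b

def blend(top, bottom):
    """Blend de-interlacing: weave the two fields into one frame, then
    average vertically to hide combing, producing one frame per field pair (25fps)."""
    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=np.float32)
    frame[0::2] = top       # lines from the top field
    frame[1::2] = bottom    # lines from the bottom field
    # Vertical averaging hides comb artefacts but softens the picture.
    return (frame + np.roll(frame, 1, axis=0)) / 2

# Two 1080i fields: 540 lines each, 1920 samples per line.
top_field = np.random.rand(540, 1920).astype(np.float32)
bottom_field = np.random.rand(540, 1920).astype(np.float32)

print([f.shape for f in bob(top_field, bottom_field)])  # two 1080-line frames -> 50fps
print(blend(top_field, bottom_field).shape)             # one 1080-line frame  -> 25fps
```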
Technically, a 1080i field (1920x540) contains slightly more data than a 720p frame (1280x720), but a field is only half a frame, so half of the vertical information is missing. When it is displayed it will be de-interlaced, and the quality of that de-interlacing depends on the equipment. Motion-adaptive or motion-compensated methods can try to reconstruct the missing lines, but they are processor intensive, so the fields are most likely just bobbed or blended in some way. You most likely won't see any combing because it will have been blurred away during de-interlacing. Ultimately, this means 1080i will appear a bit softer than 720p vertically but will have more horizontal detail.
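To put numbers on that, here's a quick back-of-the-envelope comparison of the sample counts (luma samples only, just for scale):

```python
field_1080i = 1920 * 540        # one interlaced field
frame_720p  = 1280 * 720        # one progressive 720p frame
frame_1080  = 1920 * 1080       # a full 1080-line frame

print(field_1080i)               # 1,036,800 samples per field
print(frame_720p)                #   921,600 samples per frame
print(field_1080i / frame_1080)  # 0.5 -> each field carries half the frame
```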
Whether you should use 720p or 1080i depends on your equipment.

If the signals were displayed at their native resolution, pixel for pixel with no scaling or post-processing, then 720p would look a bit sharper. Personally, if the video signal is sent as 1080i then I would display it as such, and vice versa, to avoid any scaling of the image. Broadcasters should probably use 1080p anyway, because there is no real bandwidth benefit to 1080i over 1080p once the video is compressed. Maybe they're holding 1080p back to extend product longevity, even though it could already be used.
Another thing to consider is that some stations use a reduced horizontal resolution to save bandwidth. Whether they do it on the HD channels I don't know, but it's quite prevalent on SD channels: PAL is 720x576, yet most of the channels only use 544x576.
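For a sense of how much that saves, compare the luma sample counts per frame (assuming the broadcaster simply drops horizontal resolution and nothing else changes):

```python
full_pal    = 720 * 576   # 414,720 samples per frame
reduced_pal = 544 * 576   # 313,344 samples per frame

print(1 - reduced_pal / full_pal)  # ~0.24, i.e. roughly a quarter fewer samples to encode
```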