What about 1080p?

There’s a lot of talk going round at the moment about 1080p compatibility and the fact that current TVs don’t support this standard. It’s true that current HDTVs won’t accept a 1080p input, but this really shouldn’t be anything to worry about. As I mentioned on the previous page, if you feed an LCD or plasma TV a 1080i signal, it will de-interlace that signal and present you with what is essentially a 1080p picture.
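
To show why that works, here’s a minimal sketch of a “weave” de-interlace in Python with NumPy. It’s purely illustrative rather than how any particular TV’s scaler is implemented: the two 540-line fields of a film-sourced 1080i frame are simply interleaved back into one full 1,080-line progressive frame, and the round trip loses nothing.

```python
import numpy as np

HEIGHT, WIDTH = 1080, 1920

def weave_deinterlace(top_field, bottom_field):
    """Interleave two 540-line fields back into a 1,080-line progressive frame.

    For film-sourced material both fields come from the same instant in time,
    so simply weaving them together recovers the original progressive frame.
    """
    frame = np.empty((HEIGHT, WIDTH), dtype=top_field.dtype)
    frame[0::2] = top_field      # lines 0, 2, 4, ... from the top field
    frame[1::2] = bottom_field   # lines 1, 3, 5, ... from the bottom field
    return frame

# Split a progressive frame into its two fields, then weave them back together.
original = np.random.randint(0, 256, (HEIGHT, WIDTH), dtype=np.uint8)
rebuilt = weave_deinterlace(original[0::2], original[1::2])
assert np.array_equal(original, rebuilt)  # the round trip is lossless
```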

Theoretically, de-interlaced 1080i footage should look identical to native 1080p footage, provided no quality-degrading post-processing is applied when the material is originally converted to interlaced form. With film-sourced content, both fields of each interlaced frame are captured at the same instant, so the TV can weave them straight back into the full progressive frame, as the sketch above shows. It’s also worth remembering that even movies shot digitally in a 1080p 24fps format will need to undergo some conversion to meet the 25fps PAL standard or the 30fps NTSC standard.
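
For a feel for what that conversion involves, here’s a rough sketch of the arithmetic behind the two usual approaches: PAL simply speeds the film up by roughly four per cent, while NTSC uses 3:2 pulldown to spread each group of four film frames across ten interlaced fields. The figures are illustrative only; real NTSC actually runs at 29.97fps.

```python
FILM_FPS = 24

# PAL conversion: run the 24fps film about 4% fast so it hits exactly 25fps.
pal_speedup = 25 / FILM_FPS - 1
print(f"PAL speed-up: {pal_speedup:.1%}")            # 4.2% faster, so slightly shorter

# NTSC conversion: 3:2 pulldown. Each group of four film frames (A, B, C, D)
# is spread across ten interlaced fields, i.e. five interlaced frames.
pulldown_fields = ["A", "A", "B", "B", "B", "C", "C", "D", "D", "D"]
groups_per_second = FILM_FPS / 4                              # six groups of four frames
fields_per_second = groups_per_second * len(pulldown_fields)  # 60 fields per second
ntsc_fps = fields_per_second / 2                              # 30 interlaced frames
print(f"NTSC output: {ntsc_fps:.0f}fps interlaced")           # 30fps (29.97 in practice)
```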


There are also the issues of bandwidth and processing power. A 1080p 50Hz signal carries a full 1,080 lines fifty times a second rather than 540-line fields, so it uses roughly twice the raw bandwidth of a 1080i 50Hz signal, as the figures below illustrate. Of course you could reduce the bandwidth by using a highly efficient codec like H.264, but then you’re going to need some pretty beefy hardware to decode the video once you receive it. This makes it highly unlikely that we’ll see 1080p content from broadcasters any time soon, although that doesn’t mean it won’t become available via distributed media like Blu-ray or HD DVD.
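
To put a rough number on that, the snippet below compares the raw, uncompressed data rates of 1080p50 and 1080i50, assuming 8-bit 4:2:2 sampling (two bytes per pixel on average). These are back-of-the-envelope figures only; any real broadcast or disc signal is compressed.

```python
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 2  # 8-bit 4:2:2: luma plus shared chroma averages 2 bytes per pixel

def gbits_per_second(lines_per_picture, pictures_per_second):
    """Raw data rate in Gbit/s for uncompressed video at the given geometry."""
    bits = WIDTH * lines_per_picture * BYTES_PER_PIXEL * 8 * pictures_per_second
    return bits / 1e9

p50 = gbits_per_second(HEIGHT, 50)        # fifty full 1,080-line frames per second
i50 = gbits_per_second(HEIGHT // 2, 50)   # fifty half-height (540-line) fields per second

print(f"1080p50: {p50:.2f} Gbit/s")       # roughly 1.66 Gbit/s
print(f"1080i50: {i50:.2f} Gbit/s")       # roughly 0.83 Gbit/s, half the raw rate
```

Halving the number of lines sent per picture halves the raw data rate, which is precisely the saving that interlacing buys.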

Ultimately though, as more and more movies are shot digitally in 1080p (Star Wars Episode III: Revenge of the Sith was shot in 1080p24), the demand for source devices that output 1080p and HDTVs that accept 1080p natively will grow. Whether this should sway your buying decision now is debatable; in the end it depends on how long you’re willing to sit on the fence.
