There's a lot of talk at the moment about 1080p compatibility and the fact that current TVs don't support this standard. It's true that current HDTVs can't accept a 1080p input, but this really shouldn't be anything to worry about. As I mentioned on the previous page, if you feed an LCD or plasma TV a 1080i signal, it will de-interlace that signal and present you with what is essentially a 1080p picture.
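To see why a de-interlaced 1080i signal can amount to a 1080p picture, here's a minimal sketch (assuming NumPy) of the simplest de-interlacing method, a "weave": the 1080i signal delivers two 540-line fields, and interleaving their lines rebuilds a full 1,080-line progressive frame. This is illustrative only; real TVs use more sophisticated motion-adaptive de-interlacing.

```python
import numpy as np

def weave(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Interleave a 540-line top field and bottom field into a 1080-line frame."""
    lines, width = top_field.shape
    frame = np.empty((lines * 2, width), dtype=top_field.dtype)
    frame[0::2] = top_field      # lines 0, 2, 4, ... come from the top field
    frame[1::2] = bottom_field   # lines 1, 3, 5, ... come from the bottom field
    return frame

# Two dummy 540-line fields stand in for one interlaced frame's worth of video.
top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.ones((540, 1920), dtype=np.uint8)
print(weave(top, bottom).shape)  # (1080, 1920)
```

Weaving works perfectly when both fields come from the same instant in time (as with film-sourced material); for video-sourced 1080i, motion between fields is what the TV's fancier de-interlacing has to compensate for.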
Theoretically, de-interlaced 1080i footage should look identical to native 1080p footage, as long as no post-processing steps that could degrade quality are applied during the original conversion. It's also worth remembering that even movies shot digitally in 1080p at 24fps will need to undergo some conversion to meet the 25fps PAL standard or the 30fps NTSC standard.
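For the curious, the two conversions mentioned above work quite differently, and a little arithmetic makes that concrete. PAL simply plays the 24fps film at 25fps (a roughly 4% speed-up), while NTSC uses "3:2 pulldown", holding alternate film frames for three and two interlaced fields to stretch 24 frames into 60 fields per second. The sketch below is illustrative arithmetic, not anything a TV literally runs:

```python
FILM_FPS = 24

# PAL speed-up: the film is played ~4% fast, so a 2-hour movie ends early.
pal_speedup = 25 / FILM_FPS - 1
print(f"PAL speed-up: {pal_speedup:.1%}")  # 4.2%

# NTSC 3:2 pulldown: alternate film frames are held for 3 and 2 fields.
def pulldown_fields(film_frames: int) -> int:
    """Count the interlaced fields produced by 3:2 pulldown."""
    return sum(3 if i % 2 == 0 else 2 for i in range(film_frames))

print(pulldown_fields(24))  # 24 film frames -> 60 fields, one NTSC second
```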
There are also the issues of bandwidth and processing power. A 1080p 50Hz signal uses significantly more bandwidth than a 1080i 50Hz signal. Of course, you could reduce the bandwidth by using a high-compression codec like H.264, but then you'd need some pretty beefy hardware to decode the video at the receiving end. This makes it highly unlikely that we'll see 1080p content from broadcasters, although that doesn't mean it won't become available on distributed media like Blu-ray or HD DVD.
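A back-of-envelope calculation shows just how big the gap is. Assuming 8-bit 4:2:0 sampling (1.5 bytes per pixel) and ignoring blanking intervals, 1080p50 carries 50 full 1,080-line frames per second versus 1080i50's 50 half-height fields, so the raw rate exactly doubles. These are uncompressed figures, before any codec like H.264 is applied:

```python
BYTES_PER_PIXEL = 1.5  # 8-bit 4:2:0 sampling, an illustrative assumption
WIDTH, HEIGHT = 1920, 1080

def raw_mbps(lines_per_second: int) -> float:
    """Uncompressed video data rate in megabits per second."""
    return WIDTH * lines_per_second * BYTES_PER_PIXEL * 8 / 1e6

p50 = raw_mbps(HEIGHT * 50)        # 50 progressive frames/s, 1080 lines each
i50 = raw_mbps(HEIGHT // 2 * 50)   # 50 interlaced fields/s, 540 lines each
print(f"1080p50 raw: {p50:.0f} Mbit/s, 1080i50 raw: {i50:.0f} Mbit/s")
```

Roughly 1,244 Mbit/s against 622 Mbit/s uncompressed; even after heavy compression, that two-to-one ratio is why broadcasters baulk at 1080p.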
Ultimately though, as more and more movies are shot digitally in 1080p (Star Wars Episode III: Revenge of the Sith was shot in 1080p24), the demand for source devices that output 1080p and HDTVs that accept 1080p natively will grow. Whether this should sway your buying decision now is debatable; in the end it depends on how long you're willing to sit on the fence.