Ericuse165 asked the HDTV & Home Theater forum about the "little black lines and spots" that appear on his HDTV screen while watching fast-moving sports action. He wondered if the problem was in the connection between his set-top box and his television.
The problem sounds like interlacing, which is often part of the signal as it comes into your house. It's also possible, however, that your set-top box is adding interlacing to a progressive (non-interlaced) signal. The right setting may be able to fix that.
Each frame of an interlaced signal is split into two fields, each containing only half of the image's horizontal lines: one field carries the odd-numbered lines, and the next carries the even-numbered lines. Interlacing has been part of television since the dawn of the medium.
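To make the idea concrete, here's a rough sketch in Python (the function names are mine, not from any real video library): a frame is treated as a list of rows, split into alternating fields, and then recombined with simple "weave" de-interlacing.

```python
# Illustrative sketch only: a "frame" is a list of rows, and interlacing
# transmits alternating fields of even- and odd-numbered rows.

def split_fields(frame):
    """Split a full frame into its two interlaced fields."""
    even_field = frame[0::2]   # rows 0, 2, 4, ... (even-numbered lines)
    odd_field = frame[1::2]    # rows 1, 3, 5, ... (odd-numbered lines)
    return even_field, odd_field

def weave(even_field, odd_field):
    """Recombine two fields into one frame (simple 'weave' de-interlacing)."""
    frame = []
    for even_row, odd_row in zip(even_field, odd_field):
        frame.append(even_row)
        frame.append(odd_row)
    return frame

frame = ["line0", "line1", "line2", "line3"]
even, odd = split_fields(frame)
assert weave(even, odd) == frame   # a static image reconstructs perfectly
```

For a static image, weaving the two fields back together reproduces the original frame exactly. With fast motion, though, the two fields were captured at slightly different moments, so weaving them produces the jagged "combing" artifacts that look like little black lines on moving edges.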
We currently have two standard HDTV resolutions for broadcast, cable, and satellite television: 1080i and 720p. 1080i uses the full resolution of today's televisions, but it's interlaced. 720p doesn't have interlace problems, but it has less than half your HDTV's probable resolution. As a general rule, 1080i looks better for static or slow-moving images, while 720p does a better job with fast motion.
For that reason, a great many sports programs are broadcast in 720p. But not all of them.
(A 1080p signal offers the best of both worlds, but there are few 1080p sources outside of Blu-ray discs.)
Neither of these broadcast formats displays natively on today's 1080p HDTVs. Before they hit the screen, 720p signals must be upconverted to 1080p, and 1080i signals must be de-interlaced. Both of these conversions can be done either by the HDTV itself or by the set-top box (or by the DVR, which is basically a set-top box with a hard drive).
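The upconversion step can be sketched in a few lines of Python. Real scalers use far better filtering (bicubic or motion-adaptive interpolation); this hypothetical nearest-neighbor version only shows the basic idea of mapping 720 source lines onto 1080 output lines.

```python
# Illustrative sketch only: vertical upconversion by nearest-neighbor mapping.
# Real video scalers interpolate between lines rather than repeating them.

def upconvert(rows, target_lines=1080):
    """Map each of target_lines output rows back to the nearest source row."""
    src = len(rows)
    return [rows[i * src // target_lines] for i in range(target_lines)]

field_720 = [f"line{i}" for i in range(720)]
frame_1080 = upconvert(field_720)
assert len(frame_1080) == 1080   # 720 source lines stretched to 1080
```

Because 720 doesn't divide evenly into 1080, every other source line ends up repeated, which is one reason the quality of the scaler matters.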
And therein lies the problem: If your box converts 720p to 1080i or vice versa, it's degrading the image. Besides, your HDTV probably does a better job at converting the images than does your set-top box or DVR.
Changing how the box outputs its video signal can help. I don't know which settings are available on your particular set-top box or DVR, but here are some common options and what each does to the signal:
Native resolution: The box sends the signal to the HDTV without processing it, letting the HDTV do the upconverting and de-interlacing. If your box offers this option, it's probably your best bet.
1080p: The box de-interlaces 1080i signals and upconverts 720p signals. This is probably your best bet if the box doesn't offer native resolution, or if you're not satisfied with the native resolution results. Note that you'll need an HDMI connection for this option to work.
1080i: This is a pretty serious compromise if you're watching stations that broadcast in 720p, because it will add interlacing to a signal that otherwise doesn't have it. It's fine for a 1080i signal, of course, since it sends the signal unchanged to the HDTV.
720p: The reverse of the above. Your 720p broadcasts are sent unchanged to be upconverted by the HDTV, but your 1080i images lose more than half their pixels, and they get de-interlacing that's probably inferior to what your TV would provide.
If your box doesn't offer either of the first two settings, pick the option that appears, in your judgment, to do the least harm.
My thanks to Waldojim who, in the original forum discussion, identified Eric's problem as interlacing. He also provided a very good description of interlacing which I tried hard not to steal when I wrote my own.
Contributing Editor Lincoln Spector writes about technology and cinema. Email your tech questions to him at email@example.com, or post them to a community of helpful folks on the PCW Answer Line forum.
This story, "The HDTV Interlace Problem" was originally published by PCWorld.