TV Tuners - Signal Processing
Signal processing generally occurs after the analog-to-digital conversion. The baseline PVR card uses dual 9-bit ADCs, though some cards use dual 12-bit ADCs. Note that these are not the parameters of every PVR card, but rather a generalization of those on the market today. For those curious, there are two ADCs because one handles audio and the other handles video; if an A/V processor spec says there are 9-bit ADCs, you can assume that there are two of them.
Example of ADC differences
Hauppauge WinTV PVR-250 -> 001010101 -> 9 bits
NVIDIA GeForce FX 5600 Personal Cinema -> 0101010011 -> 10 bits
ATI All-in-Wonder Radeon 9600 Pro -> 010101010010 -> 12 bits
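To make the bit-depth comparison concrete, here is a toy quantizer in Python. The `quantize` function and the sample values are purely illustrative assumptions for this sketch, not the cards' actual conversion logic; the point is only that each extra bit doubles the number of output codes available to the video processor.

```python
# A minimal sketch of how ADC bit depth affects quantization resolution.
# The function and values here are illustrative, not real card behavior.

def quantize(sample, bits, v_max=1.0):
    """Map an analog sample in [0, v_max] to an integer code of `bits` bits."""
    levels = 2 ** bits  # number of discrete output codes
    code = min(int(sample / v_max * levels), levels - 1)
    return code

for bits in (9, 10, 12):
    # A 9-bit ADC has 512 codes; a 12-bit ADC has 4096 for the same range.
    print(bits, "bits ->", 2 ** bits, "levels, mid-scale code:", quantize(0.5, bits))
```

A half-scale input lands on code 256 at 9 bits but code 2048 at 12 bits, so the 12-bit converter resolves the same analog range eight times more finely.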
As the bit depth of the ADC increases, the obvious major benefit is improved signal clarity and quality. However, there is a side benefit as well: with the ADCs providing more information to the system, the video processor has more data to work with when cutting out line twitter and performing per-line luminance processing.
Generally speaking, interlaced video is made up of 60 fields per second (in NTSC). Each field contains half the lines of an image, so alternating fields complement one another to comprise a full picture.
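The interleaving described above can be sketched in a few lines of Python. The `weave` function and the string "scan lines" are illustrative stand-ins, not anything a real tuner runs; one field carries the even-numbered lines, the other the odd-numbered lines, and together they rebuild the full frame.

```python
# Sketch: two interlaced fields interleave to form one full frame.
# Field A carries the even-numbered scan lines, field B the odd-numbered ones.

def weave(field_a, field_b):
    """Interleave two half-height fields into a full-height frame."""
    frame = []
    for even_line, odd_line in zip(field_a, field_b):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

field_a = ["line0", "line2", "line4"]  # captured at time t
field_b = ["line1", "line3", "line5"]  # captured 1/60 s later
print(weave(field_a, field_b))
# ['line0', 'line1', 'line2', 'line3', 'line4', 'line5']
```

This only works cleanly when nothing moves between the two captures, which is exactly where the artifacts below come from.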
And if you don't already know, interlacing doesn't handle this job so well, because TV isn't made up of still images. People move around and interact with their background, which means that alternating fields don't complement each other cleanly in this setting. Moving images in an interlaced video environment generate interlacing artifacts, much like the tearing artifacts in a video game. These artifacts take on two distinct forms, both of which stem from the lines of one field being out of step with their complement (motion related).
Feathering (also known as "combing" or "sawtooth" artifacts) is the more noticeable of the two, and occurs when an object in an image moves (see the tomato example in this article). The problem arises when an object has moved between the capture of the first field and the capture of the next field 1/60th of a second later. In the multimedia industry, this is commonly known as "combing," in reference to the teeth of a hair comb. In a real-world broadcast television application, feathering might look like the picture of the bridge cable in this article.
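A feathering artifact can be simulated in a few lines. This is a toy ASCII rendering, not real video data: a bright block sits at one horizontal position when field 1 is captured and has shifted right by the time field 2 is captured, so the woven even and odd lines disagree and the object's edges take on the comb-tooth look.

```python
# Toy simulation of feathering: an object shifts horizontally between the
# two field captures (1/60 s apart), so woven even/odd lines disagree.

WIDTH = 12

def row(obj_start, obj_len=4):
    """Render one scan line with a bright block starting at obj_start."""
    return "".join("#" if obj_start <= x < obj_start + obj_len else "."
                   for x in range(WIDTH))

even_lines = [row(2) for _ in range(3)]  # field 1: object at x=2
odd_lines = [row(5) for _ in range(3)]   # field 2: object moved to x=5

for even, odd in zip(even_lines, odd_lines):
    print(even)  # the object's edges alternate between the two positions,
    print(odd)   # producing the serrated "comb teeth" pattern
```

On a still frame the block's left and right edges would be straight vertical lines; here every other line is offset by three pixels, which is exactly the sawtooth a paused interlaced frame shows.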
The other type of interlacing artifact, called line twitter or line flicker, is the more pervasive and annoying of the two, in the sense that there is less you can do about it. This artifact occurs when an object, or the edge of an object, is only a few scan lines wide or less. One example is someone firing a handgun in a scene on TV: as the gun kicks back on recoil and settles back into its original aim position, the space around the barrel flickers rapidly.
Conceptually speaking, this isn't limited to a thin object or an object's edge. Line twitter occurs wherever there is an abrupt change in brightness between two, or a few, horizontal scan lines (see the example of a skier in motion, where the ski pole "flickers" in and out of the image without a smooth transition). In addition, the larger your viewing space (the size of your TV screen), the more noticeable line twitter and feathering become, since the interlaced scan lines increase in width. Sitting further away from the screen also helps reduce the effect of either on your eyes.
Progressive scanning (i.e., HDTV resolutions like 480p and 720p) eliminates both of these artifacts in one sweep, since every line of the frame is refreshed at once. Effectively, each refresh delivers twice as many lines as an interlaced field does. So when we reach the point where HD digital tuner PVR cards are commonplace, much of this worry and patchwork of quick fixes won't apply. Until then, let's look at how current technology tackles this problem.