Interlace
From Wikipedia, the free encyclopedia.
- For the method of progressively displaying raster graphics, see Interlace (bitmaps).
Interlacing is a method of displaying images on a raster-scanned display device, such as a cathode ray tube (CRT), in which the display alternates between drawing the even-numbered lines and the odd-numbered lines of each frame. The word also refers to any method of storing or transmitting video in which the odd and even fields are separated.
Description
A frame rate of 24–30 Hz is sufficient for smooth motion, and transmitting any more would be a waste of radio bandwidth. Indeed, the early televisions of the 1920s contained CRTs which scanned at this low frequency, drawing every line of the screen progressively from top to bottom.
However, the image on a CRT decays very quickly, and scanning at such a low frequency produces noticeable flicker and tires the eyes quickly. A scan rate of at least 50 Hz is required for comfortable viewing, with 80 Hz or more being preferable. This mismatch could be solved by displaying the signal at a higher scan rate than its frame rate, but that would require a frame buffer, a method that did not become feasible until the 1980s.
Therefore, the interlacing method was adopted. Instead of displaying a sequence of frames, the screen displays a sequence of "fields". One field contains only the odd-numbered lines (forming the odd field), and the next contains only even-numbered lines (forming the even field). Because of persistence of vision, pairs of fields are perceived at the same time, giving the appearance of a full frame.
[Images: the odd field and the even field of a frame]
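As a minimal sketch of that split (Python with NumPy; the toy 8x8 array standing in for a frame is an illustrative assumption), extracting the two fields is simply a matter of taking every other line:

```python
import numpy as np

# Toy 8-line "frame": each pixel stores its row index so the split is easy to see.
frame = np.tile(np.arange(8)[:, None], (1, 8))

odd_field  = frame[0::2]   # TV lines 1, 3, 5, 7 (rows 0, 2, 4, 6 in 0-based indexing)
even_field = frame[1::2]   # TV lines 2, 4, 6, 8

print(odd_field[:, 0])     # [0 2 4 6]
print(even_field[:, 0])    # [1 3 5 7]
```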
Interlaced scanning at 60 fields per second normally gives considerably less flicker than progressive scanning at 30 frames per second. Its main disadvantage is a perceptible loss in vertical resolution, which is most obvious when displaying narrow horizontal patterns. If these become less than two lines wide, the odd and even fields will no longer be similar, and the pattern will flicker at one half of the field rate. This effect is called "twitter".
One common misconception is that a pair of odd and even fields represents a single frame. In fact, the camera scans using the same pattern as the CRT, reading even lines from its sensor only after it has finished reading the odd lines. Thus, in a 50 fields per second system, lines 122 and 124 are read approximately one fiftieth of a second after lines 123 and 125. If the odd and even fields were simply combined into a single progressive frame, any parts with horizontal motion would display visible "combing" on their edges.
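The combing effect can be reproduced with a short sketch: a toy scene containing a bright bar is captured twice, the bar moving between the odd-field and even-field capture instants, and the two fields are then naively woven back into one frame. (The scene, dimensions and motion below are illustrative assumptions, not taken from any standard.)

```python
import numpy as np

def capture(edge_x, height=8, width=16):
    """Full toy frame of a bright bar whose left edge sits at column edge_x."""
    frame = np.zeros((height, width), dtype=int)
    frame[:, edge_x:] = 1
    return frame

# The camera reads the odd field first; by the time it reads the even field
# (one field period later) the bar has moved four pixels to the right.
odd_field  = capture(edge_x=4)[0::2]
even_field = capture(edge_x=8)[1::2]

# Naive "weave": interleave the two fields back into a single frame.
woven = np.empty((8, 16), dtype=int)
woven[0::2] = odd_field
woven[1::2] = even_field

print(woven)   # adjacent lines disagree between columns 4 and 8: the "comb" on the moving edge
```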
The alternatives to interlacing (assuming a directly-driven CRT display) are:
- Doubling the bandwidth used and transmitting full frames instead of each field. This produces little improvement in picture quality, since the effective resolution and flicker rate are the same.
- Using the same bandwidth, but transmitting progressive frames with half the amount of detail. The flicker rate remains the same.
- Using the same bandwidth, but transmitting a full progressive frame instead of every two fields. The eye suffers more fatigue (eye-strain) than when viewing the interlaced display, because the flicker rate halves.
- As above, but using a digital frame buffer to display each frame twice. This provides the same flicker rate as the interlaced signal, but with less smooth motion.
In modern monitors and television sets, interlacing is being slowly superseded as the refresh rate of non-interlaced CRTs increases beyond the level at which flicker can be detected. Non-scanning display devices such as LCDs and plasma screens are also becoming common. To display an interlaced signal on any non-interlaced device without "combing", a computationally expensive deinterlacing process is required.
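The simplest deinterlacing method is line doubling, which rebuilds a full-height picture from a single field by repeating each field line. A rough sketch follows (the data and function name are illustrative, not a reference implementation):

```python
import numpy as np

def bob_deinterlace(field):
    """Line doubling: repeat each field line to rebuild a full-height picture.
    Better deinterlacers interpolate between lines or use motion compensation."""
    return np.repeat(field, 2, axis=0)

field = np.tile(np.arange(4)[:, None], (1, 8))   # a toy 4-line field of an 8-line frame
full  = bob_deinterlace(field)                   # 8 lines again, but adjacent pairs are identical
print(full[:, 0])                                # [0 0 1 1 2 2 3 3]
```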
Technical details
In an interlaced system, lines are drawn diagonally, such that the right end of each line is two lines lower than the left end. The offset between the two fields is produced by giving the picture an odd number of lines in total and by having the vertical flyback between the odd and even fields occur halfway through a line. For example, in PAL, the blanking period starts after 292.5 lines of the odd field have been transmitted and lasts for 20 lines. When scanning begins again at the top of the screen, the scanning beam is still halfway across the picture. Because of the slant, the centre top of the picture is one line above the line begun at the top left corner.
Interlacing is used by all the analogue TV broadcast systems in current use:
- PAL: 50 fields per second, 625 lines, even field drawn first
- SECAM: 50 fields per second, 625 lines
- NTSC: 59.94 fields per second, 525 lines, even field drawn first
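The horizontal scanning parameters follow directly from the figures above; a small back-of-the-envelope calculation (in Python, for illustration only):

```python
# Line-rate arithmetic for the two main line standards (illustrative only).
systems = {
    "PAL/SECAM": {"total_lines": 625, "field_rate": 50.0},
    "NTSC":      {"total_lines": 525, "field_rate": 60000.0 / 1001},  # ~59.94 fields/s
}

for name, s in systems.items():
    frame_rate      = s["field_rate"] / 2             # two fields make one frame
    lines_per_field = s["total_lines"] / 2             # odd line count -> 312.5 or 262.5 lines
    line_freq       = s["total_lines"] * frame_rate    # horizontal scan frequency in Hz
    print(f"{name}: {lines_per_field} lines per field, "
          f"{line_freq:.0f} lines per second, line period {1e6 / line_freq:.1f} microseconds")
```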
Interlacing as a data compression technique
When discussing television technology, it is important to note that it is used today to transmit two fundamentally different kinds of moving images: so-called film-based material, where the image of the scene is captured by a camera 24 times a second, and video-based material, where the image is captured 50 or ~60 times a second. The 50 and 60 Hz material captures motion very well and looks very fluid on the screen. The 24 Hz material, in principle, captures motion satisfactorily; however, because it is usually displayed at least at twice the capture rate in cinema and on CRT television (to avoid flicker), it is not considered capable of conveying "fluid" motion. It nevertheless continues to be used for filming movies because of the distinctive artistic impression that arises precisely from the slow image change rate.
For all practical purposes, 25 Hz material looks and feels the same as 24 Hz material. 30 Hz material lies between 24 and 50 Hz material in terms of the "fluidity" of the motion it captures, but it is handled in TV systems similarly to 24 Hz material (i.e. displayed at no less than twice the capture rate). (A detailed discussion of the formats is outside the scope of this article.)
This section deals specifically with the transmission of 50/60 Hz (video-based) material from the camera to the screen. Transmission of 24/25/30 Hz material has been described above.
For video-based material, interlacing is, in fact, an archaic lossy perceptual image compression technique. Half the lines are dropped from each captured full frame by the compressor, giving a fixed 2:1 compression ratio. The technique exploits the persistence of vision in human perception, so that the eye and brain together act as the decompressor.
Interlacing as compression can be applied both in the analogue and in the digital domain.
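Viewed this way, the "compressor" is simply the camera keeping only every other line of each captured frame, with the parity alternating from frame to frame; a minimal sketch of that bookkeeping (toy data and a hypothetical function name):

```python
import numpy as np

def interlace_compress(frames):
    """Keep only alternate lines of each captured full frame, alternating parity,
    so the transmitted data is half the size of the progressive original (2:1)."""
    return [frame[i % 2::2] for i, frame in enumerate(frames)]

frames = [np.full((8, 8), t) for t in range(4)]          # four toy progressive frames
fields = interlace_compress(frames)
ratio  = sum(f.size for f in fields) / sum(f.size for f in frames)
print(ratio)                                             # 0.5, i.e. a fixed 2:1 ratio
```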
For the purpose of reducing the bandwidth necessary to transmit video-based material, interlacing is inferior to modern digital block-based compression techniques for the following reasons:
- Interlacing performs poorly on moving images; sharp moving objects suffer especially.
- Any decompression done in the display (as opposed to perceptually by the viewer) is computationally very expensive, particularly when tasked with avoiding or suppressing noticeable artifacts. Simple and widely used line-doubling techniques produce artifacts that some viewers consider objectionable. All modern display technologies apart from the CRT require uncompressed progressive images.
- Interlaced video suffers badly when edited; alternatively, the selection of effects which do not lead to quality loss on recompression is very restricted.
- Progressive MPEG is flexible and adaptive about which details of the image it compresses and by how much, whereas compression by interlacing does not discriminate according to the perceptual complexity of the image element being compressed. Moreover, the quality of consumer-grade deinterlacers varies widely, while an MPEG decoder is fully deterministic in decompressing a progressively compressed stream.
Interlacing can be, and unfortunately is, combined with other compression techniques in the digital domain. The combination of interlacing with a block-based compression technique inherits all the drawbacks of interlacing while also reducing the efficiency of the block-based compression. Because interlacing samples every other line without prefiltering, it increases the amount of high-frequency components in the signal fed to the block transformation. This lowers the efficiency of the block transform (i.e. the DCT) or, alternatively, increases the amount of artifacts after decompression. It also decreases the effectiveness of the motion compensation used in interframe compression formats such as MPEG.
When vertical colour subsampling (also called decimation) is part of the combined compression system, interlacing effectively compresses the colour information further. Vertical colour subsampling is included in almost all digital and analogue television systems around the world (with the exception of broadcast NTSC and, subject to some controversy, broadcast PAL). Thus with 4:2:0 subsampling (i.e. half horizontal and half vertical chroma resolution) the vertical colour resolution drops from 1:2 to 1:4, and the overall colour resolution from 1:4 to 1:8.
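The arithmetic behind those figures can be sketched as simple bookkeeping (a 1080-line raster is assumed here, along with the argument above that the subsampling effectively acts within each field):

```python
# Bookkeeping sketch of vertical chroma resolution (1080 luma lines assumed).
luma_lines = 1080

# Progressive 4:2:0: half the vertical chroma resolution of the full frame.
progressive_chroma_lines = luma_lines // 2                     # 540, i.e. 1:2

# Interlaced 4:2:0, with subsampling acting within each 540-line field:
interlaced_chroma_lines_per_field = (luma_lines // 2) // 2     # 270, i.e. 1:4 of the frame

print(progressive_chroma_lines, interlaced_chroma_lines_per_field)   # 540 270
```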
It is sometimes claimed that combining MPEG compression with interlacing reduces the amount of processing power required from the MPEG decoder by almost half. However, this argument does not stand when faced with the immense processing power needed for unobjectionable deinterlacing of the image after the MPEG decompressor; and all modern displays except the (gradually disappearing) CRT require a progressive image as input.
Another argument is that combining interlacing with MPEG raises the overall sweet spot of the compression system (note, though, that the sweet spot does not come close to doubling, because of the inefficiencies described above). Specifically, it makes it possible to transmit 1920x1080 60 Hz video over the broadcasting bit pipe chosen for the ATSC system. However, essentially the same effect on the sweet spot, without the drawbacks of interlacing, could be achieved by simply prefiltering the high frequencies out before applying progressive MPEG compression, or, less efficiently, by filtering high-frequency components out of the compressed MPEG stream just before injecting it into the broadcasting pipe. On the other hand, most DVB flavours (T, S) already offer a suitable bit pipe today, and a better terrestrial broadcasting technology could have been selected for ATSC as well.
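For comparison, the vertical prefiltering mentioned above can be as simple as a [1, 2, 1]/4 low-pass applied to each progressive frame before encoding; a rough sketch (the filter choice and edge handling are illustrative assumptions, not part of any standard):

```python
import numpy as np

def vertical_prefilter(frame):
    """Simple [1, 2, 1]/4 vertical low-pass: softens fine vertical detail so a
    progressive encoder spends fewer bits on high frequencies (edges wrap for brevity)."""
    up, down = np.roll(frame, 1, axis=0), np.roll(frame, -1, axis=0)
    return (up + 2 * frame + down) / 4.0

frame    = np.random.rand(8, 8)          # stand-in for a captured progressive frame
softened = vertical_prefilter(frame)     # would then be fed to a progressive MPEG encoder
```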
Yet another argument is the concern about the roughly twofold increase in the technological complexity of cameras and production equipment for progressive video, owing to twice the uncompressed bit rate at the moment of capture and twice the computational power needed for processing and compression. This argument grows weaker with every advance in technology, and television producers are not especially sensitive to such capital costs; yet if interlacing is used today, many works of art and television recordings of important events will remain permanently spoiled by it.
Despite these arguments, and calls by many prominent technology companies, such as Microsoft, to leave interlacing to history, interlacing maintains a strong grip on the television standard-setting bodies, and it is still included in new digital video transmission formats, such as DV, DVB (including its HD variants), and ATSC, for the purpose of compressing video-based material.
Interlacing and digital compression
Technology moves forward, but television standards, as well as the works of art created or edited to fit them, are here to stay for a long time. Because interlaced devices and signals remain widespread, even modern digital video compression techniques have been designed to accommodate them.
Digital compression deals with interlaced video by processing each field separately, encoding it relative to other fields of both parities. However, compared to encoding progressive video, the reduction in vertical resolution results in more high-frequency components in the signal, reducing the efficiency of block transformation and motion compensation.
See also
- Progressive scan: the opposite of interlacing; the image is displayed line by line.
- Deinterlacing: converting an interlaced video signal into a non-interlaced one
- Telecine: a method for converting film frame rates to television frame rates using interlacing
- Federal Standard 1037C: defines Interlaced scanning
- Interleaving
External links
- 100FPS.COM - Video Interlacing/Deinterlacing