All color television systems use the principle of additive colors, with green, blue and red as primary colors. The precise colorimetry coordinates are set in relevant standards. Monochrome compatibility requires the generation and transmission of a full-bandwidth signal representing the brightness component of the televised scene. This component is called the “luminance.” The mathematical expression for the luminance signal is:
E′Y = 0.587 E′G + 0.114 E′B + 0.299 E′R, where
E′Y = The gamma-corrected voltage corresponding to the luminance information
E′G = The gamma-corrected voltage corresponding to the green information
E′B = The gamma-corrected voltage corresponding to the blue information
E′R = The gamma-corrected voltage corresponding to the red information
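The weighted sum above is easy to verify numerically. The following sketch (the function name and the normalized 0-to-1 signal levels are illustrative, not from any standard) shows that the coefficients sum to unity, so peak white maps to full luminance:

```python
def luminance(e_g, e_b, e_r):
    """E'Y = 0.587*E'G + 0.114*E'B + 0.299*E'R, inputs normalized 0..1."""
    return 0.587 * e_g + 0.114 * e_b + 0.299 * e_r

# The coefficients sum to exactly 1.0, so equal peak primaries give
# full luminance, while pure blue contributes only 11.4 percent of it.
print(luminance(1.0, 1.0, 1.0))  # 1.0
print(luminance(0.0, 1.0, 0.0))  # 0.114
```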
In a studio environment, the bandwidth of the luminance signal is restricted only by the state of the art of the equipment used. Normally, the bandwidth of the luminance signal generated by the camera is at least 8 MHz, equivalent to a horizontal resolution in excess of 600 lines per picture height (LPH).
The chrominance information is conveyed by two of the three primary signals minus the brightness component. These signals are known as the blue and the red color-difference signals. They are:
E′B - E′Y = -0.587 E′G + 0.886 E′B - 0.299 E′R
E′R - E′Y = -0.587 E′G - 0.114 E′B + 0.701 E′R
The E′G - E′Y signal can be recreated in the receiver by a suitable combination of the blue and red color-difference signals.
The color-difference signals are scaled in amplitude by suitable multiplication factors to avoid transmitter overloading. The NTSC scaled color-difference signals are:
E′B-Y = 0.493 (E′B - E′Y) and
E′R-Y = 0.877 (E′R - E′Y)
Component analog and digital standards use different scaling factors.
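The difference equations, the NTSC scaling factors, and the recreation of E′G - E′Y from the other two differences can be sketched together as follows (function names are illustrative):

```python
def ntsc_color_difference(e_g, e_b, e_r):
    """Scaled NTSC color-difference pair from gamma-corrected primaries."""
    e_y = 0.587 * e_g + 0.114 * e_b + 0.299 * e_r
    return 0.493 * (e_b - e_y), 0.877 * (e_r - e_y)

def g_minus_y(b_minus_y, r_minus_y):
    """Recreate E'G - E'Y from the unscaled blue and red differences.
    Because 0.587*(G-Y) + 0.114*(B-Y) + 0.299*(R-Y) = 0 identically:"""
    return -(0.114 * b_minus_y + 0.299 * r_minus_y) / 0.587
```

The identity in the second function follows directly from the luminance equation: the three weighted differences always sum to zero, so any one of them is recoverable from the other two.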
The NTSC system
The NTSC color-television system is a single-channel television concept. Luminance, chrominance and synchronization information are combined to be transmitted in a 6 MHz RF channel originally specified for monochrome transmissions.
The transmission of color takes advantage of the characteristics of monochrome video's spectrum. Essentially, the chrominance information is transmitted in the spectrum “holes” of the monochrome information. As described in the SMPTE 170M standard, the concept uses a wideband (4.2 MHz) luminance signal and two narrowband chrominance color-difference signals of equal bandwidth.
The color-difference signals may be B-Y and R-Y or I and Q, as in the original 1953 specifications of the NTSC system. The bandwidth of each of the color-difference signals may be 600 kHz or 1.3 MHz, depending on where they are used. The wider bandwidth is used within studio environments, where there is no significant bandwidth limitation. Transmission and reception, however, constrain the chrominance bandwidth to 600 kHz, and the remaining chrominance bandwidth is wasted.
Each of the scaled color-difference signals modulates a subcarrier. The two subcarriers are identical in frequency but differ in phase by 90°, so the original signals modulating the two carriers can be recovered without crosstalk. The two subcarriers are obtained from a common crystal oscillator. Each is amplitude-modulated with the carrier suppressed, so the scheme is referred to as suppressed-carrier quadrature amplitude modulation. Because the subcarrier is suppressed, only the sidebands appear at the output of the modulators, and the chrominance signal cancels completely when no colors are present.
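A numerical sketch of the quadrature scheme (plain Python with illustrative names; real encoders and decoders do this in analog circuitry or dedicated hardware) shows why the two signals come back without crosstalk, and why the chrominance output vanishes when both inputs are zero:

```python
import math

FSC = 3_579_545.0  # NTSC chrominance subcarrier, Hz

def modulate(b_y, r_y, t):
    """Suppressed-carrier QAM: two subcarriers 90 degrees apart.
    With both inputs at zero the output is identically zero."""
    w = 2 * math.pi * FSC
    return b_y * math.sin(w * t) + r_y * math.cos(w * t)

def demodulate(samples, times):
    """Synchronous detection: multiply by each regenerated carrier and
    average; quadrature makes the cross terms average to zero."""
    w = 2 * math.pi * FSC
    n = len(times)
    b = 2.0 * sum(s * math.sin(w * t) for s, t in zip(samples, times)) / n
    r = 2.0 * sum(s * math.cos(w * t) for s, t in zip(samples, times)) / n
    return b, r

# Eight samples per subcarrier cycle over 100 cycles:
times = [i / (8 * FSC) for i in range(800)]
chroma = [modulate(0.3, -0.5, t) for t in times]
print(demodulate(chroma, times))  # ~(0.3, -0.5), recovered without crosstalk
```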
The frequency of the chrominance subcarrier is an odd multiple of half the horizontal scanning frequency. This results in the interleaving of the luminance and chrominance spectra. The type of spectrum interleaving used in NTSC is called half-line offset. The frequency of the subcarrier is equal to
fSC = 455fH/2 = 3,579,545 ±10 Hz
This leads to slightly modified horizontal (15,734.26 Hz instead of the original 15,750 Hz) and vertical (59.94 Hz instead of the original 60 Hz) scanning frequencies. The chosen subcarrier frequency results in reduced visibility of the subcarrier sidebands on a monochrome receiver, at the cost of a potential 920 kHz beat between the color subcarrier and the audio carrier.
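The arithmetic linking the subcarrier choice to the modified scanning rates and the audio beat is short enough to verify directly (plain Python, using the figures from the text):

```python
FSC = 3_579_545.0      # color subcarrier, Hz
FH = 2 * FSC / 455     # half-line offset: fSC = 455 * fH / 2
FV = FH / 262.5        # 525 lines, 2:1 interlace -> 262.5 lines per field

print(round(FH, 2))            # 15734.26 Hz (vs. 15750 Hz monochrome)
print(round(FV, 2))            # 59.94 Hz (vs. 60 Hz monochrome)
print(round(4_500_000 - FSC))  # 920455 -> the "920 kHz" beat with the sound carrier
```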
Figure 1 on page 18 shows details of the NTSC frequency-division multiplexing of the luminance and chrominance spectra around the chrominance subcarrier.
Figure 2 shows a simplified block diagram of an NTSC encoder using B-Y and R-Y color difference signals. Green, blue and red signals are fed to a resistive matrix that algebraically combines percentages of these primary color signals to form the luminance (E′Y) signal and the two color-difference signals. Each of the color-difference signals is band-limited before being fed to the respective balanced modulators. A 3.58 MHz subcarrier feeds the B-Y modulator and, through a 90° phase-shift network, the R-Y modulator. The E'Y signal is delayed to compensate for the chrominance delay introduced by the color-difference low-pass filters. The adder combines the luminance, chrominance sidebands, composite (horizontal and vertical) sync and a 180° phase-shifted gated subcarrier burst into a composite color signal.
Figure 3 shows a phase-domain representation of the B-Y subcarrier (0°) and the R-Y subcarrier (+90°). A third subcarrier identifies the synchronizing burst (+180°).
Figure 4 shows a vector representation of the chrominance subcarrier modulation process. A given color, described by a given set of E′B-Y and E′R-Y signal values, is represented by two amplitude-modulated sub-carriers in phase quadrature. The instantaneous values of the two modulated subcarriers result in a vector described by its amplitude and phase angle with respect to the B-Y phase (0°). The vector amplitude represents the color saturation, and its phase angle represents the hue.
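The saturation and hue of the vector in Figure 4 can be computed from the scaled color-difference values. The sketch below (function name is illustrative) places saturated red at roughly 103.5° from the B-Y axis, the familiar vectorscope position:

```python
import math

def chroma_vector(e_g, e_b, e_r):
    """Amplitude (saturation) and phase angle (hue, in degrees from the
    B-Y axis) of the chrominance vector for one color."""
    e_y = 0.587 * e_g + 0.114 * e_b + 0.299 * e_r
    u = 0.493 * (e_b - e_y)   # B-Y axis, 0 degrees
    v = 0.877 * (e_r - e_y)   # R-Y axis, +90 degrees
    return math.hypot(u, v), math.degrees(math.atan2(v, u))

amp, hue = chroma_vector(0.0, 0.0, 1.0)  # saturated red
print(round(amp, 3), round(hue, 1))      # ~0.632 at ~103.5 degrees
```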
Figure 5 shows a 100/7.5/100/7.5 (100 percent) color-bar signal waveform resulting from the addition of luminance and chrominance components. A burst of nine cycles of frequency and phase reference subcarrier is transmitted during the back porch of the horizontal blanking interval. This reference signal is used to assist in the regeneration of the suppressed carrier required for the recovery of the B-Y and R-Y signals. Note that the peak positive signal excursion, for yellow and cyan colors, is 130.8 IRE, which is beyond the overload level of a television transmitter.
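The peak excursion cited above can be reproduced arithmetically to within a tenth of an IRE. The following sketch assumes the 100/7.5/100/7.5 convention of a 7.5 IRE setup pedestal and a 92.5 IRE range from setup to peak white:

```python
import math

SETUP = 7.5            # IRE pedestal (setup)
GAIN = 100.0 - SETUP   # 92.5 IRE from setup level to peak white

def bar_peak_ire(e_g, e_b, e_r):
    """Peak excursion of one 100 percent color bar: luminance level plus
    the amplitude of the scaled chrominance subcarrier."""
    e_y = 0.587 * e_g + 0.114 * e_b + 0.299 * e_r
    chroma = math.hypot(0.493 * (e_b - e_y), 0.877 * (e_r - e_y))
    return SETUP + GAIN * (e_y + chroma)

print(round(bar_peak_ire(1.0, 0.0, 1.0), 1))  # yellow: 130.9 IRE
print(round(bar_peak_ire(1.0, 1.0, 0.0), 1))  # cyan:   130.8 IRE
```

Both bars land just under 131 IRE, well beyond the 100 IRE transmitter overload point discussed next.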
Figure 6 shows the relationship between video signal level and percentage of video-carrier modulation. Television transmitter tests are carried out with a reduced amplitude color bar signal known as 75/7.5/75/7.5 (75 percent), whose maximum signal amplitudes do not exceed 100 IRE. Peak-amplitude green, blue and red primary signals will generate composite color signals equivalent to the 100 percent color bar signal. Because there are no highly saturated yellow and cyan colors in nature, the probability of transmitter overload under normal operating conditions is very low. Problems occur, however, with synthetic signal sources, such as character generators and graphic systems, which can create primary signals resulting in excessive-amplitude composite-color signals and lead to transmitter overload.
Video transmitter overload problems will affect not only the transmitted picture quality, but also the sound. The receiver recovers the sound by the intercarrier beat approach: the picture and sound carriers beat together to produce a 4.5 MHz intercarrier signal that is frequency-modulated by the audio and, undesirably, amplitude-modulated by the video. A bandpass filter extracts the 4.5 MHz signal, which is then treated as a frequency-modulated carrier: it is amplitude-limited to remove the video interference and FM-detected to recover the original audio information. Overmodulation of the video transmitter results in momentary cancellation of the video carrier, and with it the derived 4.5 MHz audio carrier.
Under extreme circumstances, this cancellation recurs at the horizontal and vertical scanning rates, producing the so-called intercarrier buzz effect. It can be avoided by carefully monitoring and controlling video-signal levels so that the transmitter is never overmodulated.
Michael Robin, former engineer with the Canadian Broadcasting Corp.'s engineering headquarters, is an independent broadcast consultant located in Montreal, Canada. He is co-author of Digital Television Fundamentals, published by McGraw-Hill.
Send questions and comments to: firstname.lastname@example.org