Dear Mr. Robin:
I work for Jordan Radio and Television and have a question about digital compression from one of your earlier articles in Broadcast Engineering. Is the DCT process considered a lossless technique?
Also, I have a subscription to Broadcast Engineering and I read your articles, but unfortunately, I have missed some of them. Could you please help me get back issues?
Eng Saleh Zubi
Jordan Radio and Television
Michael Robin responds:
The DCT is essentially a lossless mathematical process. The zigzag readout of the 8x8 block is also lossless. The requantizer (REQ), which follows the zigzag scanning of the 8x8 block, is the lossy process. Essentially, the high-frequency coefficients are more coarsely coded than the low-frequency coefficients. They are divided by a value “n” > 1, and the result is rounded to the nearest integer. “n” varies according to the position of the coefficient in the block, with higher frequencies attracting larger values. The result is that low frequencies, generated by large uniform areas of the picture, where the eye is most sensitive, are finely quantized. Higher frequencies, generated by fine picture details, where the eye is less sensitive, are more coarsely quantized, since the resulting distortions and accompanying quantizing noise are less visible.
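The divide-and-round step can be sketched in a few lines. This is only an illustration of the principle: the step-size matrix below is invented for the example and is not an actual quantization table from any standard.

```python
import numpy as np

# Hypothetical step-size matrix: "n" grows with distance from the
# top-left (low-frequency) corner, so higher frequencies are coded
# more coarsely. Values are illustrative, not a real MPEG/JPEG table.
n = 2 + np.add.outer(np.arange(8), np.arange(8))  # n ranges 2..16

# A made-up 8x8 block of DCT coefficients.
coeffs = np.random.default_rng(0).integers(-100, 100, size=(8, 8))

quantized = np.round(coeffs / n).astype(int)  # lossy: rounding discards detail
reconstructed = quantized * n                 # what the decoder recovers

# The rounding error per coefficient is bounded by n/2, so it is small
# at low frequencies (small n) and larger at high frequencies (large n).
error = np.abs(coeffs - reconstructed)
assert (error <= n / 2).all()
```

Because the decoder can only multiply the rounded value back by “n,” the fractional part lost in the rounding is gone for good; this is where the irreversibility enters the chain.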
The requantizer is followed by the variable length coder (VLC), which assigns more bits to rarely occurring values and fewer bits to frequently occurring values. The VLC is followed by the run length coder (RLC), which sends a unique code in place of a long string of zeros and thus helps reduce the bit rate. All these processes result in a variable bit rate (VBR) and a subjectively constant picture quality (CPQ).
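The run-length idea can be sketched as follows. The token name and function names are invented for the example; real codecs typically combine run lengths and coefficient values into joint variable-length codewords rather than separate tokens.

```python
ZERO_RUN = "Z"  # placeholder standing in for the unique run-of-zeros code

def run_length_encode(coeffs):
    """Collapse each run of zeros into a single (ZERO_RUN, count) token."""
    out, run = [], 0
    for c in coeffs:
        if c == 0:
            run += 1
        else:
            if run:
                out.append((ZERO_RUN, run))
                run = 0
            out.append(c)
    if run:
        out.append((ZERO_RUN, run))
    return out

def run_length_decode(tokens):
    """Expand (ZERO_RUN, count) tokens back into runs of zeros."""
    out = []
    for t in tokens:
        if isinstance(t, tuple) and t[0] == ZERO_RUN:
            out.extend([0] * t[1])
        else:
            out.append(t)
    return out

# After coarse requantization, a zigzag scan is mostly zeros at the end.
scan = [42, 7, 0, 0, 0, 0, -3, 0, 0, 1, 0, 0, 0, 0, 0]
tokens = run_length_encode(scan)

assert run_length_decode(tokens) == scan  # RLC itself is lossless
assert len(tokens) < len(scan)            # fewer symbols left to code
```

Note that the run-length stage is itself lossless; it only exploits the long zero runs that the lossy requantizer created.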
Digital distribution networks require constant bit rates (CBR). A buffer helps obtain CBR but results in a variable picture quality (VPQ). Depending on its overflow or underflow status, the buffer sends a message to the REQ controlling the number of bits per sample. The situation is further complicated when several compressed digital television signals are time-division multiplexed and share a common carrier. Here, a further bit-rate control, called statistical multiplexing, is used to control each REQ individually.
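The buffer-feedback loop described above can be sketched as a toy simulation. All numbers here (buffer thresholds, bit costs, picture complexities) are invented for illustration and are not taken from any real encoder.

```python
def coded_bits(complexity, step):
    """Stand-in for the encoder: busier pictures cost more bits,
    a larger requantizer step size costs fewer."""
    return complexity // step

target_bits = 400   # bits drained from the buffer per picture (the CBR)
buffer_fill = 0     # current buffer occupancy, relative to nominal
step = 4            # requantizer step size "n"

for complexity in [1200, 3000, 5000, 2000, 800]:
    bits = coded_bits(complexity, step)
    buffer_fill += bits - target_bits
    # Feedback: a filling buffer asks the REQ for coarser quantization
    # (lower quality, fewer bits); an emptying buffer allows finer
    # quantization (higher quality, more bits).
    if buffer_fill > 200:
        step += 1
    elif buffer_fill < -200 and step > 1:
        step -= 1
    print(f"complexity={complexity:5d} bits={bits:4d} "
          f"fill={buffer_fill:5d} step={step}")
```

A statistical multiplexer extends the same idea across channels: one controller watches the shared buffer and adjusts each program's REQ, so a momentarily simple picture donates bits to a momentarily busy one.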
It is this long chain of events that introduces losses. As long as these losses are imperceptible to the eye, everything seems to work well. However, the original signal cannot be exactly reconstructed.
Editor's note: Copies of most articles from Broadcast Engineering are available on our Web site, www.broadcastengineering.com. A two-year archive of past articles is also accessible there.
Dear Paul McGoldrick:
I believe you have a clear understanding of what it will take to accelerate the adoption of HDTV. I would take issue with you on the matter of denial of services to legacy viewers, however. DVDs, as you have correctly pointed out, offer additional venues for entertainment, above that which is available from VHS. Wouldn't it make sense for program providers to take a hint from this?
Suppose that background material on athletes were available during the Olympics via a PIP delivered in spare bandwidth? Suppose local merchants who sold sponsored products could be highlighted by the same means? Perhaps some of this information would have to be downloaded to a hard drive in the consumer's television or set-top box to achieve full enhancement, and it may be that a back channel would be needed for certain applications. But many, many intriguing items could be delivered as simple one-way program enhancements, many of them offering added value to advertisers, without a back channel and without local storage. None of this could be delivered via NTSC.
This is truly a case where, if we build it, they will come. We just have to build something people will spend money on to have in their lives.
Tom Norman, CPBE
Harris Broadcast Systems Division
Send questions and comments to: email@example.com