IT is dominating broadcast.
Here we are in the second year of the second decade of the 21st century, and we are still discussing the transition from baseband to IT-based systems. Baseband digital systems are in fact only marginally different from analog video and audio, which have been with us for generations, since long before the turn of the century (20th to 21st, that is). Let me explain, because this context is important to understanding the magnitude of the change to IT-based systems, or to hybrid systems embracing both.
Baseband analog and digital
Baseband analog and digital systems are unidirectional and deterministic in timing. Every 1/30th (or 1/25th) of a second, another frame of video traverses every baseband interface. Signals occupy the full circuit between one device and another, and digital is no different from analog in this respect. Similarly, any discontinuity in the signal, be it a broken connection or a defective circuit, means the content is lost, unlike most IT-based systems, which will pick up where they left off when the circuit was interrupted. Timing is inviolable; in fact, it is precise down to the nanosecond. One or two degrees of subcarrier error in an analog system would be noticeable to a consumer viewing the affected content. In digital systems, built-in buffers ease the problem, but interruptions still lose content.
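To put numbers on the tolerances described above, a short calculation helps. The figures below are illustrative arithmetic based on the standard NTSC subcarrier frequency and frame rate, not values taken from this article:

```python
# Back-of-the-envelope check of baseband timing precision.
# NTSC color subcarrier: ~3.579545 MHz; frame rate: ~29.97 fps.

NTSC_SUBCARRIER_HZ = 3_579_545      # NTSC color subcarrier frequency
FRAME_RATE_NTSC = 30 / 1.001        # ~29.97 frames per second

subcarrier_period_ns = 1e9 / NTSC_SUBCARRIER_HZ     # one full subcarrier cycle
one_degree_ns = subcarrier_period_ns / 360.0        # one degree of subcarrier phase
frame_period_ms = 1000.0 / FRAME_RATE_NTSC          # time between frames

print(f"Subcarrier period: {subcarrier_period_ns:.1f} ns")   # ~279.4 ns
print(f"One degree of phase: {one_degree_ns:.3f} ns")        # ~0.776 ns
print(f"Frame period: {frame_period_ms:.3f} ms")             # ~33.367 ms
```

A one-degree subcarrier phase error works out to well under a nanosecond, which is why "precise down to the nanosecond" is no exaggeration.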
Switching between signals requires precision as well: switches must be locked to the same clock as the content, or discontinuities cannot be avoided. Synchronization of audio and video is accomplished by physical means, because the signals generally are not referenced to a common time stamp.
Most importantly, a digital baseband signal is simply a sampled representation of the original analog signal. All of its characteristics as content are faithfully reproduced, though the digital form may be more robust and less prone to degradation in transmission.
Integrating baseband with IT
Integrating these two well-understood signal types with IT-based systems requires a good understanding of IT systems, which in many ways behave nothing like baseband systems. First, and most obvious, IT systems are bidirectional and networked. That is to say, a “circuit” becomes virtual because more than one signal can share the same physical interface. Bits from two video signals, or even a database replication and a slew of audio and video essence, might live on the same “wire” at the same time and might be flowing in both directions. In addition, the network carries its own command and monitoring system in the same packets that carry the content.
Content is also not delivered on an IT network as a precisely timed, isochronous set of signals, though recent standards can emulate isochronicity across a network for media content. Rather, delivery is best-effort, statistically achieving 25fps or 30fps. If a system has a sufficient built-in buffer, this poses no challenge, but it does lead to a completely different approach to “switching” signals. Often, the receiving application requests that different content be sent across the connection, rather than a processing device selecting among signals at the receiving end.
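The buffering idea above can be sketched in a few lines. This is a minimal illustration, not any real system's implementation: frames arrive with irregular, best-effort timing, and a small playout buffer absorbs the jitter so the output cadence stays fixed. The class name and buffer depth are invented for the example:

```python
import collections

class PlayoutBuffer:
    """Toy playout buffer: irregular input, fixed-cadence output."""

    def __init__(self, depth):
        self.depth = depth                  # frames to hold before starting playout
        self.queue = collections.deque()
        self.started = False

    def receive(self, frame):
        """Called whenever a frame arrives, however irregularly."""
        self.queue.append(frame)
        if len(self.queue) >= self.depth:
            self.started = True             # enough margin accumulated to start

    def tick(self):
        """Called once per fixed output frame period (e.g. every 1/30 s)."""
        if self.started and self.queue:
            return self.queue.popleft()
        return None                         # underrun: repeat or output black

# Bursty arrival: frames 0-2 land at once, then one frame per output period.
buf = PlayoutBuffer(depth=3)
for f in range(3):
    buf.receive(f)

played = []
for f in range(3, 8):
    played.append(buf.tick())   # fixed-cadence output
    buf.receive(f)              # irregular input keeps refilling the buffer

print(played)   # [0, 1, 2, 3, 4]
```

As long as arrivals keep up on average, the buffer hides the network's timing variation; if it runs dry, the receiver must repeat a frame or cut to black, which is exactly the lost-content failure mode described earlier.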
In general, special-purpose devices don't do the processing in an IT network, though there are exceptions. Most often, general-purpose computing systems are programmed to manipulate video and audio essence and metadata. Some compute-intensive or speed-critical processes, such as image scaling, compression, and decompression, may be done with special hardware, but not by converting to baseband. Rather, a compute engine delivers the essence to the processing subsystem, receives the result back after processing, and directs the content to the next step in the workflow.
Thus, the insides of an IT processing system are inherently different from, and fundamentally incompatible with, the approaches that baseband systems require. This gives rise to some interesting changes in our industry. For instance, SMPTE, which used to referee “discussions” about competing hardware video and audio standards and the electrical interfaces between them, now has many interface standards that are entirely software. The society is also working on standards that define how to carry essence on IT network platforms without the blocking and latency that plague many attempts at sending uncompressed media over IP networks.
So the issue becomes: How do we knit together IT and baseband in this decade of transition? The first step was islands of IT processing in a baseband world. We are rapidly moving to islands of baseband in an IT world. We also have systems that live across the boundary at all times. For instance, a video switcher has many baseband interfaces for live feeds, but IT interfaces for delivery of file-based content to still stores and clip players within the switcher. Increasingly, we see devices that do their processing using compute engines but provide at least some baseband interfaces. An example is a format converter with entirely compute-based electronics inside, including special-purpose GPUs for scaling and other computations. The same processes could be ported to a server without baseband interfaces, perhaps operating only on files or streams of content delivered over IP.
We all have much to learn about this intersection of two disparate technology sectors. A lot of great engineering has been done to bridge the gap and allow hybrid broadcast/IT systems to achieve much of the complicated workflow we expect today. But we continually expect more, and Moore's Law predicts an inexorable decline in the cost of computing and increase in available power. The net result is an expectation that eventually IT will subsume baseband techniques. Perhaps it will, but the processing power in a video switcher has been achieved with an extremely high degree of reliability and freedom from bugs and reboots (mostly). I can't help but squirm every time someone predicts that the end of broadcast technology is upon us and the era of IT predominance has started. I also can't deny it may be true.
John Luff is a broadcast technology consultant.
Send questions and comments to: email@example.com