With the need to distribute content over many channels to numerous consumption devices, intelligently engineered parallel-production workflows are imperative to keep up with distribution demand cost-effectively. You could hire more staff and build a brute-force, linear infrastructure, but that wastes money and time while increasing maintenance and stress.
Efficient assembly of program elements for multi-path production requires a workflow where common elements are created once, resized, and then mixed for platform-specific distribution and consumption.
For example, a logo for HD presentation remains at full size when mixed with HD content. If this composited HD video were downsized for a cell phone, the logo would be unintelligible. So how do you solve this problem?
A variety of subsystems must be harmoniously integrated into a parallel workflow and infrastructure that is used in assembling discrete audio, video and graphics elements into programs. The primary systems include:
- Distribution and routing: Audio, video and GFX are distributed in uncompressed, compressed or native GFX file formats.
- Storage and indexing: MAM systems consisting of large amounts of disk or tape storage, and an application for indexing and locating content.
- Production control room: The heart of the assembly operation. Especially for live events where GFX elements are inserted and removed, cuts to various cameras are ordered, talent is cued and organized chaos prevails.
- Master control room: Processes that insert and remove program segments and elements without operator intervention. Automation and commercial insertion.
Broadly speaking, assembly scenarios can be classified as: live versus canned (tape or file) and remote versus studio. Table 1 lists some characteristics of each scenario.
| Scenario | Characteristics |
| --- | --- |
| Remote | Pass through, i.e., network feed |
| Studio | Dynamic environment, breaking news; GFX must be built on the fly; PCR & MCR |
| Canned | Usually fixed timeline; automation controlled, no operator needed |
Table 1. Characteristics of assembly scenarios
Distribution and routing
In the “good old analog days” the choice was simple: composite or component. The difficulty then was in timing component signal distribution by precisely trimming five cable lengths. Using composite video for distribution would not maintain broadcast quality through the production chain.
Baseband distribution consists of real-time, continuous signals. NTSC has been augmented by numerous formats that are used to move content around the facility. Digital formats such as SDI and AES have added some complexity, but they reduced five cables to one and maintain optimal audio and video quality. HD-SDI and Dolby E are other examples of formats commonly used in a BOC for essence distribution during the assembly process.
Although content will be assembled in the uncompressed domain, some, if not all, of it has to travel the infrastructure as compressed essence. This complicates the established SDI real-time distribution infrastructure by adding media networks, compressed files, computers and applications. Compressed-essence file transfers over a media network composed of switches, routers and storage can attain faster-than-real-time performance and reduce time in the production process.
Generations of compression and decompression are a source of artifacts. Selection of an appropriate high bit-rate format is essential for multiplatform assembly, format conversions and efficient MAM storage.
Can you find the stuff and then read it?
With content and program elements being stored less on tape and more on disk as files, finding content quickly has become more adventuresome than ever. The content that was once on a DVD and hoarded by one of your colleagues is now buried in a MAM. Your only key to finding it is metadata. And if your metadata is insufficient or inaccurate, you can't find it; therefore, it doesn't exist!
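The metadata dependency is easy to demonstrate. The sketch below is a hypothetical, minimal in-memory index (not any real MAM product's API): an asset ingested with no keywords and a nondescript title is effectively invisible to search, exactly as described above.

```python
# Minimal sketch of metadata-driven asset search in a MAM-like index.
# Asset, MamIndex and the sample IDs are all illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Asset:
    asset_id: str
    title: str
    keywords: set = field(default_factory=set)

class MamIndex:
    def __init__(self):
        self._assets = []

    def ingest(self, asset: Asset):
        self._assets.append(asset)

    def search(self, keyword: str):
        # Matches only against recorded metadata: title and keywords.
        kw = keyword.lower()
        return [a for a in self._assets
                if kw in a.title.lower()
                or kw in {k.lower() for k in a.keywords}]

index = MamIndex()
index.ingest(Asset("A001", "Election night open", {"politics", "HD"}))
index.ingest(Asset("A002", "Clip 0042"))  # poor metadata: nearly unfindable

print([a.asset_id for a in index.search("politics")])  # -> ['A001']
print(index.search("election results"))               # -> [] : it "doesn't exist"
```

The second search fails even though relevant footage is in storage, which is the operational meaning of "if your metadata is insufficient, the content doesn't exist."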
Many formats are used for field-acquisition. HDV, DVCPRO HD and others all must be transcoded at some time in the assembly process. Coding bit rates make matters worse. SD MPEG-2 at 40Mb/s is not the same as MPEG-2 at 15Mb/s. All of these variables can have a large, visible impact on video quality.
In the news business, time to air is critical to beat competing broadcasters and enjoy the prestige of breaking a story first. Automated transcoding is necessary because of the variety of video and GFX formats in use. Transparent, folder-drop transcodes are becoming the solution for workflow efficiency.
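A folder-drop transcode can be sketched in a few lines. The profile table and filenames below are illustrative assumptions, not a vendor's configuration; the command is built for ffmpeg, a common transcoding tool, using only its standard input/codec/bit-rate flags.

```python
# Sketch of a folder-drop transcode dispatcher: a file landing in the watch
# folder is matched by extension and mapped to one house format.
from pathlib import Path

# Assumed house profiles: acquisition container -> ffmpeg codec arguments.
PROFILES = {
    ".m2t": ["-c:v", "mpeg2video", "-b:v", "40M"],  # e.g., HDV transport stream
    ".mxf": ["-c:v", "mpeg2video", "-b:v", "40M"],  # e.g., DVCPRO HD in MXF
}

def build_transcode_cmd(src: Path, out_dir: Path) -> list:
    """Return an ffmpeg command line that normalizes src to the house format."""
    args = PROFILES.get(src.suffix.lower())
    if args is None:
        raise ValueError(f"no transcode profile for {src.suffix}")
    return ["ffmpeg", "-i", str(src)] + args + [str(out_dir / (src.stem + ".mxf"))]

cmd = build_transcode_cmd(Path("drop/clip01.m2t"), Path("house"))
print(" ".join(cmd))
```

In practice a small watcher process would poll the drop folder and run each command, so no operator ever touches the format conversion step.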
Multiplatform parallel production
Consider a circumstance where HD content and GFX are to be distributed over the air, via the Internet and to cell phones. What is the most efficient parallel workflow to accomplish this goal?
For over the air, a typical PCR-to-MCR workflow and process steps will get the job done. One option would be to downconvert this program stream to Internet and cell phone formats. But will all the information you want to convey be intelligible?
For Internet delivery, the HD content is downconverted to an appropriate window size. In parallel, the GFX information can be sent in a separate window for display on the PC. This has the added advantage of allowing the viewer to determine what is more important to them: video or GFX text information.
The same content may be mixed in a similar fashion for cell phone consumption. The video will be downconverted. You can’t do much with a 2.5in display. You want to be sure consumers can read the text. One approach might be to alert the viewer to the availability of additional information. They then click on an icon and the screen now displays the lower third text information appropriately formatted.
Some broadcasters are trying new workflows, processes and equipment to simplify the parallel assembly workflow.
- HD letterboxed for SD. The 16:9 HD versus 4:3 SD aspect ratio issue can be avoided entirely. PBS and other networks have been downconverting HD to a 16:9 letterbox on a 4:3 SD display. With this approach, framing, composition and GFX need not be compromised.
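The letterbox geometry behind the first approach above is worth checking. This is a quick square-pixel sketch (real SD uses non-square pixels, so production numbers differ slightly): a 16:9 picture in a 4:3 frame occupies three-quarters of the frame height.

```python
# Square-pixel sketch of letterbox geometry: how many active picture lines
# a 16:9 image occupies inside a 4:3 SD frame, and the black bar per edge.
def letterbox(frame_h: int, frame_aspect=(4, 3), pic_aspect=(16, 9)):
    fw, fh = frame_aspect
    pw, ph = pic_aspect
    active = round(frame_h * (fw / fh) * (ph / pw))  # active picture lines
    bar = (frame_h - active) // 2                    # black bar, top and bottom
    return active, bar

print(letterbox(480))  # 16:9 in a 480-line 4:3 frame -> (360, 60)
```

So in a 480-line frame the picture keeps 360 lines with 60-line bars top and bottom, preserving the original HD framing at the cost of vertical resolution.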
- Centralized control rooms. The day is coming when remote camera feeds will be backhauled to the central facility. Graphics and other show elements will be switched in the PCR.
- Compressed domain mixing. In the Feb. 2, 2005 issue of HD Technology Update, OmniBus Systems Vice President of Technology John Wadle discussed his vision of a simplified infrastructure for SD/HD coexistence and introduced the concept of a mix server. With this approach, rather than production switching SDI, a mix server performs this operation with file-based content. Will this concept be expanded in the future to include mixing for other formats and delivery channels?
- Networked baseband routing. Similar to the mix server is the concept of a networked baseband router. By adding a network interface and transcoding engines to a traditional baseband router, ingest and format conversions become transparent and occur while moving content around the BOC. An example of this is Pro-Bel’s FUSION that routes audio from both AES and file sources.
What tack to take?
The next series of T2D articles will examine some of these assembly issues and topics. It is important to note that answers are emerging, and often there is more than one answer to a given production workflow requirement. Digital broadcasting has ushered in an environment that is more like research and advanced development than a production assembly line using tried and true equipment and techniques. This is an exciting time, and sound engineering practice will play a major role in separating the winners and losers in the mad scramble for a piece of the multiplatform pie.
Harris has an informative white paper that describes the “As-Is” and Future multiplatform infrastructure: http://download.harris.com/app/public_download.asp?fid=936