Rather than switching to lights-out, unmanned operations, broadcast stations still require an engineer.
I'll dispense with rhetorical questions like, “Whatever happened to manual master control?” The plain answer is: nothing. We seem to share a perception that automation has replaced humans, and to a degree that is true. In today's multichannel world, more channels cannot equate to more operators; the economics of broadcasting simply won't permit the ever-higher cost that additional labor brings. That reality has thrust automation into the mainstream of broadcast technology for good. But it is important to note that in many installations, perhaps the vast majority, an operator is still present. The key fact is that what they do is different.
The need for human intervention
Today, a multichannel master control may have no “moving parts,” i.e., no videotape and, if I dare date myself, certainly none of the film chains used for much of the broadcast content in the early decades of television. We don't get spots on videotape much anymore either; they are often delivered electronically to an IT-based server in the station, which then either moves the content (essence and metadata) to the air server via FTP or plays it out for ingest into that server. In virtually every station, spots are stored on a server, even if long-form content is still played from videotape. At some level, automation is required to play back a server, essentially without exception. There are no control panels for button pushers to interact with. Though you could technically plug an RS-422 control panel into almost any server and get a clip (spot) to play, the likelihood is that either a full-blown automation system or a simple playlist manager is used to cue up spots and play them to air.
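The “simple playlist manager” mentioned above can be sketched in a few lines of code. This is a minimal illustration, assuming nothing about any real automation product; the class and field names here are invented for the example.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Clip:
    clip_id: str       # house ID assigned at ingest (illustrative)
    duration_s: float  # trimmed duration in seconds

class PlaylistManager:
    """Cues clips in order and hands the next one to the playout server."""
    def __init__(self) -> None:
        self._playlist: List[Clip] = []

    def append(self, clip: Clip) -> None:
        """Add a clip to the end of the cued playlist."""
        self._playlist.append(clip)

    def take_next(self) -> Clip:
        """Pop the next cued clip, simulating a 'take' to air."""
        if not self._playlist:
            raise RuntimeError("playlist underrun: nothing cued")
        return self._playlist.pop(0)

mgr = PlaylistManager()
mgr.append(Clip("SPOT-0001", 29.97))
mgr.append(Clip("SPOT-0002", 14.98))
on_air = mgr.take_next()
print(on_air.clip_id)  # SPOT-0001 airs first
```

A real system would, of course, drive the server over RS-422 or an IP protocol and handle timing; the point is only that even the simplest playout requires this layer of software between the schedule and the server.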
It is fair to ask why many stations have not gone to lights-out, unmanned operation. For the most part, exceptions and late additions to the schedule make unattended operation risky, or perhaps less than a career-enhancing decision. Content must be ingested and trimmed before it is used for air. In an era when file-based workflow is becoming mainstream, one might conjecture that passing metadata directly to automation and traffic will take away a primary function of the MCR operator. But someone still needs to take responsibility for making sure the content has been reviewed and noted as correct and ready for air. Consider for a moment an error in the file domain, which might render content at best embarrassing.
For example, let's say the content is properly marked, and the audio and video are sampled for automatic QC and proved technically worthy of your valuable airtime. But what if one thing is wrong that cannot be identified without human intervention? For instance, what if audio is present, but it is Spanish instead of English? Or perhaps it is the wrong “Sesame Street” episode, with metadata that doesn't match? Only the human in MCR, or someone else in the station, can spot the error, and even then only by playing the content to a monitor and listening to the audio.
I recognize that this might seem a bit contrived, but as we move further into file-based workflow, such issues beg for human solutions. One of my clients is actively considering what might seem counterintuitive: eliminating master control as a department and moving it into the traffic department. This has considerable business appeal. The schedule starts in traffic, and in this era, MCR can be run entirely from (or even on) IT hardware, eliminating the need for a separate type of technical space for MCR. This is not just window dressing, because who can better judge whether a spot is the right one? Indeed, who can better gauge a program to be sure it matches the log?
Merging traffic and automation
This raises many questions that are not so obvious. Is this combining traffic with MCR, or is it creating a more natural broadcast operations department? It is arguable that, from a business perspective, this is a potential way to improve operations. If a log problem is spotted, it can be corrected by the people best equipped to understand the ramifications. Conversely, with technically savvy operators as part of one cohesive team, isn't the ability to prevent potential problems enhanced?
But the most intriguing aspect is that perhaps we are approaching a time when the software itself should be “merging.” One manufacturer, which sells both traffic and automation software, has a platform with tight links between traffic and automation modules. This makes great sense. I like to think of traffic as building a template and automation as a step that adds macros to make it run automatically. Automation is the machine control engine, and traffic is the planning tool. Link them tightly, and you get much more power. It would be elegant if we could make automation calls from traffic software and traffic calls from automation software.
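The idea of automation calls made from traffic software can be sketched as follows. This is a hypothetical illustration, assuming a shared in-process API; the Automation and Traffic classes and their method names are invented for the example and do not come from any real traffic or automation product.

```python
from typing import List

class Automation:
    """Machine-control engine: executes events against playout devices."""
    def __init__(self) -> None:
        self.log: List[str] = []  # record of executed events

    def play(self, clip_id: str) -> None:
        """Execute a play event (here, just recorded for illustration)."""
        self.log.append(f"PLAY {clip_id}")

class Traffic:
    """Planning tool: turns a program/sales schedule into playout events."""
    def __init__(self, automation: Automation) -> None:
        self.automation = automation  # tight link to the control engine

    def release_log(self, schedule: List[str]) -> None:
        """Release the day's log, making automation calls directly."""
        for clip_id in schedule:
            self.automation.play(clip_id)

auto = Automation()
traffic = Traffic(auto)
traffic.release_log(["PROG-100", "SPOT-0001", "SPOT-0002"])
print(auto.log)
```

The design point is the direct reference from Traffic to Automation: the planning tool does not export a file for a separate system to import later, it drives the control engine itself, which is the tight coupling the column argues for.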
This holy grail is not a new concept. In a burst of creative engineering, and with an understanding of the business imperatives faced by both vendors and broadcasters, SMPTE started a development project several years ago that went a long way toward tightly coupling automation and traffic. The BXF standard (SMPTE 2021), first published in 2008 and now at “BXF 2.0,” provides a communications “API” to which both traffic and automation vendors can build products, allowing tighter integration and interoperation between pairs of companies without reinventing the wheel every time a user asks for a different vendor combination or set of supported options. BXF 2.0 will extend the solid work of the original standard.
It is important to note that BXF enables defined interconnections among many broadcast business applications, including program management, traffic, automation and content distribution. BXF 2.0 will extend the metadata mapping to define how metadata transported by BXF and essence transported in an MXF wrapper can be coupled more closely, providing the tight integration that file-based workflow really needs.
One final thought. The right video/wrong audio example I mentioned may be prevented in the future using “hashes,” or fingerprints, which uniquely identify content and substantiate that it has not been corrupted. The Advanced Media Workflow Association (AMWA) is considering an Application Specification that would provide an end-to-end system for content verification.
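The fingerprint idea can be shown with a cryptographic hash. SHA-256 is used here only as a familiar stand-in; the AMWA work may well specify a different identification scheme, and the workflow below is a sketch, not a description of any published specification.

```python
import hashlib

def fingerprint(essence: bytes) -> str:
    """Return a SHA-256 digest that uniquely identifies the essence bytes."""
    return hashlib.sha256(essence).hexdigest()

# At ingest, record the fingerprint alongside the content's metadata.
ingested = b"...video and audio essence..."
recorded = fingerprint(ingested)

# Downstream, before air, recompute and compare. Any corruption or
# substitution of the essence changes the digest and is caught here.
received = b"...video and audio essence..."
if fingerprint(received) != recorded:
    raise ValueError("essence does not match ingest fingerprint; hold for review")
print("content verified")
```

Note that a hash proves the file is the same one that was fingerprinted; it cannot tell you the Spanish track was loaded at ingest in the first place, which is why the human review discussed earlier still matters.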
John Luff is a broadcast technology consultant.
Send questions and comments to: firstname.lastname@example.org