The old systems for delivering SD commercials need an injection of modern data management to meet demands for lower error rates and greater efficiency.
Many of the practices at a broadcaster have changed little in the last 50 years or so. There has been evolution, but today's workflows would still be recognizable to a 1950s engineer. This is especially true in the vital area of commercial playout.
There is a long sequence of processes between an advertiser deciding to run a campaign, and a commercial getting played to air and then paid for. The processes essentially split into two chains: a transactional chain, through the agency and sales rep to the traffic department, and a media chain. The latter runs from the agency to the production company, on to the post house and ad distribution company, and finally to the broadcaster. At the end of those two chains, the traffic instruction should marry up with the media file destined for the playout server.
There has been evolution along the way: e-mails and XML files have replaced faxes as carriers of the traffic data, and media files have replaced videotape. When I had my first brush with commercials, they were played down the line as analog video and recorded to tape for transmission. File distribution is now becoming the normal method of delivery, often as MPEG program streams, with separate files carrying metadata.
In master control, the vital step occurs: associating the media with the line in the playlist. As files are ingested, they are linked to the traffic records by identifier, and the duration, along with the in and out time codes, is checked against the traffic record.
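The ingest-time match described above can be sketched in a few lines. This is a minimal illustration only; the identifier format, field names and return messages are all hypothetical, not taken from any real traffic system:

```python
# Hypothetical traffic records, keyed by the spot identifier carried
# through from the agency. All values here are illustrative.
traffic_records = {
    "AD-2024-0417": {"in_tc": "00:00:00:00", "out_tc": "00:00:30:00"},
}

def match_to_traffic(media):
    """Link an ingested media file to its traffic record and verify timing."""
    record = traffic_records.get(media["identifier"])
    if record is None:
        return "no traffic record; spot cannot be scheduled"
    if media["in_tc"] != record["in_tc"] or media["out_tc"] != record["out_tc"]:
        return "timecode mismatch; flag for manual review"
    return "matched"

print(match_to_traffic({"identifier": "AD-2024-0417",
                        "in_tc": "00:00:00:00",
                        "out_tc": "00:00:30:00"}))  # matched
```

A real playout system would of course check far more (codec, aspect ratio, audio layout), but the principle is the same: the identifier is the only thread connecting the file to the booking.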
In some countries, unique identifiers are assigned by the agency to maintain the link between media and traffic instruction. In others, an assortment of house numbering systems are used, and numbers can be reused over time. The identifier must be carried in an unbroken chain from agency to broadcaster for it all to work. However, the numbers are frequently rekeyed along the way, when a slate is assembled in the edit suite, for example. Inevitably, errors can occur. The duration of the media file may not match the traffic data: traffic systems often work to a resolution of one second, whereas the media file may be a few frames over the nearest second.
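That resolution mismatch lends itself to a simple tolerance check rather than an exact comparison. The sketch below assumes a 25 fps frame rate and a three-frame tolerance; both figures are illustrative choices, not industry standards:

```python
def duration_ok(booked_seconds, media_frames, fps=25, tolerance_frames=3):
    """Accept a media file whose length is within a few frames of the
    whole-second duration booked in the traffic system."""
    booked_frames = booked_seconds * fps
    return abs(media_frames - booked_frames) <= tolerance_frames

# A 30-second booking: the file runs 2 frames over the nearest second.
print(duration_ok(30, 30 * 25 + 2))   # True: within tolerance
print(duration_ok(30, 30 * 25 + 10))  # False: reject or trim
```

Whether an over-length file is trimmed automatically or flagged for review is a policy decision each broadcaster makes for itself.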
Of course, there are procedures to catch and correct the errors that creep in, but inevitably some slip through to air time. A spot can get clipped, or the wrong spot gets aired. To cover this, everyone builds in a cost contingency: the post house may need to reslate, and the broadcaster offers a make-good.
That is all well and good, as every business has its “shrinkage.” But television does not exist in isolation: the Internet is a strong competitor for advertising revenue. Media advertising is also getting more complex. Ads must be versioned for SD and HD, mobile and tablets, and VOD.
These two factors mean that the processes must be improved to minimize the errors. And that means the use of truly unique identifiers, with automated systems to avoid the need to rekey information. Such systems exist, based on technologies like Web services to link the players. The key to this is the metadata, technical and business. If the metadata about a commercial is assembled in a collaborative manner, by the transactional and the media production chains operating in unison, then when the media file hits the traffic instruction, all should neatly tie up.
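The tie-up can be pictured as two chains contributing fields to one record under a single unique identifier. The sketch below is purely illustrative; the identifier scheme, field names and messages are assumptions, not any real system's schema:

```python
# What the transactional chain knows about the booking, and what the
# media chain knows about the file, each keyed by one unique identifier.
transactional = {"UID-001": {"advertiser": "Acme", "booked_duration_s": 30}}
media_chain   = {"UID-001": {"file": "acme_30.mpg", "duration_s": 30}}

def tie_up(uid):
    """Merge the two views of a spot; both must exist and agree."""
    booking = transactional.get(uid)
    media = media_chain.get(uid)
    if booking is None or media is None:
        return "broken chain; identifier was lost or rekeyed"
    if booking["booked_duration_s"] != media["duration_s"]:
        return "duration mismatch"
    return "ready for air"

print(tie_up("UID-001"))  # ready for air
```

The point of the unique identifier is visible in the failure branch: rekey the number anywhere along either chain and the lookup simply misses, which is exactly the class of error automated, Web-services-based exchange is meant to eliminate.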
One ad will end up in maybe hundreds of media outlets, in a hundred combinations of format, resolution and codec. Of course, there will be issues with the creatives, to whom a yellow sticky is metadata, but disciplined data management is essential to compete with the new media world that vies for that advertising revenue.
Send comments to: email@example.com