With computers in most broadcast and video devices, one might expect a master control room (MCR) to be just a room filled with desktop PCs. While today’s MCR has plenty of keyboards and LCD monitors, most of the computing power is hidden behind panels, in card chassis and in racks located perhaps far from their respective control points.
While virtually all of today’s broadcast gear uses computers, let’s use a more accurate term — microprocessor — to indicate that a computer is used within the system or product. Nowadays, even the lowly intercom system will use a microprocessor to move communications around a broadcast facility. Because most equipment relies on these devices, it makes sense to locate them in a central place with easy access to content production areas.
Racks should be arranged in what’s called a “hot aisle, cold aisle” configuration. The backs of one row of racks face the backs of an adjacent row; the space between them is the hot aisle.
Then, the front of one row of racks faces the front of another row; this becomes the cold aisle. This is an unusual configuration for broadcast and video facilities. However, data centers use it because of the cooling efficiency it affords. The segregation technique helps minimize the mixing of hot and cold air. Combined with the use of blank panels to separate chassis, air dams to seal the top, bottom and sides of equipment, and brush grommets to seal floor outlets, this technique creates an efficient and cost-effective way to control airflow.
There are advanced techniques, one of which uses hanging plastic strip curtains like those often used on cold-storage areas, but broadcast centers seldom need such elements. In all cases, the goal is to make the most of the facility’s computer room air conditioning (CRAC) capacity.
The fact is, much of a broadcast center’s processing power is located in what’s called the machine room. Even though the control room is the showcase, complete with walls filled with video monitors, the real brains behind the operation are in the machine room. Here, dozens to hundreds of devices rely on microprocessors to make decisions, move content, create graphics and play out programming. The key point to recognize is that these rooms often resemble a modern data center more than the traditional old-style broadcast center.
Photo: Typical data center showing a hot-aisle/cold-aisle installation

While machine rooms do contain more than just computers, for purposes of this discussion, we are going to focus primarily on the digital products located there. Servers, routers, switches and other IP-centric data processing devices occupy an increasing amount of a machine room’s space.
Even more important, this equipment may draw more power and create more heat than the rest of a facility's traditional video equipment combined. Today’s machine rooms are starting to resemble wall-to-wall servers. Because of these changes, engineers need to rethink their design elements when building these new spaces. This series of articles will help a qualified engineer work through the calculations and design steps that can effectively support this increased digital load.
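As a rough illustration of the kind of calculation this series will walk through, total heat load can be estimated from equipment power draw, since nearly all electrical power consumed by IT gear is released into the room as heat. The sketch below uses hypothetical rack figures and an assumed diversity factor; only the watts-to-BTU conversion (1 W = 3.412 BTU/hr) is a standard constant.

```python
# Rough machine-room heat-load estimate (illustrative figures only).
# Nearly all electrical power drawn by IT equipment becomes heat.

WATTS_TO_BTU_PER_HR = 3.412  # 1 W = 3.412 BTU/hr

def room_heat_load_btu(rack_watts, diversity=0.8):
    """Sum nameplate watts across racks, derated by a diversity factor,
    since equipment rarely draws full nameplate power simultaneously."""
    total_watts = sum(rack_watts) * diversity
    return total_watts * WATTS_TO_BTU_PER_HR

# Hypothetical room: ten racks at 5 kW nameplate each
load = room_heat_load_btu([5000] * 10)
print(f"Estimated heat load: {load:,.0f} BTU/hr")  # 136,480 BTU/hr
```

In practice, measured power at the PDU is a better input than nameplate ratings, but a derated nameplate sum is a common first-pass estimate.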
Shown to the left is but a portion of the graphics room for the 2008 Summer Olympics in Beijing, China. The equipment visible is an Omneon MediaGrid server, which was connected to an Omneon ProCast CDN transport system. Together, the network created a 6,000mi, fully interactive, edit-by-proxy content creation system. Much of NBC’s Olympic coverage passed through this room during production and broadcast. Yet, there was not one video monitor, tape machine or cassette in this space. It was truly a digital space. As such, it had unique power and cooling requirements.
As broadcasters rely increasingly on microprocessor-based equipment, including IP-centric products, the need for proper power distribution, air conditioning and maintenance requires engineers to reexamine how rack rooms are designed, powered and cooled. In addition, as video/audio storage needs continually grow, and despite larger hard drives in smaller packages, there never seems to be enough rack space, power or cooling to support the broadcast center.
While engineers make choices and trade-offs early in the design process, those early decisions affect the life of a facility. A building constructed for only today’s needs will shortly become obsolete. However, trying to plan for all eventualities will quickly bust anyone’s budget. Understanding the proper balance between the two is important, and these articles will help you make those decisions.
Keeping your cool
A facility’s cooling system often represents the first or second largest part of the electric bill. Factors such as equipment age and a facility’s geographic location are also key. However, in every case, it is extremely important to maximize the cooling system’s efficiency. As you will see, bigger isn’t always better when it comes to sizing a facility’s cooling system.
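To make the sizing point concrete, here is a minimal sketch converting a heat load into tons of refrigeration. The conversion constant (1 ton = 12,000 BTU/hr) is standard; the 15 percent headroom margin and the example load are assumptions for illustration, not figures from this article. Grossly oversizing a CRAC unit tends to cause short cycling and poor humidity control, which is one reason bigger isn’t always better.

```python
# Size cooling capacity from a heat load (illustrative only).

BTU_PER_TON = 12_000  # 1 ton of refrigeration = 12,000 BTU/hr

def required_tons(load_btu_hr, margin=1.15):
    """Cooling capacity needed for a given heat load, with an assumed
    ~15% headroom margin rather than gross oversizing."""
    return load_btu_hr * margin / BTU_PER_TON

# Hypothetical 120,000 BTU/hr machine-room load
print(f"Required capacity: {required_tons(120_000):.1f} tons")  # 11.5 tons
```

A real design would also account for lighting, people, envelope gains and redundancy (N+1 units), topics this series takes up later.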
When it comes to machine rooms and control spaces, the old practice of making the human spaces as cold as the occupants can stand, and trusting that this will result in sufficient equipment cooling, no longer works. Gone are the days when the equipment ran hot yet the staff had to wear sweaters because the ambient room temperature was so cold.
There are several reasons this design criterion no longer works, but the primary one is that much of the heat-generating equipment is now concentrated in one space — the machine room. Instead of having large, heat-producing gear spaced throughout a large master control area, engineers now locate the digital gear in racks, usually walled off in an adjacent space. In addition, today’s digital devices are increasingly smaller, which results in much higher rack density. Let’s look at some basic design elements.
In the next part of this series, we’ll examine a seven-part design philosophy to keep your machine room and MCR properly cooled and powered.