Are you in the market for a new video monitor? Confused by the myriad acronyms, screen sizes and pixel counts, not to mention all of the consumer-centric input and output connections? Is 720p resolution enough, or do you need 2K? What the heck does WUXGA mean? And why would you ever use HDMI connections in a control room?
You’re not alone. The world of display technology has been turned upside down over the past 20 years with flat screens, higher pixel counts and a new class of digital signal interfaces replacing the “plain vanilla” CRT monitor equipped with composite video jacks. Instead of the professional display market driving the consumer market, the tables have been turned, and the “tail” now “wags the dog.”
The surge in sales of flat-screen TVs during that same two-decade interval motivated Japanese manufacturers to build larger fabrication lines (“fabs”) so they could crank up TV production to the tune of millions of LCD and plasma panels per year. Some of these manufacturers (Sony, JVC, Panasonic, Hitachi) had also previously sold broadcast and professional CRT monitors, but in much smaller quantities.
Seeing opportunities, Korean manufacturers joined the fray, and Taiwan and China have since taken seats at the table as well. This rapid shift to LCD and plasma technology killed off CRT displays and created a large wholesale market for panels. It also produced some strange alliances to get product to market, such as Sony’s partnerships, first with LG Display (Korea) and then with Samsung (also Korea), to obtain LCD TV panels it couldn’t make on its own.
The worldwide recession that started in late 2007, coupled with excess LCD and plasma fab capacity and cutthroat competition from Korean and Chinese manufacturers, triggered a downward slide in panel prices that has yet to bottom out. As a result, LCD panels are commodities now, particularly in the smaller sizes that are often used in equipment racks and master control consoles. And plasma technology is on the endangered species list, with sales falling through the floor in the past five years.
How do you differentiate one type of monitor from another? How many companies actually make the LCD panels used in their products? (Answer: Only a few.) With so many companies buying panels on the open market, it’s not unusual to see one manufacturer’s “glass” show up in many NAB booths. Case in point: LG Display’s new 84in 4K (3840 x 2160 pixels) LCD panel is available as a TV from LG, and is also being sold by Sony, Toshiba and JVC.
Know before you buy
Here’s some helpful information you should know before you purchase a new monitor. While almost every commercial video monitor uses LCD technology, you may not know that there are two flavors of LCDs. The first uses a vertical alignment (VA) process, and this refers to how the liquid crystals shift position in operation to block or pass light. Samsung, Sharp and Chi Mei all manufacture VA LCD panels.
The second process is known as in-plane switching (IPS). In an IPS panel, the liquid crystals rotate in a plane parallel to the glass rather than tilting out of it. Panasonic, LG Display and CPT are all examples of IPS panel manufacturers, while Hitachi (which invented IPS) has largely exited the market.
While VA panels are widely used in consumer televisions and tablets, IPS is the preferred technology for professional LCD monitors. One reason is the consistent grayscale images produced by IPS panels when viewed off-axis, as is often the case in an edit suite or control room. All LCD monitors suffer from contrast flattening when viewed off-axis, but a grayscale ramp will reproduce without any color tinting on an IPS LCD monitor.
While VA LCD panels also have excellent dynamic range, they introduce subtle color tinting at different luminance levels when viewed off-axis, a condition that would be unacceptable in a reference or critical monitor. However, VA panels are fine where off-axis viewing isn’t an issue, such as air check monitors in racks and confidence monitors in cameras. Individual monitor spec sheets should clearly indicate which alignment type is being used in each panel.
Another question that pops up: How much resolution do I need? Given that 4:3 monitors have all but disappeared from the pro market, your choices will generally be an HD-resolution format (1280 x 720 or 1920 x 1080 pixels) or a widescreen variation of a computer resolution standard. (Wide XGA at 1280 x 800 or 1366 x 768, and Wide UXGA at 1920 x 1200, are all common.)
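As a quick illustration of how these pixel counts map to aspect ratios, a few lines of Python can reduce each resolution to its simplest width:height ratio (the `aspect_ratio` helper and the list of formats are my own illustration, not from any spec sheet):

```python
from math import gcd

def aspect_ratio(w: int, h: int) -> str:
    """Reduce a pixel count to its simplest width:height ratio."""
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

# Common HD and widescreen computer resolutions mentioned above.
for w, h in [(1280, 720), (1920, 1080), (1280, 800), (1366, 768), (1920, 1200)]:
    print(f"{w} x {h} -> {aspect_ratio(w, h)}")
```

Note that 1280 x 800 and 1920 x 1200 both reduce to 8:5 (i.e., 16:10), while 1366 x 768 reduces to 683:384, which is only an approximation of 16:9.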
Which one should you choose? If your primary application is video monitoring, then stick with either 720 or 1080 lines of resolution. Note that there aren’t any interlaced flat-panel monitors (that scanning technique went away with CRTs), so all flat-screen monitors use progressive scan, and the shorthand notation for these formats is “720p” or “1080p.” You can find 1080p resolution in monitors as small as 15in.
If you need to view computer graphics and video, then you should purchase a workstation monitor. Common resolutions for these products include 1366 x 768, 1440 x 900 (Apple), 1680 x 1050 (Apple again) and 1920 x 1200 pixels (WUXGA). There are even “retina” displays with resolutions as high as 2560 x 1440 pixels.
While workstation monitors support 720p and 1080p signals, their native aspect ratios are 16:10, not 16:9. That means any video content you view will be letterboxed, with black bars at the top and bottom, when sized correctly. You’ll find more connector options on workstation monitors, but they will be more consumer-centric (HDMI, DisplayPort, VGA) and won’t support SDI or HD-SDI connections.
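The letterboxing arithmetic is easy to sanity-check. Here’s a minimal sketch in Python (the `letterbox_bars` function and the WUXGA example are mine, used purely for illustration):

```python
def letterbox_bars(panel_w: int, panel_h: int,
                   video_w: int = 1920, video_h: int = 1080) -> int:
    """Height in pixels of each black bar (top and bottom) when the
    video is scaled to fill the panel's full width."""
    scaled_h = panel_w * video_h // video_w  # video height after scaling
    return (panel_h - scaled_h) // 2

# On a 1920 x 1200 WUXGA monitor, 1920 x 1080 video leaves a 60-pixel
# bar above and below the image.
print(letterbox_bars(1920, 1200))  # -> 60
```

The same math on a 1680 x 1050 panel yields roughly 52-pixel bars, so the letterboxing is a property of any 16:10 panel, not just WUXGA.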
Conversely, 720p and 1080p video monitors will offer little support for computer graphics resolutions. In fact, many of them limit their support to the 27-year-old eXtended Graphics Array (XGA) standard. However, these monitors will provide HD-SDI interfaces, along with a handful of analog inputs (composite and component video) and probably an HDMI connection for interfacing camcorders, Blu-ray players, STBs and other prosumer gear.