Ever since the launch of high-definition television in the 1990s, it seems as if some new “bell and whistle” enhancement comes along every few years. First it was the changeover to flatscreen plasma displays in the late 1990s, followed by a shift to 1080p and Wide UXGA resolution in the early 2000s.
The industry transitioned to liquid-crystal display (LCD) panels for TVs and monitors a few years later. UHD (4K) imaging popped into our consciousness in 2012. And, of course, 3D made its “once every few sunspot cycles” appearance in 2009, followed by virtual reality last year.
Some of these trends actually stuck, like 4K. Display manufacturers are slowing down production of Full HD (1920×1080) display panels in favor of UHD (3840×2160) as consumers increasingly choose the higher resolution. That, in turn, means that the displays we select for digital signage, classrooms, meeting rooms and other applications will also be of the 4K variety.
The latest trend to rear its head is high dynamic range (HDR), which is accompanied by wide color gamut (WCG) imaging. In a nutshell, HDR means a greatly expanded range of tonal values that can be shown in still and moving images. Conventional cameras and imaging systems can capture anywhere from nine to 11 f-stops of light (each additional f-stop represents a doubling of luminance).
HDR takes that to a higher level by capturing as many as 22 f-stops of light, and reproducing those tonal values becomes a real challenge to displays that employ conventional backlight or illumination systems. Hence, we are now seeing a new crop of LCD TVs with turbocharged backlights to reproduce the extreme dynamic ranges of HDR images. On the emissive display side, organic light-emitting diode (OLED) TVs can also reproduce HDR content, although with lower peak brightness levels.
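Since each f-stop doubles the luminance, the captured contrast ratio grows as a power of two. A quick sketch of the arithmetic behind the figures above:

```python
# Each f-stop doubles the luminance, so n stops span a 2**n : 1 contrast ratio.

def dynamic_range_ratio(stops: int) -> int:
    """Contrast ratio covered by a given number of f-stops."""
    return 2 ** stops

print(dynamic_range_ratio(11))   # conventional capture: 2,048:1
print(dynamic_range_ratio(22))   # HDR capture: 4,194,304:1
```

Doubling the stop count from 11 to 22 doesn't double the dynamic range; it squares it, which is why HDR is such a jump for displays to reproduce.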
For some perspective, the venerable CRT display had a peak brightness level somewhere around 29 footlamberts (100 candelas per square meter), which represented close to 100% diffuse white. In an HDR display, that value largely holds for diffuse white, but more intense specular highlights (like the sun reflecting off a pane of glass or water, or a bright streetlight at nighttime) can hit peaks far higher, well into the thousands of cd/m2.
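The footlambert and cd/m2 figures above are the same value in two units; one footlambert is about 3.426 cd/m2 (a cd/m2 is often called a "nit"). A minimal conversion sketch:

```python
FL_TO_NITS = 3.4262591  # 1 footlambert in candelas per square meter (nits)

def fl_to_nits(fl: float) -> float:
    """Convert footlamberts to cd/m2 (nits)."""
    return fl * FL_TO_NITS

def nits_to_fl(nits: float) -> float:
    """Convert cd/m2 (nits) to footlamberts."""
    return nits / FL_TO_NITS

print(round(fl_to_nits(29.2)))  # ~100 nits, the classic CRT peak white
```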
And HDR isn’t just about extreme brightness. The entire grayscale is expanded, so we should see more shadow details along with intense specular light sources. When done correctly, HDR images are quite the departure from everyday HDTV, and more closely resemble the range of tonal values our eyes can register, with their visual contrast ratio approaching 1,000,000:1.
There are numerous ways to achieve higher levels of brightness. Dense arrays of light-emitting diodes can do it when used in a direct-illumination architecture. However, the favored approach is to employ a special optical film embedded with nano-sized red and green quantum dot particles, stimulated by an array of blue LEDs. TV models using this approach in 2017 can achieve peak small-area brightness values of 2000 cd/m2.
For perspective, consider that an LED (emissive) videowall for indoor use will routinely hit 3000 cd/m2 brightness with full white images, and you can appreciate just how much of a leap HDR represents over current imaging technology. What’s more significant is how quickly the prices for HDR displays are coming down, particularly as Chinese TV manufacturers enter the marketplace.
Just prior to the Super Bowl (the best time to score a deal on a new TV, by the way) it was possible to purchase a 55-inch “smart” Ultra HDTV for just $499 from a Tier 1 manufacturer. And a 65-inch model with basic HDR (static metadata) could be had from a Chinese brand for less than $700, while a Tier 1 HDR model of the same screen size was less than $900.
I mentioned wide color gamut earlier. It stands to reason that, if a camera can capture a much wider range of luminance values, it can also record a much wider range of color shades. And that’s exactly what winds up happening. With the current 8-bit color system widely in use for everything from broadcast and cable television to Blu-ray discs and streaming media, a total of 16.7 million colors can be represented.
With HDR and WCG, the playing field is expanded considerably and now requires 10 bits per color channel, resulting in 1,073,741,824 colors: more than 1 billion color shades! That's too much heavy lifting for LCD displays that use white LEDs with color filters, but it's within reach of quantum dot LCDs and OLEDs.
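The color counts follow directly from the bit depth: an RGB pixel has three channels, so the total is two raised to the bits-per-channel, cubed. A quick check of both figures:

```python
def color_count(bits_per_channel: int) -> int:
    """Distinct colors for an RGB pixel at a given bit depth per channel."""
    return (2 ** bits_per_channel) ** 3

print(f"{color_count(8):,}")   # 16,777,216 -- the familiar "16.7 million"
print(f"{color_count(10):,}")  # 1,073,741,824 -- more than 1 billion
```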
The availability of HDR/WCG content has also forced a speed upgrade to display interfaces. HDMI 1.3/1.4 simply can't handle a 4K HDR signal, so we must use HDMI 2.0 to do the job. And even version 2.0 is barely fast enough: If the 4K video signal uses lower color resolution (4:2:0 or 4:2:2), HDMI 2.0 can transport HDR signals at refresh rates up to 60Hz. But switch to RGB (4:4:4) color mode, such as we'd see with 4K video from a computer video card, and HDMI 2.0 can't pass a 60Hz signal with anything more than 8-bit color.
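The reason HDMI 2.0 tops out at 8-bit for 4K/60 RGB can be shown with rough back-of-the-envelope math. The sketch below assumes the standard CTA-861 4K/60 timing (4400 x 2250 total pixels including blanking, i.e. a 594MHz pixel clock), three TMDS channels, and 8b/10b encoding; it is a simplified model, not a full link-rate calculation (4:2:0, for example, halves the effective pixel rate).

```python
# Rough TMDS bandwidth check for 4K/60 RGB over HDMI, assuming the
# CTA-861 timing with a 594 MHz pixel clock at 8-bit depth. The pixel
# clock scales with bit depth, and 8b/10b encoding adds 10/8 overhead
# on each of the three TMDS channels. Simplified sketch only.

HDMI_2_0_MAX_GBPS = 18.0  # total TMDS bandwidth of HDMI 2.0

def tmds_gbps(pixel_clock_mhz: float, bits_per_channel: int) -> float:
    """Approximate total TMDS rate in Gb/s for an RGB (4:4:4) signal."""
    scaled_clock = pixel_clock_mhz * bits_per_channel / 8
    return scaled_clock * 3 * 10 / 1e3

print(tmds_gbps(594, 8))   # ~17.8 Gb/s: just fits inside HDMI 2.0
print(tmds_gbps(594, 10))  # ~22.3 Gb/s: exceeds HDMI 2.0's 18 Gb/s
```

At 8 bits the signal squeaks in under the 18Gb/s ceiling; bump to 10 bits and it overshoots, which is exactly the limitation described above.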
On the DisplayPort side, things are somewhat more accommodating. Version 1.2 (the current one) can pass a 3840x2160p/60 signal with 10-bit RGB (4:4:4) color, but nothing more. The newest DP version, 1.3, raises its maximum speed to 32.4Gb/s, which makes imaging 12-bit and even 16-bit 4K HDR content possible. However, version 1.4 is required to recognize the HDR "flags" that travel with the content and must be passed on to the display. (HDMI handles HDR through lettered spec extensions: version 2.0a adds support for static HDR metadata, while 2.0b adds the hybrid log-gamma (HLG) HDR format.)
Marketing folks have a field day confusing people with new display tech and, apparently, they're going to town with HDR. We're now hearing about "HDR-compatible" products, particularly in signal interfacing. Nothing to see here, folks: If the signal distribution and switching equipment is fast enough to pass the required clock rate and hands over HDR metadata (CTA-861.3) to the display without alteration, then it is, indeed, "HDR compatible." Simple as that.
I return to my original question: Will HDR have an impact on our industry? The answer is an emphatic “yes!” There are many customers that would realize a benefit from HDR imaging: Medical, surveillance, military, research, virtual reality and simulation verticals will embrace it pretty quickly, and others will follow.
Unlike 3D, HDR is here to stay. I’ll have more to say about it in upcoming columns after I walk the NAB show floor this month. Stay tuned!