We’re halfway through 2016 (and more than halfway through the decade), and it’s just mind-boggling at times how much display technology has evolved and changed since 2010…and even since January.
Six years ago, we were still debating the merits of liquid crystal display (LCD) imaging versus plasma in monitors and consumer TVs. Today? Plasma is dead. Buried. A footnote from the past. And LCD technology has captured more than 99% of the worldwide television market.
In 2012, the first Ultra HD (3840×2160 pixels) TVs made an appearance on our shores. They were big (84 inches) and expensive. Not quite four years later, we’ve seen a remarkable drop in prices as 4K becomes mainstream in the home, and soon, in the commercial AV space. How about a 55-inch 4K set with support for high dynamic range that costs all of $700? It’s here, thanks to Hisense.
Guess what? If you’re just starting to become aware of 4K, you’ve already missed the train. Now, we’re seeing high dynamic range (HDR), wider color gamuts and fast frame rates coming to displays. And you can finally buy organic light-emitting diode (OLED) TVs (curved and flat) with 4K resolution and high dynamic range support.
Oh, wait! Did I forget to mention that there’s a new HDMI standard (version 2.0a) for supporting HDR? And a souped-up version of DisplayPort (v1.4)? And a way to compress display interface signals (Display Stream Compression)? And ways to transmit 4K content over 60GHz wireless connections to monitors and TVs?
I can just see you reaching for a bottle of Advil right now. How is it possible for technology to change this rapidly and get to market so quickly? How can anyone possibly keep up with all these changes? (Did I mention you can now buy 5K computer monitors? And at least one company will have an 8K TV for sale by the end of this year? NOOOOOOOO!!!!!)
Yep, it’s absolutely insane how fast everything is changing in the world of display tech. So I won’t try to give you a detailed update on where everything stands as of late May, when I’m writing this column. (And guess what’s taking place this week? The Society for Information Display’s DisplayWeek trade show and technical exhibits. Great… more advances in display tech to write about. Groan….)
For this month, I’m going to take a deeper dive into a fascinating new way to “illuminate” a display: quantum dots (QD). (Not to be confused with Dippin’ Dots, which you can find at any ballpark and carnival. And which taste a whole lot better than quantum dots!)
I’ve mentioned QDs before in my columns. These super-tiny microcrystalline structures are compounds of metals like cadmium and indium, bonded to selenium and phosphorus. When you bombard these “dots” with high-energy photons, they absorb that energy and re-emit photons of their own at a longer, precisely defined wavelength; hence, the quantum effect.
If you need an analogous product, think of the phosphors in CRT and plasma displays. Both required an intense source of energy to “light up” and create color: high-voltage electron beams in the case of CRTs, and ultraviolet light from a gas discharge in the case of plasma.
The size of the quantum dot determines the wavelength of the color it emits, and that color is spectrally pure, with a bandwidth of just 25 to 40 nanometers. Even better, the luminous intensity of the emitted color keeps climbing as the stimulus energy increases. That’s an area where phosphor-based and OLED displays have always struggled: certain colors, like dark blue, tend to fade out over time when driven hard.
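That size-to-color relationship can be sketched with the Brus effective-mass approximation, a standard first-order model of quantum confinement. This is purely illustrative: the CdSe material constants below are approximate textbook values, not figures from this column, and real dots shift somewhat from the model.

```python
# Illustrative sketch: estimating peak emission wavelength of a CdSe
# quantum dot from its radius, via the Brus effective-mass approximation.
# Material constants are approximate literature values (assumptions).
import math

H = 6.62607e-34        # Planck constant, J*s
HBAR = H / (2 * math.pi)
C = 2.998e8            # speed of light, m/s
E_CHARGE = 1.602e-19   # elementary charge, C
M0 = 9.109e-31         # electron rest mass, kg
EPS0 = 8.854e-12       # vacuum permittivity, F/m

E_GAP_BULK = 1.74 * E_CHARGE      # CdSe bulk band gap, J
M_E, M_H = 0.13 * M0, 0.45 * M0   # effective electron/hole masses
EPS_R = 10.6                      # relative permittivity of CdSe

def emission_wavelength_nm(radius_nm: float) -> float:
    """Estimated peak emission wavelength for a CdSe dot of given radius."""
    r = radius_nm * 1e-9
    # Confinement term grows as 1/r^2: smaller dot -> higher energy -> bluer.
    confinement = (HBAR**2 * math.pi**2 / (2 * r**2)) * (1 / M_E + 1 / M_H)
    # First-order electron-hole Coulomb attraction, shrinks the gap slightly.
    coulomb = 1.8 * E_CHARGE**2 / (4 * math.pi * EPS0 * EPS_R * r)
    energy = E_GAP_BULK + confinement - coulomb
    return H * C / energy * 1e9

for radius in (1.7, 2.5, 3.5):
    print(f"radius {radius} nm -> ~{emission_wavelength_nm(radius):.0f} nm")
```

Run it and the trend the column describes falls out: roughly 1.7nm-radius dots land in the blue, 2.5nm in the green and 3.5nm in the red, so a single material tuned only by particle size covers the visible gamut.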
The first quantum dot particles were discovered in 1981 and were officially named as such in 1988. Quite a few companies have been playing around with quantum dots for more than a decade, but there was no mass-market application for them until recently: the previously mentioned high dynamic range imaging technique.
Conventional LCD displays use an array of white light-emitting diodes in either an edge configuration with an optical waveguide, or in a direct-illumination architecture (also known as “full array” backlighting).
Considering that the imaging stack in LCD panels blocks all but about 5% to 7% of the backlight, it’s pretty amazing that an LCD TV can pump out 300 and even 400cd/m2 of light with a full white image. As bright as that may seem, it’s not enough to faithfully reproduce a high dynamic range signal that has captured 15 to 20 stops of light.
Enter the dot! Enterprising engineers at companies like Samsung, Sony, TCL, Hisense and even Dolby have created illumination systems that pair arrays of blue light-emitting diodes (supplying both the blue primary and the photon energy to excite the dots) with red and green quantum dots. Adding all of that horsepower to the display doubles full white brightness to more than 800cd/m2, with small-area 100% white patterns measuring 1000cd/m2.
Think of it: Not that long ago, a CRT display with a full white brightness measurement of 100cd/m2 (29 foot-Lamberts) was considered bright. But a $1200 Samsung HDR LCD TV will crank that up by a factor of 10. And Dolby has shown a reference-grade HDR LCD monitor with a turbocharged QD backlight that can hit 4000cd/m2. That’s comparable to the brightness of an outdoor LED display you might see in Times Square, or at a football game. (Keep all combustible materials a safe distance away from this screen!)
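For readers who still think in footlamberts, the conversion used above is simple: one footlambert is 1/π candela per square foot, or about 3.4263cd/m2.

```python
# Converting candelas per square meter (nits) to footlamberts.
# 1 fL = (1/pi) cd/ft^2 ~= 3.4263 cd/m2 (standard photometric conversion).
CDM2_PER_FL = 3.4263

def to_footlamberts(nits: float) -> float:
    """Convert a luminance in cd/m2 to footlamberts."""
    return nits / CDM2_PER_FL

# The brightness milestones mentioned in the text:
for nits in (100, 1000, 4000):
    print(f"{nits} cd/m2 = {to_footlamberts(nits):.0f} fL")
```

That first line reproduces the figure in the paragraph above: 100cd/m2 works out to about 29 footlamberts, while Dolby’s 4000cd/m2 reference monitor comes in near 1170.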
Not only are there numerous companies manufacturing quantum dots, there are also different approaches to installing them in the light path. QD Vision of Boston has been manufacturing QD “light pipes” for edge-illumination architecture in super-thin LCD TVs. Nanosys, based in Sunnyvale, CA, has partnered with 3M to manufacture a quantum dot enhancement film that sits in the LCD panel stack. And Dow Chemical recently opened a factory in Korea to supply quantum dots to TV makers.
There have even been some attempts to build purely emissive displays using quantum dots as the imaging pixel. At one time, QD Vision showed discrete red, green and blue “Q-LEDs” that the company said could be built into an emissive display that would run circles around OLEDs. (It would be plenty expensive, too.)
Here’s another cool thing about quantum dots (besides the fact that they don’t melt like Dippin’ Dots): They’re quite energy-efficient. At the 2016 ICES, QD Vision exhibited three 55-inch Ultra HDTVs side-by-side (an LCD set using white LEDs, an OLED model and an LCD TV equipped with quantum dots) set to the same brightness level and showing the same content.
The white LED model and the OLED drew about the same power on one series of images, around 170 to 180 watts. But the quantum-dot-equipped model was drawing just 113 watts of electricity. That’s a story we need to hear more about, especially when you consider that 50-inch plasma TVs from a decade ago drew more than 600 watts of electricity!
Now, the catch (and there’s always a catch): Quantum dot formulations can contain potentially toxic metals. One common formulation used for green is cadmium selenide (CdSe), and, although it produces a dazzlingly bright, saturated green, the European Union’s Restriction of Hazardous Substances (RoHS) lists cadmium as a no-no. Selenium’s not so popular, either.
Another formulation, one used by Samsung in its QD-equipped TVs, is indium phosphide (InP). Apparently, it is not considered to be a toxic compound, although indium is definitely a metal. By using InP quantum dots, Samsung can claim its displays are “green,” i.e., environmentally friendly. (Indium tin oxide, or ITO, is commonly used for electrodes in displays, too.)
At SID (via my colleague Chris Chinnock at Insight Media), Nanosys showed a display that used indium phosphide to generate red and a new formulation of cadmium selenide for green that meets the RoHS limit of 100 parts per million. Hitachi Chemical will become a manufacturing partner with Nanosys for its quantum dot enhancement film, with the goal of driving down prices for QD-equipped LCD TVs.
Summing up: Quantum dots are here to stay, and they’re a timely solution to the challenges of displaying HDR content and wide color gamuts on LCD TVs. But it’s still early in the game, and we haven’t begun to see the full capabilities of these intriguing microcrystalline compounds.
Coming to your next TV? It’s pretty likely.…