I’m writing this upon my return from the NAB 2016 show, and you’ll be reading this column right around the time I get back from Las Vegas for InfoComm16.
NAB has evolved so much from its original broadcast-centric self to an “anything goes” show that it’s hard to remember a time when transmitters, antennas and big studio cameras dominated the show floor. Today, we have streaming media, video over IP, cloud storage, and managed software and services as the focus of the show (not to mention startups, apps and the emerging ATSC 3.0 digital broadcast standard).
And then there are the pavilions. Trade shows love setting up pavilions to showcase a hot technology or trend. Sometimes they’re a bit premature: In 1999, the show featured an enormous “streaming media” area in the central hall of the Las Vegas Convention Center stuffed full of startup companies showing postage-stamp-sized video, streaming over DSL and dial-up connections. All of those companies were gone a year later.
We’ve also seen pavilions for 3D (and we all know how that worked out), HDTV (a keeper!) and ATSC Mobile broadcasting (never caught on). So it was no surprise this year that the north hall had a pavilion dedicated to virtual reality (VR) and augmented reality (AR).
Not clear on the difference? VR presents a totally electronic “pseudo” view of the world, which can be represented by custom video clips or generated by computer graphics. AR takes real-world views and overlays text, graphics and other picture elements to “augment” your experience.
Google Glass is a good example of augmented reality: You’d walk down the street and graphics would appear in the near-to-eye display, showing you the location of a restaurant, displaying a text message or alerting you to a phone call. Oculus Rift and Samsung Gear VR are good examples of virtual reality, immersing your eyes and ears in imaginary worlds with large headsets and earphones.
I’ve tried VR and AR systems a few times, and their current state of the art reminds me a lot of the ill-fated companies in the streaming media pavilion from 1999. Yes, the eyewear works, but it’s heavy and quite bulky. The multichannel spatial audio is impressive, too, but I have to strap headphones over those enormous headsets.
The big problem with VR and AR right now is the headset. Gear VR and other systems use your smartphone as a stereo display (you can do the same thing with a simple cardboard viewer), but the resolution of your smartphone’s display simply isn’t fine enough to work in a near-to-eye application.
Those readers who’ve logged a few years in the AV industry will recall the term “screen door effect.” This was the picture artifact created when viewing large images from VGA-resolution LCD projectors of the 1990s. Although 640×480 pixel arrays sounded impressive on paper, when you actually saw them on an 82-inch screen, well, it appeared you were looking through a screen door.
And that’s exactly what I’ve observed with both smartphone-based and dedicated VR near-to-eye displays: Their resolution is just too coarse to watch video for very long. A smartphone with 1920×1080 pixels or even 2560×1440 pixels sounds like overkill when you’re staring at it from 12 inches away, but neither presents enough resolution for VR applications.
We’ve seen some new 4K smartphones, like Sony’s Xperia Z5 Premium (shown at CES), and we just laugh: Why would anyone stuff so many pixels into such a small screen? Funny how your perspective on that argument changes immediately when you use them in a near-to-eye display system. Yep, even 4K resolution isn’t enough for VR; 8K is probably more appropriate.
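The resolution argument comes down to simple arithmetic: spread a display across a headset’s wide field of view and count how many pixels land in each degree of vision. Here’s a back-of-the-envelope sketch in Python; the 100-degree field of view and the roughly 60-pixels-per-degree acuity figure are my illustrative assumptions, not numbers from any particular headset.

```python
# Rough pixels-per-degree math for near-to-eye displays.
# Assumptions (illustrative, not tied to any specific product):
#   - about a 100-degree horizontal field of view per eye
#   - human visual acuity topping out near 60 pixels per degree

FOV_DEG = 100      # assumed horizontal field of view, in degrees
RETINA_PPD = 60    # rough pixels-per-degree limit of human acuity

def pixels_per_degree(horizontal_pixels: int, fov_deg: float = FOV_DEG) -> float:
    """Average horizontal pixels per degree across the field of view."""
    return horizontal_pixels / fov_deg

for name, h_pixels in [("1080p", 1920), ("QHD", 2560), ("4K", 3840), ("8K", 7680)]:
    ppd = pixels_per_degree(h_pixels)
    print(f"{name:>5}: {ppd:5.1f} px/deg ({ppd / RETINA_PPD:.0%} of ~retina acuity)")
```

Under these assumptions, a 1080p panel delivers only about 19 pixels per degree, and even 4K lands around 38; only something in 8K territory approaches the acuity limit. (Smartphone-based viewers fare worse still, since one panel is split between two eyes.)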
Some other issues with AR have to do with spatial disorientation. Even though you’re wearing a headset that’s showing you a real-time view from a special multi-lens camera, the camera’s perspective and your perspective don’t match. It’s quite confusing for your brain to look in one direction and see the world from a position where you know you aren’t standing.
This disconnect between eyesight and other cues (hearing, touch) can be quite disconcerting and, in some cases, can induce a mild form of nausea. A marketing manager for a VR/AR manufacturer told me that he could not use his company’s Oculus-based headset and display for that very reason.
After you wear a VR/AR headset for a while and then stand up and take it off, you may find your sense of balance is also out of whack, leaving you momentarily unsteady on your feet. That’s another example of a spatial disorientation problem caused by the disconnect between eyesight and other senses.
If some of these problems sound familiar, they should. We heard much the same thing during the latest incarnation of 3D from 2008 to 2012, particularly from people wearing active-shutter 3D glasses. During the rollout of 3D, it became apparent that as much as 25% of the general population could not view 3D correctly because of eye disorders, spatial disorientation, incompatibility with contact lenses and other problems.
None of this is to say that VR and AR are passing fads and will soon fade into oblivion. Indeed, there are verticals that have been using VR viewing for some time with moderate success. I recall visiting a large VR simulator more than a decade ago that simulated urban combat and required numerous high-brightness projectors, a special 3D headset, multiple channels of audio and a special tracking surface to walk on. (Also a wall of Linux-based Silicon Graphics workstations in the next room!)
There’s no question that lots of money is being thrown at VR and AR startups. The question is: to what end? An article in the New York Times from last December asked that very question, citing what appeared to be yet-unrealized market demand for VR and AR. And, of course, that raised the inevitable question about whether VR and AR were “the next 3D” and doomed to fail.
Hmm…a fair question! Some casual inquiries of show attendees who had tried on the headsets and headphones in the various VR/AR booths and pavilions revealed a majority who thought it was fun the first time around, but didn’t see any real need for it beyond being a novelty.
It may turn out that the best path to VR is larger, super-high-resolution projected or direct-view images that encompass your field of view, a la IMAX. No special eyewear would be required, and we’ve already heard demos of multi-channel spatial sound (Dolby’s Atmos is a good example).
AR is a different beast, though. How do you add graphics and picture elements to a real-time view of the world without requiring some sort of supplemental eyewear? Google Glass was a start, but is there something better than those ski-goggle-like Oculus Rift headsets? Will we eventually see corneal implants to provide AR data?
Keep an eye on this market (pun intended). It’s still very much in flux, and I’m sure you saw a few demos of VR and AR at InfoComm. But from my perspective, it’s not gonna take off until we resolve the eyewear issues and increase image resolution by several orders of magnitude.
(Of course, I could just be a grouchy old veteran of the AV industry who just doesn’t “get it”….)