The dream of holographic video has long been a staple of science fiction—the image of Princess Leia beamed from R2-D2 in Star Wars, the holodeck in Star Trek: The Next Generation, and the shark that pops out at Marty McFly in Back to the Future II are just three examples. Well, that fantasy is now poised to become reality, but without breaking the laws of physics. A Silicon Valley startup called Light Field Lab has developed the world’s first truly holographic digital-display technology, which I was privileged to see for myself during a visit to the company’s offices this week.
Before I tell you what it is, let me tell you what it isn’t. It’s not the technology used to resurrect Tupac Shakur on the concert stage at Coachella in 2012; that’s a 160-year-old effect called Pepper’s Ghost that literally uses smoke and mirrors to reflect 2D floating images. And it’s certainly not autostereoscopic 3D, which companies such as Samsung, Sony, Looking Glass, and Stream TV have demonstrated for years. That technology presents separate 2D images to each eye—as long as you’re in the right spot—and sometimes causes dizziness and nausea due to something called vergence-accommodation conflict, which can also plague glasses-based stereoscopic images and head-mounted displays.
Founded in 2017, Light Field Lab has developed a technology it calls SolidLight, which replicates exactly how light behaves in the real world. When you look at an object in the real world, light from the sun or other source reflects from the object at many different points in many different directions, some of which enters your eyes (see Fig. 1a). As you move around the object, different light rays—or, more properly, wavefronts—enter your eyes, and you see different perspectives. Also, objects behind it are blocked from view (occluded) differently.
If you could generate all those light wavefronts radiating out in many different directions from many different locations without a physical object, you would create a true holographic recreation of that object (see Fig. 1b). This is exactly what SolidLight does. Conventional stereoscopic images can’t do it—all the light is coming from one plane. Even if each eye is presented with a slightly displaced 2D vantage point, it’s images, not objects—not a true hologram by any means.
The problems that Light Field Lab has solved include scale, density, and computation. To effectively form a true holographic object, you need to generate and control the direction and amplitude of tens or hundreds of billions of wavefronts, which correspond to pixels on a 2D display. Keep in mind that a 4K display has 8.3 million pixels, while a state-of-the-art 8K display has 33 million pixels; compare that with the 10 billion pixels per square meter generated by SolidLight!
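Those pixel counts are easy to verify with quick back-of-the-envelope arithmetic. A minimal sketch, assuming the standard UHD resolutions (4K = 3840×2160, 8K = 7680×4320) and taking the 10-billion-per-square-meter figure as the company's stated density:

```python
# Back-of-the-envelope pixel counts: conventional displays vs. SolidLight.
# Resolutions assumed: 4K UHD = 3840x2160, 8K UHD = 7680x4320.

pixels_4k = 3840 * 2160             # 8,294,400 (~8.3 million)
pixels_8k = 7680 * 4320             # 33,177,600 (~33 million)
solidlight_per_m2 = 10_000_000_000  # company's stated density

print(f"4K: {pixels_4k:,} pixels")
print(f"8K: {pixels_8k:,} pixels")
# Roughly 300 full 8K displays' worth of pixels per square meter:
print(f"~{solidlight_per_m2 / pixels_8k:.0f}x an 8K display per square meter")
```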
It’s important to understand that all the wavefronts for a scene are present at once, and your eyes can focus on any point in the scene at any time, just like in real life. Areas you don’t focus on exhibit retinal blur—again, just as in real life. All the attributes of light in the real world—reflection, refraction, diffraction, etc.—are faithfully reproduced. And most importantly, no special glasses, head tracking, or other accessories are required.
How does Light Field Lab do it? With technology that has already generated over 300 patent filings. The hardware is a modular SolidLight Surface video wall with three basic components. The first layer is a silicon-based phased array of light-emitting devices in what the company calls a “nanoparticle polymer.” Company reps would not specify exactly what these devices are; apparently, they aren’t LEDs, and they are packed with much higher density than even a microLED panel. This layer also includes all the electronics that provide some serious processing power.
The middle layer is a modulated-amplitude plane, which, as the name implies, modulates the amplitude of light wavefronts from the photonic array under the control of the electronics. Like the photonic array, the array of light modulators consists of individually controllable sub-micron nanoparticles. This layer conditions the wavefronts for the final layer; if you were to look at the light output from the modulated-amplitude plane, it would just be a bunch of noise.
Finally, the PhaseGuide modulation plane is a static optical layer designed in coordination with the other layers to bring projected objects into focus. The precise optical properties of these components can be configured by the customer—for example, do they want the objects to be two feet in front of the screen, behind the screen, or somewhere else?
The basic building block is a bezel-less submodule measuring over 6×4 inches with a resolution of 16,000×10,000 pixels—that's 160 million pixels in about 27 square inches! Fifteen submodules are fused to form a square modular panel measuring about half a meter on a side with a staggering 2.5 billion pixels, which translates to 10 billion pixels per square meter. Multiple panels are then tiled to create a SolidLight Surface video wall of just about any size, like any other modular LED video-wall system. According to the company, there is no problem with precision placement of light-emitting elements as there is in microLED displays; alignment and calibration are done in software.
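The panel figures above hang together arithmetically. A quick check, using the article's approximate dimensions (the exact 160-million product comes out to 2.4 billion per panel, which the company rounds to 2.5 billion; the panel is taken here as exactly 0.5 m on a side):

```python
# Per-submodule and per-panel pixel math for a SolidLight Surface panel,
# using the approximate dimensions given in the article.

sub_res_x, sub_res_y = 16_000, 10_000
sub_pixels = sub_res_x * sub_res_y      # 160,000,000 pixels per submodule

panel_pixels = 15 * sub_pixels          # fifteen fused submodules: 2.4e9
panel_area_m2 = 0.5 * 0.5               # "about half a meter on a side"

density = panel_pixels / panel_area_m2  # ~9.6e9, i.e. roughly 10 billion px/m^2
print(f"{sub_pixels:,} px/submodule, {panel_pixels:,} px/panel, {density:,.0f} px/m^2")
```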
Just about any 3D-scene data, whether from CGI systems such as Unity, Unreal, Maya, and Blender or from anything else with depth information, can be rendered by SolidLight's WaveTracer software in real time without needing to be "pre-baked." The encoded, vectorized photonic signal is analogous to a Dolby Atmos vectorized spatial-audio signal. The system can even synthesize depth info from 2D images.
In common video terms, SolidLight can display 10-bit WCG (wide color gamut) at 60 frames per second. It can operate at higher frame rates if that becomes a priority for Light Field Lab's customers, and it can also use slower rates to induce motion blur for artistic effect.
The demo itself was most impressive. One 28-inch (diagonal) panel was mounted behind a bookcase and surrounded with physical plastic plants. A holographic chameleon, affectionately called Cammie, moved slowly along a branch and changed color. I could walk around in front of the reptile, and the hologram completely occluded the real-world plants behind it, just as if it were a real, solid object. (The plants behind Cammie were actually hidden off to the side, and their image was "relayed" into the demo area and combined with that of the chameleon. They couldn't be where they seemed to be physically, because that would have blocked the light from the panel.)
I was invited to reach out and “touch” Cammie, but of course, I couldn’t, because there was nothing physical there. Holding a magnifying glass up to the object looked exactly like it would with a physical chameleon. At the sound of a buzzing fly, Cammie’s tongue shot out, and one of the physical plants in front of it actually shook in response thanks to a synchronized actuator—very clever!
The system was configured to present objects within a 14-inch-diagonal by 6-inch-deep volume about two feet in front of the panel itself, with a viewing angle of at least 100 degrees and a target viewing distance of about three feet. These parameters are all balanced within a total "photon budget"; you can increase one by decreasing others, or you can increase the total photon budget by adding more SolidLight Surface panels. Also, the object brightness was calibrated to 100-200 nits for indoor applications, though it could be much brighter with increased voltage and decreased lifespan, as with any other emissive display.
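Light Field Lab has not published how its photon budget is actually computed, but the trade-off it describes can be illustrated with a toy model in which the budget scales with panel count and is "spent" across volume, viewing angle, and brightness. Every number and the function below are invented for illustration only:

```python
# Purely illustrative toy model of a "photon budget" trade-off.
# Assumption (not from Light Field Lab): the budget scales linearly with
# panel count, and projection volume, viewing angle, and brightness trade
# off against one another within it. All units are arbitrary.

def max_brightness_nits(budget_units, volume_liters, viewing_angle_deg):
    """Hypothetical: brightness left over after 'spending' the budget on
    projection volume and viewing angle."""
    return budget_units / (volume_liters * viewing_angle_deg)

budget = 1_000_000  # arbitrary units for one panel array

print(max_brightness_nits(budget, volume_liters=50, viewing_angle_deg=100))      # 200.0
print(max_brightness_nits(budget, volume_liters=100, viewing_angle_deg=100))     # doubling volume halves it
print(max_brightness_nits(2 * budget, volume_liters=100, viewing_angle_deg=100)) # more panels restore it
```

The point of the sketch is only the shape of the trade: enlarging any one parameter shrinks the others unless more panels enlarge the budget itself.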
I also saw another demo that I can’t say much about because of an NDA (non-disclosure agreement). I can say it was an interactive holographic object that responded to my movements in real time, which was truly amazing!
When can you buy one?
The first applications will likely be public-entertainment venues such as museums, corporate video walls, and broadcast/virtual-production facilities, which we could see in as little as one to three years. In fact, the company has already pre-sold its entire first production run. Next in the product-development roadmap are tabletop displays for product visualization and gaming. Then, a telepresence wall would allow remote colleagues to interact with the same holographic object. Finally, we could see consumer video walls for home installation in the not-too-distant future.
And the cost? Light Field Lab says it's competitive with large, premium 8K microLED video walls; a large Sony Crystal LED wall will easily set you back seven figures. Clearly, SolidLight won't be within reach of most consumers for quite a while, but institutional customers could drop that kind of money for something so extraordinary.
In fact, the financing for such an ambitious project has come from the likes of Comcast/NBCUniversal, Samsung, Verizon, Bosch, and others—key stakeholders in the business of display technology, manufacturing, communication, and media who obviously believe this is the Next Big Thing. I can’t wait to see SolidLight technology deployed in a commercial venue, and I look forward to following its progress with great interest.