
Stanford engineers develop compact 3D holographic glasses suitable for all-day wear

A near-eye display design that pairs inverse-designed metasurface waveguides with AI-driven holographic displays to enable full-colour 3D augmented reality from a compact glasses-like form factor. (CREDIT: Andrew Brodhead)

Researchers in spatial computing have unveiled a groundbreaking augmented reality (AR) headset that integrates holographic imaging into a sleek, everyday glasses form. Unlike bulky AR headsets currently on the market, this new prototype provides a high-quality 3D viewing experience in a compact, comfortable design suitable for all-day wear.


“Our headset looks like an ordinary pair of glasses to the outside world, but through the lenses, the wearer sees an enriched world overlaid with vibrant, full-color 3D computed imagery,” explained Gordon Wetzstein, an associate professor of electrical engineering at Stanford University and an expert in spatial computing. Wetzstein and his team presented their innovative device in a recent paper published in Nature.


 
 

Although still a prototype, this technology promises to revolutionize various fields, from gaming and entertainment to education and training. Manu Gopakumar, a doctoral student in Wetzstein's lab and co-first author of the paper, highlighted potential applications: “One could imagine a surgeon wearing such glasses to plan a delicate or complex surgery, or an airplane mechanic using them to learn to work on the latest jet engine.”




Overcoming Technical Barriers


This AR headset represents a significant advancement, overcoming previous challenges that resulted in either cumbersome devices or subpar 3D experiences, often causing visual fatigue or nausea. “There is no other AR system with a comparable compact form factor or matching our 3D image quality,” stated Gun-Yeal Lee, a postdoctoral researcher at Stanford and co-first author of the paper.


 
 

The team's success lies in combining AI-enhanced holographic imaging with novel nanophotonic devices. Traditional AR headsets rely on complex optics through which users view a digitized approximation of the real world overlaid with computed imagery. This approach demands bulky hardware because of the magnifying lenses and the minimum distances required between the eye, the lenses, and the screens.


“Beyond bulkiness, these limitations can also lead to unsatisfactory perceptual realism and, often, visual discomfort,” added Suyeon Choi, a doctoral student and co-author of the paper.


 


 

Leapfrogging Traditional Approaches


To achieve a more visually satisfying 3D experience, Wetzstein's team bypassed traditional stereoscopic methods in favor of holography, a technique invented in the late 1940s whose inventor, Dennis Gabor, later received the Nobel Prize in Physics.


Despite its potential, holography has struggled with accurately portraying 3D depth cues, often resulting in an underwhelming and sometimes nausea-inducing visual experience.


 
 

The team used AI to enhance the depth cues in holographic images. Advances in nanophotonics and waveguide display technologies enabled them to project computed holograms onto the lenses without bulky additional optics.
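The paper pairs learned propagation models with camera-calibrated holography; the details are beyond a news summary. As a much simpler illustration of how a phase-only hologram can be computed at all, here is the classic Gerchberg-Saxton loop in NumPy, with a hypothetical square target. This is a textbook baseline, not the team's AI-driven method:

```python
import numpy as np

def gerchberg_saxton(target_amp, iterations=50, seed=0):
    """Iteratively estimate a phase-only hologram whose far field matches target_amp.

    target_amp: desired amplitude pattern in the Fourier (image) plane.
    Returns the phase pattern to display on a phase-only modulator.
    """
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
    for _ in range(iterations):
        # Propagate the unit-amplitude, phase-modulated field to the image plane.
        far = np.fft.fft2(np.exp(1j * phase))
        # Keep the propagated phase but impose the target amplitude.
        far = target_amp * np.exp(1j * np.angle(far))
        # Propagate back and re-impose the phase-only constraint.
        phase = np.angle(np.fft.ifft2(far))
    return phase
```

Each iteration alternates between the hologram plane (where only phase can be controlled) and the image plane (where the target amplitude is enforced), so the reconstruction error can only shrink or stay put.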


Researchers compare video holograms of the same scene produced with a conventional free-space propagation model and with the team's learned physical waveguide model. (CREDIT: Stanford Computational Research Lab)
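The "conventional free-space propagation model" in the caption is commonly implemented as the angular spectrum method, which propagates a complex optical field between parallel planes via a Fourier-domain transfer function. A minimal NumPy sketch of that generic textbook model (not the paper's learned waveguide model) might look like:

```python
import numpy as np

def angular_spectrum_propagation(field, wavelength, pitch, distance):
    """Propagate a complex field by `distance` using the angular spectrum method.

    field: 2D complex array sampled on a grid with spacing `pitch` (meters).
    wavelength, distance: in meters.
    """
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Squared z-component of the wave vector (up to 2*pi); negative values
    # correspond to evanescent waves, which are zeroed out below.
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    H = np.where(arg > 0,
                 np.exp(1j * 2.0 * np.pi * distance * np.sqrt(np.maximum(arg, 0.0))),
                 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Free-space models like this ignore everything the etched waveguide does to the light, which is why the team had to learn a physically calibrated waveguide model instead.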


Waveguides, created by etching nanometer-scale patterns onto the lens surface, play a crucial role. Small holographic displays mounted at each temple project computed imagery through these etched patterns, bouncing light within the lens before delivering it directly to the viewer’s eye. This allows users to see both the real world and full-color, 3D computed images simultaneously.
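The "bouncing light within the lens" is total internal reflection, set up by a diffractive in-coupler: the etched pattern must bend incoming light steeply enough that it cannot escape the glass until the out-coupler releases it toward the eye. As a back-of-envelope illustration using the textbook grating equation with assumed numbers (the actual device uses an inverse-designed metasurface, not a simple grating):

```python
import math

def incoupled_angle_deg(wavelength, period, n_glass, incidence_deg=0.0):
    """First-order diffraction angle inside the waveguide substrate (degrees).

    Grating equation: n_glass * sin(theta_d) = sin(theta_i) + wavelength / period.
    Returns None if the first order is evanescent inside the glass.
    """
    s = (math.sin(math.radians(incidence_deg)) + wavelength / period) / n_glass
    if abs(s) >= 1.0:
        return None
    return math.degrees(math.asin(s))

def is_guided(wavelength, period, n_glass, incidence_deg=0.0):
    """True if the first order undergoes total internal reflection at the glass-air surface."""
    angle = incoupled_angle_deg(wavelength, period, n_glass, incidence_deg)
    if angle is None:
        return False
    critical = math.degrees(math.asin(1.0 / n_glass))
    return angle > critical
```

For green light (532 nm) at normal incidence on a hypothetical 400 nm grating in glass of index 1.8, the first order lands near 48 degrees, beyond the roughly 34-degree critical angle, so the light stays trapped in the lens until it is coupled out.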


 
 

Using nanophotonic technologies called metasurface optics, the researchers designed a novel waveguide that can relay 3D hologram information of RGB visible light into a single compact device with high transparency. These nanophotonic waveguide samples were fabricated in-house at the Stanford Nanofabrication Facility and Stanford Nano Shared Facilities.




Enhanced 3D Experience


The enhanced 3D effect results from combining stereoscopic imaging, where each eye sees a slightly different image, with holography. “With holography, you also get the full 3D volume in front of each eye, increasing the life-like 3D image quality,” said Brian Chao, another doctoral student and co-author of the paper.


 
 

The culmination of new waveguide display techniques and improvements in holographic imaging delivers a true-to-life 3D visual experience that is both visually satisfying and devoid of the fatigue that plagued earlier approaches.



“Holographic displays have long been considered the ultimate 3D technique, but they’ve never quite achieved that big commercial breakthrough,” Wetzstein remarked. “Maybe now they have the killer app they’ve been waiting for all these years.”


 
 

This AR headset’s compact and comfortable design, combined with superior 3D image quality, represents a significant leap forward in the field of spatial computing, potentially paving the way for widespread adoption and new applications across various industries.






For more science news stories, check out our New Innovations section at The Brighter Side of News.


 

Note: Materials provided above by The Brighter Side of News. Content may be edited for style and length.


 
 



 
