Holographic AR Glasses with Metasurface Waveguides | Nature 2024

Manu Gopakumar*, Gun-Yeal Lee*, Suyeon Choi, Brian Chao, Yifan Peng, Jonghyun Kim, Gordon Wetzstein

A near-eye display design that pairs inverse-designed metasurface waveguides with AI-driven holographic displays to enable full-colour 3D augmented reality from a compact glasses-like form factor.

Photo by Andrew Brodhead

ABSTRACT

Emerging spatial computing systems seamlessly superimpose digital information on the physical environment observed by a user, enabling transformative experiences across various domains, such as entertainment, education, communication and training. However, the widespread adoption of augmented-reality (AR) displays has been limited due to the bulky projection optics of their light engines and their inability to accurately portray three-dimensional (3D) depth cues for virtual content, among other factors. Here we introduce a holographic AR system that overcomes these challenges using a unique combination of inverse-designed full-colour metasurface gratings, a compact dispersion-compensating waveguide geometry and artificial-intelligence-driven holography algorithms. These elements are co-designed to eliminate the need for bulky collimation optics between the spatial light modulator and the waveguide and to present vibrant, full-colour, 3D AR content in a compact device form factor. To deliver unprecedented visual quality with our prototype, we develop an innovative image formation model that combines a physically accurate waveguide model with learned components that are automatically calibrated using camera feedback. Our unique co-design of a nanophotonic metasurface waveguide and artificial-intelligence-driven holographic algorithms represents a significant advancement in creating visually compelling 3D AR experiences in a compact wearable device.

COMPARING AR GLASSES DESIGNS

Conventional AR glasses use amplitude SLMs, such as organic light-emitting diode (OLED) or micro light-emitting diode (micro-LED) displays, which require a projector-based light engine that is typically at least as thick as the focal length of the projection lens. Our holographic AR glasses instead use a phase-only SLM that can be mounted very close to the in-coupling grating, minimizing the device form factor. Additionally, unlike conventional AR glasses, our holographic design can present full 3D depth cues for virtual content.

INVERSE-DESIGNED METASURFACE WAVEGUIDE

The geometry of our waveguide is designed to accommodate the different propagation angles of the red, green and blue wavefronts. Specifically, we constrain the geometry to maintain the independent degrees of freedom needed for full-colour 3D holograms by coupling out a single copy of the propagating wavefront for each of the three colours.

Our metasurface gratings have optimized nanoscale features, shown on the left, that efficiently diffract light into and out of the waveguide. The period (Λ) and height (H) of the metasurfaces are 384 nm and 220 nm, respectively. The normalized magnetic-field maps, shown on the right, illustrate diffraction by the metasurface gratings at the out-coupler for red (638 nm), green (521 nm) and blue (445 nm) wavelengths. The black arrows indicate the wave vectors of the incident and diffracted light. Scale bar: 400 nm.
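To make the dispersion challenge concrete, the short Python sketch below evaluates the first-order grating equation, n·sin(θ) = λ/Λ, for the published period and wavelengths, and estimates how far each colour travels between bounces. The refractive index (n = 1.8) and waveguide thickness (t = 1 mm) are illustrative assumptions, not values reported here.

import numpy as np

# First-order grating equation at normal incidence from air:
#   n * sin(theta) = wavelength / period
# n and t_mm below are ASSUMED illustrative values, not from the paper.
n = 1.8            # assumed refractive index of a high-index waveguide
t_mm = 1.0         # assumed waveguide thickness in millimetres
period_nm = 384.0  # metasurface grating period (reported above)

theta_c = np.degrees(np.arcsin(1.0 / n))  # TIR critical angle at glass/air

for colour, wl_nm in [("red", 638.0), ("green", 521.0), ("blue", 445.0)]:
    theta = np.degrees(np.arcsin(wl_nm / (period_nm * n)))
    hop_mm = 2.0 * t_mm * np.tan(np.radians(theta))  # lateral travel per bounce
    print(f"{colour:5s}: theta = {theta:4.1f} deg "
          f"(critical angle {theta_c:4.1f} deg), hop = {hop_mm:4.2f} mm")

Under these assumed values, all three colours diffract beyond the critical angle and are guided by total internal reflection, but the red wavefront propagates at a much steeper angle and travels nearly three times farther per bounce than the blue one. This wavelength-dependent walk-off is what the dispersion-compensating waveguide geometry must account for to out-couple a single copy of each colour's wavefront.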

Scanning electron microscope images demonstrate the successful fabrication of our metasurface design. Scale bars: 2 μm (left), 200 nm (right).

LEARNED PHYSICAL WAVEGUIDE MODEL

We construct a learned physical waveguide model that accurately predicts the output of our system by combining physical aspects of the waveguide (highlighted in green) with artificial-intelligence components that are learned from camera feedback (highlighted in orange). The analytically derived physical terms include the converging wavefront that illuminates the SLM and frequency-dependent coefficients that model propagation within the waveguide. The learned components include parameterized spatial maps that characterize variations in diffraction efficiency across our fabricated metasurface gratings, and convolutional neural networks (CNNs) that model content-dependent non-idealities such as the spatially varying non-linear response of the SLM. The model is fully differentiable, enabling simple gradient-descent computer-generated holography (CGH) algorithms to compute phase patterns for arbitrary target scenes at runtime.
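As a minimal illustration of this runtime optimization, the PyTorch sketch below runs gradient descent on an SLM phase pattern through a differentiable propagation operator. An idealized single-FFT Fourier propagation stands in for the full learned waveguide model, the target image is a placeholder, and all variable names are illustrative.

import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
H, W = 512, 512

# Placeholder target amplitude; the real system optimizes full 3D scenes.
target = torch.rand(H, W, device=device)
slm_phase = (0.1 * torch.randn(H, W, device=device)).requires_grad_()

def propagate(phase):
    # Idealized stand-in for the learned waveguide model: a phase-only
    # SLM field propagated to the Fourier plane by a single FFT.
    field = torch.exp(1j * phase)
    far_field = torch.fft.fftshift(torch.fft.fft2(field, norm="ortho"))
    return far_field.abs()

optimizer = torch.optim.Adam([slm_phase], lr=0.05)
for step in range(500):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(propagate(slm_phase), target)
    loss.backward()   # autograd differentiates through the whole model
    optimizer.step()

Because the actual waveguide model is built from the same differentiable primitives, swapping propagate() for that model lets the identical loop compute phase patterns that compensate for the measured behaviour of the physical hardware.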

 


Experimental Holograms

The following results are views of experimental holograms captured by a camera looking through a prototype implementation of our AR glasses design.

Here we compare video holograms of the same scene produced with a conventional free-space propagation model and with our learned physical waveguide model. The comparison illustrates how the combination of an analytical waveguide model and learned AI components is crucial for enabling high-quality, full-colour holograms.

This example illustrates how our AR glasses design can uniquely present full-colour 3D video holograms. Phase-modulation patterns displayed on the SLM, such as those shown on the left, produce a wavefront that propagates through the waveguide to reconstruct a view of a full 3D scene through the glasses, as shown on the right.

In this 3D AR scene, a virtual robot, Rubik’s Cube and bike are presented alongside a real toy duck and action figure. We sweep the camera focus across the different objects to highlight how 3D augmented-reality content can be presented at different depths through our waveguide.

Notably, the components of our holographic AR glasses required to produce these holograms fit into a wearable form factor, as illustrated by the 3D-printed prototype below:


CITATION

Gopakumar, M. et al. Full-colour 3D holographic augmented-reality displays with metasurface waveguides. Nature (2024).

BibTeX

@article{Gopakumar:2024:HolographicAR,
author = {Manu Gopakumar and Gun-Yeal Lee and Suyeon Choi and Brian Chao and Yifan Peng and Jonghyun Kim and Gordon Wetzstein},
title = {{Full-colour 3D holographic augmented-reality displays with metasurface waveguides}},
journal = {Nature},
year = {2024},
}

Related Projects

You may also be interested in related projects in which we developed AI algorithms to produce high-quality 2D and 3D holograms:

  • S. Choi et al. “Time-Multiplexed Neural Holography”, ACM SIGGRAPH, 2022 (link)
  • S. Choi et al. “Neural 3D Holography”, ACM SIGGRAPH Asia, 2021 (link)
  • Y. Peng et al. “Neural Holography”, ACM SIGGRAPH Asia, 2020 (link)

and holographic near-eye display designs for compact VR glasses:

  • J. Kim et al. “Holographic Glasses”, ACM SIGGRAPH, 2022 (link)
  • M. Gopakumar et al. “Unfiltered Holography”, Optics Letters, 2021 (link)

 

Acknowledgements

M.G. is supported by a Stanford Graduate Fellowship in Science and Engineering. G.-Y.L. is supported by a Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2022R1A6A3A03073823). S.C. is supported by a Kwanjeong Scholarship and a Meta Research PhD Fellowship. B.C. is supported by a Stanford Graduate Fellowship in Science and Engineering and a National Science Foundation Graduate Research Fellowship. G.W. is supported by the ARO (PECASE Award W911NF-19-1-0120), Samsung and the Sony Research Award Program. Part of this work was performed at the Stanford Nano Shared Facilities (SNSF) and Stanford Nanofabrication Facility (SNF), supported by the National Science Foundation and the National Nanotechnology Coordinated Infrastructure under award ECCS-2026822. We also thank Y. Park for her ongoing support.

MEDIA

Phase-modulating SLM and metasurface waveguide illuminated by a laser in the experimental setup (Photo by Andrew Brodhead).
Research team at Stanford, from left to right: Brian Chao, Manu Gopakumar, Gun-Yeal Lee, Gordon Wetzstein, Suyeon Choi (Photo by Andrew Brodhead).