We experimentally captured 3D focal stacks of generated holograms on a holographic display prototype.
State-of-the-art neural rendering methods optimize a Gaussian-based scene representation from a few photographs for novel-view synthesis. Building on these representations, we develop efficient algorithms, dubbed Gaussian Wave Splatting, to turn these Gaussians into holograms. Unlike existing computer-generated holography (CGH), Gaussian Wave Splatting supports accurate occlusions and view-dependent effects for photorealistic scenes by leveraging recent advances in neural rendering.
Specifically, we derive a closed-form solution for a 2D Gaussian-to-hologram transform that supports occlusions and alpha blending. Inspired by classic computer graphics techniques, we also derive an efficient approximation of the aforementioned process in the Fourier domain that is easily parallelizable and implement it using custom CUDA kernels.
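To give intuition for the closed-form transform: the Fourier spectrum of a 2D Gaussian with covariance Sigma is again a Gaussian in frequency space governed by Sigma, modulated by a linear phase set by the primitive's center. The following minimal NumPy sketch is illustrative only (it is not our implementation; grid size, scales, orientation, and center are placeholders) and simply checks this identity against an FFT of the sampled Gaussian.

import numpy as np

# Illustrative sketch: verify the closed-form Fourier spectrum of an
# arbitrarily oriented 2D Gaussian against a numerical FFT.
# FT convention: G(f) = \int g(x) exp(-2*pi*i f.x) dx, so that
# g(x) = exp(-0.5 (x-mu)^T Sigma^{-1} (x-mu)) has
# G(f) = 2*pi*sqrt(det Sigma) * exp(-2*pi^2 f^T Sigma f) * exp(-2*pi*i f.mu).

N, L = 512, 4.0                      # samples per axis, spatial extent
dx = L / N
x = (np.arange(N) - N // 2) * dx
X, Y = np.meshgrid(x, x, indexing="xy")

# Oriented 2D Gaussian: rotation theta, per-axis scales s, center mu (placeholders).
theta, s, mu = 0.3, np.array([0.35, 0.12]), np.array([0.2, -0.1])
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Sigma = R @ np.diag(s**2) @ R.T
Sinv = np.linalg.inv(Sigma)

d = np.stack([X - mu[0], Y - mu[1]], axis=-1)      # per-pixel offsets, shape (N, N, 2)
g = np.exp(-0.5 * np.einsum("...i,ij,...j->...", d, Sinv, d))

# Numerical spectrum: the DFT approximates the continuous FT up to the sample area.
G_fft = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(g))) * dx * dx
f = np.fft.fftshift(np.fft.fftfreq(N, d=dx))
FX, FY = np.meshgrid(f, f, indexing="xy")
freqs = np.stack([FX, FY], axis=-1)

# Closed-form spectrum of the oriented Gaussian.
quad = np.einsum("...i,ij,...j->...", freqs, Sigma, freqs)
G_analytic = (2.0 * np.pi * np.sqrt(np.linalg.det(Sigma))
              * np.exp(-2.0 * np.pi**2 * quad)
              * np.exp(-2j * np.pi * (FX * mu[0] + FY * mu[1])))

print("max abs error:", np.max(np.abs(G_fft - G_analytic)))   # near numerical precision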
By integrating emerging neural rendering pipelines with holographic display technology, our Gaussian-based CGH framework paves the way for next-generation holographic displays.
Gaussian Wave Splatting (GWS) takes a set of optimized 2D Gaussians as input and outputs a hologram that can be directly displayed on emerging holographic displays. At a high level, each 2D Gaussian primitive is illuminated by a coherent source, and the resulting wavefront is propagated to and recorded on a spatial light modulator (SLM) to generate the hologram. Since the Gaussian parameters already explicitly encode 3D information, splatting the Gaussians with a physically accurate wave propagation model allows for the direct reconstruction of 3D focal stacks with natural defocus blur when viewed on a holographic display.
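For readers unfamiliar with wave-optics image formation, the minimal sketch below shows how a complex field at the SLM plane can be numerically propagated to several focus distances with the standard angular spectrum method to form a focal stack. It is a generic illustration, not our display pipeline; the wavelength, pixel pitch, and the placeholder SLM field are assumptions for the example.

import numpy as np

def angular_spectrum(field, wavelength, pitch, z):
    """Propagate a complex field by distance z [m] with the basic angular
    spectrum method (evanescent components are discarded)."""
    N, M = field.shape
    fy = np.fft.fftfreq(N, d=pitch)
    fx = np.fft.fftfreq(M, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2          # squared longitudinal frequency
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)                # propagating waves only
    return np.fft.ifft2(np.fft.fft2(field) * H)

wavelength, pitch = 532e-9, 8e-6                       # placeholder: green laser, 8 um pitch
rng = np.random.default_rng(0)
slm_field = np.exp(1j * 2 * np.pi * rng.random((1024, 1024)))  # placeholder phase-only pattern

# Reconstruct intensities at a few focus distances (the "focal stack").
focal_stack = [np.abs(angular_spectrum(slm_field, wavelength, pitch, z))**2
               for z in (-2e-3, 0.0, 2e-3)]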
Aside from deriving the exact mathematical formulation of the Fourier spectrum of arbitrarily oriented 2D Gaussians for splatting, we also derive the wave-optics counterpart of alpha blending (right) for Gaussian primitives to accurately model occlusion; alpha blending is the central building block of the original geometric-optics Gaussian splatting pipeline (left).
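To make the analogy concrete, the sketch below applies the familiar front-to-back alpha compositing recurrence to complex wavefronts instead of colors. It is only an illustration of the blending structure under simplifying assumptions (pre-sorted primitives, per-pixel opacity maps, illustrative names), not our exact wave-optics formulation.

import numpy as np

def composite_wavefronts(fields, alphas):
    """fields: list of (H, W) complex wavefronts sorted front-to-back.
    alphas: matching list of (H, W) opacity maps in [0, 1]."""
    out = np.zeros_like(fields[0])
    transmittance = np.ones(fields[0].shape)   # fraction of light not yet blocked
    for u, a in zip(fields, alphas):
        out += transmittance * a * u           # this primitive's visible contribution
        transmittance *= (1.0 - a)             # attenuate everything behind it
    return out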
Point cloud | Polygon-based CGH | Ours GWS (full) | Ours GWS (fast)
GWS achieves superior 3D focal stack reconstruction quality compared to prior primitive-based computer-generated holography algorithms built on point clouds and per-face textured meshes (polygon-based CGH). We also designed a fast variant of GWS based on an approximate volume-rendering image formation model inspired by order-independent transparency (OIT) in traditional computer graphics. We implemented custom CUDA kernels for this fast GWS variant, achieving a 30× speedup compared to the full, exact GWS with only a small drop in image quality.
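As a rough illustration of the order-independent idea (and not our CUDA implementation), the sketch below accumulates per-Gaussian complex contributions with commutative, depth-dependent weights in a single pass, in the spirit of weighted-blended OIT; the weight function and all names are placeholders.

import numpy as np

def composite_oit(fields, alphas, depths):
    """Single-pass, order-independent accumulation of complex contributions.
    fields/alphas: per-Gaussian (H, W) arrays; depths: per-Gaussian scalars."""
    num = np.zeros_like(fields[0])
    den = np.zeros(fields[0].shape)
    for u, a, z in zip(fields, alphas, depths):
        w = a * np.exp(-z)                  # placeholder depth-based weight
        num += w * u                        # weighted field accumulation (order-independent)
        den += w                            # running normalization
    return num / np.maximum(den, 1e-8)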
Select different scenes and CGH methods to compare the 3D focal stack reconstruction quality of the holograms. The GWS variants reconstruct higher-quality focal stacks than the traditional point-cloud- and mesh-based approaches.
Focal stack reconstruction