Real-Time Lighting Estimation for Augmented Reality via Differentiable Screen-Space Rendering

IEEE Transactions on Visualization and Computer Graphics (2023)

Abstract
Augmented Reality (AR) applications aim to blend virtual objects realistically with the real world. One of the key factors for realistic AR is correct lighting estimation. In this article, we present a method that estimates the real-world lighting condition from a single image in real time, using information from an optional support plane provided by advanced AR frameworks (e.g., ARCore or ARKit). By analyzing the visual appearance of the real scene, our algorithm predicts the lighting condition from the input RGB photo. In the first stage, we use a deep neural network to decompose the scene into several components: lighting, normals, and the Bidirectional Reflectance Distribution Function (BRDF). We then introduce differentiable screen-space rendering, a novel approach that provides the supervisory signal for regressing lighting, normals, and BRDF jointly. We recover the most plausible real-world lighting condition using Spherical Harmonics and a main directional light. Through a variety of experiments, we demonstrate that our method provides improved results over prior work, both quantitatively and qualitatively, and enhances real-time AR experiences.
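To illustrate the supervisory idea described above, the sketch below re-renders a predicted decomposition (albedo, normals, order-2 Spherical Harmonics lighting) in screen space and compares it to the input photo. This is a minimal, hedged illustration, not the paper's implementation: it assumes a Lambertian albedo stand-in for the full BRDF, omits the main directional light, and all function names and tensor shapes are illustrative assumptions.

```python
import torch

def sh_basis(normals):
    """Evaluate the 9 real spherical-harmonic basis functions (order 2)
    at per-pixel unit normals of shape (..., 3). Returns (..., 9)."""
    x, y, z = normals[..., 0], normals[..., 1], normals[..., 2]
    return torch.stack([
        0.282095 * torch.ones_like(x),    # Y_0^0
        0.488603 * y,                     # Y_1^-1
        0.488603 * z,                     # Y_1^0
        0.488603 * x,                     # Y_1^1
        1.092548 * x * y,                 # Y_2^-2
        1.092548 * y * z,                 # Y_2^-1
        0.315392 * (3.0 * z * z - 1.0),   # Y_2^0
        1.092548 * x * z,                 # Y_2^1
        0.546274 * (x * x - y * y),       # Y_2^2
    ], dim=-1)

def screen_space_render(albedo, normals, sh_coeffs):
    """Diffuse-only screen-space shading: per-pixel SH lighting times albedo.
    albedo: (H, W, 3), normals: (H, W, 3) unit vectors, sh_coeffs: (9, 3) RGB."""
    shading = sh_basis(normals) @ sh_coeffs          # (H, W, 3)
    return albedo * shading.clamp(min=0.0)

def rerender_loss(pred_albedo, pred_normals, pred_sh, target_rgb):
    """Self-supervised reconstruction loss: the re-rendering is differentiable,
    so gradients flow back into the predicted lighting, normals, and albedo."""
    rendered = screen_space_render(pred_albedo, pred_normals, pred_sh)
    return torch.nn.functional.l1_loss(rendered, target_rgb)
```

Because every step is a differentiable tensor operation, a single photometric loss can jointly supervise the lighting, normal, and reflectance branches, which is the role the abstract assigns to differentiable screen-space rendering.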
Key words
Lighting, rendering (computer graphics), real-time systems, estimation, probes, image reconstruction, geometry, mixed/augmented reality, rendering, scene understanding, light estimation, real time