Authors
Jingyang Zhang, Yao Yao, Shiwei Li, Jingbo Liu, Tian Fang, David McKinnon, Yanghai Tsin, Long Quan
Apple
Abstract
We present a novel differentiable rendering framework for joint geometry, material, and lighting estimation from multi-view images. In contrast to previous methods that assume a simplified environment map or co-located flashlights, we formulate the lighting of a static scene as one neural incident light field (NeILF) and one outgoing neural radiance field (NeRF). The key insight of the proposed method is the union of the incident and outgoing light fields through physically-based rendering and inter-reflections between surfaces, which makes it possible to disentangle the scene geometry, material, and lighting from image observations in a physically-based manner. The proposed incident light and inter-reflection framework can be easily applied to other NeRF systems. We show that our method can not only decompose the outgoing radiance into incident lights and surface materials, but also serve as a surface refinement module that further improves the reconstruction detail of the neural surface. We demonstrate on several datasets that the proposed method achieves state-of-the-art results in geometry reconstruction quality, material estimation accuracy, and fidelity of novel view rendering.
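The physically-based rendering step described above can be sketched as a Monte Carlo estimate of the rendering equation, with the incident light queried from a learned incident light field. The sketch below is a minimal NumPy illustration, not the paper's implementation: it assumes a Lambertian BRDF and a constant stand-in for the incident light network, and all function names (`neural_incident_light`, `render_outgoing`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_hemisphere(n, num):
    """Uniformly sample directions on the hemisphere around normal n."""
    v = rng.normal(size=(num, 3))
    v /= np.linalg.norm(v, axis=-1, keepdims=True)
    v[v @ n < 0] *= -1.0  # flip samples into the hemisphere of n
    return v

def neural_incident_light(x, wi):
    """Stand-in for the incident light field network L_i(x, wi) -> RGB.
    A real NeILF would be an MLP conditioned on position and direction."""
    return np.full((wi.shape[0], 3), 0.8)  # constant light, placeholder

def render_outgoing(x, n, wo, albedo, num_samples=256):
    """Monte Carlo estimate of outgoing radiance L_o(x, wo).
    With a Lambertian BRDF the view direction wo does not matter."""
    wi = sample_hemisphere(n, num_samples)
    li = neural_incident_light(x, wi)
    cos = np.clip(wi @ n, 0.0, None)[:, None]  # foreshortening term
    brdf = albedo / np.pi                      # Lambertian BRDF
    # Uniform hemisphere sampling has pdf 1 / (2*pi).
    return (brdf * li * cos).mean(axis=0) * (2.0 * np.pi)
```

For a constant incident light L and albedo a, the analytic result is L_o = a * L, which the estimator approaches as the sample count grows.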
Contribution
- Proposing a general light field representation that unifies one incident light field and one outgoing radiance field via physically-based rendering (PBR) and inter-reflections
- Proposing an optimization scheme for joint geometry, material, and lighting estimation that can be readily applied to the prevalent NeRF family for material decomposition and neural surface refinement
- Constructing a real-world linear HDR dataset for material estimation and other neural rendering tasks
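The first contribution, uniting the two fields through inter-reflection, can be sketched as a simple dispatch: if a ray cast from a surface point hits another surface, the incident light is the outgoing radiance field evaluated at the hit point; otherwise it falls back to the distant incident light field. This is a toy sketch with hypothetical stand-ins (`trace` intersects a single plane; the two networks return constants), not the paper's actual ray marching or network design.

```python
import numpy as np

def trace(x, wi):
    """Hypothetical ray-surface intersection: returns (hit, hit_point).
    Toy scene: a single opaque plane at z = 0."""
    if wi[2] < -1e-6 and x[2] > 0:
        t = -x[2] / wi[2]
        return True, x + t * wi
    return False, None

def nerf_outgoing(x, wo):
    """Stand-in for the outgoing radiance field (NeRF)."""
    return np.array([0.2, 0.2, 0.2])

def neilf_distant(x, wi):
    """Stand-in for the distant incident light field (NeILF)."""
    return np.array([1.0, 1.0, 1.0])

def incident_light(x, wi):
    """Unified incident light at x from direction wi:
    near-field inter-reflection comes from the outgoing radiance field,
    far-field illumination from the incident light field."""
    hit, x_hit = trace(x, wi)
    if hit:
        return nerf_outgoing(x_hit, -wi)  # radiance leaving the hit point toward x
    return neilf_distant(x, wi)
```

Because both branches are differentiable networks in the full method, gradients from the PBR loss flow into geometry, material, and lighting jointly.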
Related Work
- The Rendering Equation
- Lighting Modeling for Material Estimation
- Surface Optimization by Differentiable Rendering
Comparisons
NeRFactor, PhySG, Neural-PIL, NeRF, VolSDF, NeuS