Authors
Kai Zhang, Fujun Luan, Qianqian Wang, Kavita Bala, Noah Snavely
Cornell University
Abstract
We present PhySG, an end-to-end inverse rendering pipeline that includes a fully differentiable renderer and can reconstruct geometry, materials, and illumination from scratch from a set of RGB input images. Our framework represents specular BRDFs and environmental illumination using mixtures of spherical Gaussians, and represents geometry as a signed distance function parameterized as a Multi-Layer Perceptron. The use of spherical Gaussians allows us to efficiently solve for approximate light transport, and our method works on scenes with challenging non-Lambertian reflectance captured under natural, static illumination. We demonstrate, with both synthetic and real data, that our reconstructions not only enable rendering of novel viewpoints, but also physics-based appearance editing of materials and illumination.
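To make the geometry representation concrete, below is a minimal PyTorch sketch of an SDF parameterized by an MLP, with surface normals taken as the normalized gradient of the network output via autograd. The layer sizes, Softplus activation, and class name are illustrative assumptions, not the exact PhySG architecture.

```python
# A minimal sketch (not the exact PhySG network) of geometry as a signed
# distance function (SDF) parameterized by an MLP. Layer sizes and the
# Softplus activation are illustrative assumptions.
import torch
import torch.nn as nn

class SDFNetwork(nn.Module):
    def __init__(self, hidden_dim: int = 256, num_layers: int = 4):
        super().__init__()
        layers, in_dim = [], 3  # input: 3D point coordinates
        for _ in range(num_layers):
            layers += [nn.Linear(in_dim, hidden_dim), nn.Softplus(beta=100)]
            in_dim = hidden_dim
        layers.append(nn.Linear(in_dim, 1))  # output: scalar signed distance
        self.net = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

    def normal(self, x: torch.Tensor) -> torch.Tensor:
        # Surface normal = normalized gradient of the SDF w.r.t. the point.
        x = x.requires_grad_(True)
        d = self.forward(x)
        (grad,) = torch.autograd.grad(d.sum(), x, create_graph=True)
        return torch.nn.functional.normalize(grad, dim=-1)

sdf = SDFNetwork()
pts = torch.randn(8, 3)
print(sdf(pts).shape, sdf.normal(pts).shape)  # [8, 1] and [8, 3]
```

Representing geometry implicitly this way keeps the whole pipeline differentiable, since surface points and normals are differentiable functions of the network weights.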
Contributions
- PhySG, an end-to-end inverse rendering approach to the problem of jointly estimating lighting, material properties, and geometry from multi-view images of glossy objects under static illumination. Our pipeline uses spherical Gaussians to evaluate the rendering equation approximately and efficiently in closed form (see the sketch after this list)
- Compared to prior neural rendering approaches, we show that PhySG not only generalizes to novel viewpoints, but also enables physically intuitive material editing and relighting
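The closed-form evaluation rests on two standard spherical Gaussian identities: the product of two SGs is itself an SG, and any SG integrates analytically over the sphere. Below is a minimal numpy sketch with illustrative function names (not the PhySG API).

```python
# Spherical Gaussian (SG): G(v; xi, lam, mu) = mu * exp(lam * (dot(v, xi) - 1)),
# with unit lobe axis xi, sharpness lam > 0, and amplitude mu. These two
# standard identities let SG-based renderers evaluate light-transport
# integrals in closed form; function names here are illustrative.
import numpy as np

def sg_product(xi1, lam1, mu1, xi2, lam2, mu2):
    """The product of two SGs is another SG (exact identity)."""
    u = lam1 * xi1 + lam2 * xi2
    lam = np.linalg.norm(u)
    xi = u / lam
    mu = mu1 * mu2 * np.exp(lam - lam1 - lam2)
    return xi, lam, mu

def sg_integral(lam, mu):
    """Integral of an SG over the whole sphere, in closed form."""
    return mu * 2.0 * np.pi / lam * (1.0 - np.exp(-2.0 * lam))

# Example: integrate the product of a light lobe and a BRDF lobe analytically.
light = (np.array([0.0, 0.0, 1.0]), 30.0, 1.0)  # (axis, sharpness, amplitude)
brdf = (np.array([0.0, 1.0, 0.0]), 10.0, 0.5)
xi, lam, mu = sg_product(*light, *brdf)
print(sg_integral(lam, mu))  # closed-form value of the spherical integral
```

Because both operations are exact and cheap, approximating the environment illumination and the specular BRDF as SG mixtures reduces the rendering integral to a sum of such closed-form terms.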
Related Work
- Neural Rendering
- Material and Environment Estimation
- Joint Shape and Appearance Refinement
- The Rendering Equation
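For reference, the rendering equation itself (standard form, Kajiya 1986), which the SG machinery above approximates in closed form; for non-emissive objects like those PhySG targets, the emission term L_e is typically dropped:

```latex
% Outgoing radiance = emission + hemispherical integral of incoming radiance,
% weighted by the BRDF f_r and the cosine term (omega_i . n).
L_o(\mathbf{x}, \boldsymbol{\omega}_o) =
  L_e(\mathbf{x}, \boldsymbol{\omega}_o)
  + \int_{\Omega} L_i(\mathbf{x}, \boldsymbol{\omega}_i)\,
    f_r(\mathbf{x}, \boldsymbol{\omega}_i, \boldsymbol{\omega}_o)\,
    (\boldsymbol{\omega}_i \cdot \mathbf{n})\, \mathrm{d}\boldsymbol{\omega}_i
```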