Authors
Alexandr Kuznetsov, Xuezheng Wang, Krishna Mullia, Fujun Luan, Zexiang Xu, Milo
University of California, San Diego; Adobe Research
Abstract
Neural material reflectance representations address some limitations of traditional analytic BRDFs with parameter textures; they can theoretically represent any material data, whether a complex synthetic microgeometry with displacements, shadows and interreflections, or real measured reflectance. However, they still approximate the material on an infinite plane, which prevents them from correctly handling silhouette and parallax effects for viewing directions close to grazing. The goal of this paper is to design a neural material representation capable of correctly handling such silhouette effects. We extend the neural network query to take surface curvature information as input, while the query output is extended to return a transparency value in addition to reflectance. We train the new neural representation on synthetic data that contains queries spanning a variety of surface curvatures. We show an ability to accurately represent complex silhouette behavior that would traditionally require more expensive and less flexible techniques, such as on-the-fly geometry displacement or ray marching.
Contribution
- We introduce the first spatially-varying material representation handling silhouette effects using a single query, i.e. without any on-the-fly geometry displacement or ray marching, based on the concept of Silhouette BTF (Sec. 4.1)
- Building upon the NeuMIP architecture, we achieve this by explicitly considering curvature as part of the material query (Sec. 4.2), and by considering transparency as part of the query output (Sec. 4.3)
- We train on datasets designed to learn the correct behavior across surfaces with varying curvatures (Sec. 4.4), and we integrate the representation into a practical renderer (Sec. 4.5)
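To make the interface concrete, the following is a minimal sketch of the extended material query: compared to a standard neural BTF query over (uv, incoming direction, outgoing direction), the input additionally carries a curvature value and the output additionally carries a transparency (alpha) value. The class name, the tiny random-weight MLP, and all dimensions here are illustrative assumptions for exposition, not the paper's actual architecture.

```python
import numpy as np

class NeuralMaterial:
    """Illustrative query: (uv, wi, wo, curvature) -> (RGB reflectance, alpha).

    A tiny one-hidden-layer MLP with random weights, standing in for the
    trained network; only the input/output interface mirrors the text above.
    """

    def __init__(self, hidden=32, seed=0):
        rng = np.random.default_rng(seed)
        # Input: 2 (uv) + 2 (wi, projected) + 2 (wo, projected) + 1 (curvature) = 7.
        self.w1 = rng.standard_normal((7, hidden)) * 0.1
        self.b1 = np.zeros(hidden)
        # Output: 3 (RGB reflectance) + 1 (alpha/transparency) = 4.
        self.w2 = rng.standard_normal((hidden, 4)) * 0.1
        self.b2 = np.zeros(4)

    def query(self, uv, wi, wo, curvature):
        # Directions are encoded by their tangent-plane (x, y) projection,
        # a common convention; the curvature scalar is appended to the query.
        x = np.concatenate([uv, wi[:2], wo[:2], [curvature]])
        h = np.maximum(x @ self.w1 + self.b1, 0.0)      # ReLU hidden layer
        out = h @ self.w2 + self.b2
        rgb = np.maximum(out[:3], 0.0)                  # non-negative reflectance
        alpha = 1.0 / (1.0 + np.exp(-out[3]))           # transparency in (0, 1)
        return rgb, alpha
```

In a renderer, the returned alpha lets the material "eat into" the underlying base geometry near grazing angles, producing silhouette effects in a single query rather than via on-the-fly displacement or ray marching.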
Related Work
Displacement Mapping and Its Variations; BTFs; Neural Reflectance; Thin Volumetric Shells