Authors
Paul Debevec, Tim Hawkins, Chris Tchou, Haarm-Pieter Duiker, Westley Sarokin, Mark Sagar
University of California at Berkeley; LifeF/X, Inc.
Summary
This paper introduced the Light Stage, a prototype device that illuminates the subject from a dense sampling of incident illumination directions. From the recorded imagery, the method can render the face under arbitrary changes in lighting and viewing direction.
Abstract
We present a method to acquire the reflectance field of a human face and use these measurements to render the face under arbitrary changes in lighting and viewpoint. We first acquire images of the face from a small set of viewpoints under a dense sampling of incident illumination directions using a light stage. We then construct a reflectance function image for each observed image pixel from its values over the space of illumination directions. From the reflectance functions, we can directly generate images of the face from the original viewpoints in any form of sampled or computed illumination. To change the viewpoint, we use a model of skin reflectance to estimate the appearance of the reflectance functions for novel viewpoints. We demonstrate the technique with synthetic renderings of a person's face under novel illumination and viewpoints.
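As an illustration of the relighting step the abstract describes (a minimal sketch, not code from the paper), suppose the light stage images for one viewpoint are stacked as an array basis[d, h, w, c], one image per sampled illumination direction d, and that a target lighting environment has been resampled into per-direction RGB intensities light[d, c] with solid-angle weights solid_angle[d]; these names are hypothetical. Because light transport is linear, the relit image is a weighted sum of the acquired basis images:

    import numpy as np

    def relight(basis, light, solid_angle):
        """Relight one viewpoint from light stage data.

        basis:       (D, H, W, 3) images, one per sampled illumination direction
        light:       (D, 3) RGB radiance of the target environment in each direction
        solid_angle: (D,) solid angle covered by each sampled direction

        Returns an (H, W, 3) image of the face under the target illumination,
        computed as a weighted sum over the sampled directions.
        """
        weights = light * solid_angle[:, None]            # (D, 3) per-direction weights
        return np.einsum('dhwc,dc->hwc', basis, weights)  # sum over directions d

Note that this sum only changes the illumination; changing the viewpoint is handled separately in the paper by a skin reflectance model that estimates the reflectance functions for novel viewing directions.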
Contribution
- The Light Stage prototype for capturing a face under a dense sampling of incident illumination directions
- A method to render faces under arbitrary changes in lighting and viewing direction based on recorded imagery
- A technique to extrapolate a complete reflectance field from the acquired data, which allows the face to be rendered from novel viewpoints
Related Work
Facial Modeling and Animation; Reflectometry; Image-Based Modeling and Rendering
Overview
This paper captures a sparse set of viewpoints under a dense set of lighting directions and converts each facial pixel location into a reflectance function. With this representation, the method can render the face from novel viewpoints under any novel form of illumination.
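Because the per-pixel reflectance function is only sampled at the light stage's discrete directions, rendering under a single distant light from an unsampled direction requires evaluating the function in between samples. The sketch below assumes reflectance functions stored as refl[p, d, c] and sampled direction unit vectors dirs[d] (hypothetical names), and uses a simple nearest-sample blend as the interpolation scheme, which is an assumption rather than the paper's exact procedure:

    import numpy as np

    def render_point_light(refl, dirs, w, k=4):
        """Render one viewpoint under a single distant light from direction w.

        refl: (P, D, 3) reflectance functions: pixel p's colour when lit only
              from sampled direction d
        dirs: (D, 3) unit vectors of the sampled illumination directions
        w:    (3,) unit vector of the desired (possibly unsampled) light direction

        The reflectance function is known only at the sampled directions, so
        its value at w is approximated by blending the k nearest samples.
        """
        w = w / np.linalg.norm(w)
        cos = dirs @ w                        # similarity of w to each sampled direction
        nearest = np.argsort(-cos)[:k]        # indices of the k closest samples
        wts = np.maximum(cos[nearest], 0.0)
        wts = wts / (wts.sum() + 1e-8)        # normalised blending weights
        return np.einsum('pdc,d->pc', refl[:, nearest], wts)

The same reflectance-function representation also supports the environment-map relighting shown after the abstract; a point light is just the special case of an environment concentrated in one direction.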