Eclipse: Disambiguating Illumination and Materials using Unintended Shadows
CVPR 2024 (Oral Presentation)
Dor Verbin, Google Research
Ben Mildenhall, Google Research
Peter Hedman, Google Research
Jonathan T. Barron, Google Research
Todd Zickler, Harvard University
Pratul P. Srinivasan, Google Research
Abstract
Decomposing an object's appearance into representations of its materials and the surrounding illumination is difficult, even when the object's 3D shape is known beforehand. This problem is ill-conditioned because diffuse materials severely blur incoming light, and is ill-posed because diffuse materials under high-frequency lighting can be indistinguishable from shiny materials under low-frequency lighting. We show that it is possible to recover precise materials and illumination—even from diffuse objects—by exploiting unintended shadows, like the ones cast onto an object by the photographer who moves around it. These shadows are a nuisance in most previous inverse rendering pipelines, but here we exploit them as signals that improve conditioning and help resolve material-lighting ambiguities. We present a method based on differentiable Monte Carlo ray tracing that uses images of an object to jointly recover its spatially-varying materials, the surrounding illumination environment, and the shapes of the unseen light occluders who inadvertently cast shadows upon it.
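To make the joint-recovery idea in the abstract concrete, below is a minimal sketch, not the authors' code, of optimizing spatially-varying albedo, an environment radiance map, and per-frame occluder visibilities together through a differentiable Monte Carlo (Lambertian) renderer. All names, array shapes, the binary-mask stand-in for occluder shape, and the optimization hyperparameters are illustrative assumptions.

```python
# Hedged sketch: joint recovery of albedo, illumination, and occluder visibility
# by gradient descent through a differentiable Monte Carlo Lambertian renderer.
import jax
import jax.numpy as jnp

NUM_DIRS = 64      # Monte Carlo samples of incoming-light directions (assumed)
NUM_POINTS = 256   # surface points with known normals (known-shape assumption)
NUM_FRAMES = 8     # images, each with a differently placed occluder (assumed)

# Fixed geometry: unit normals at each surface point.
normals = jax.random.normal(jax.random.PRNGKey(0), (NUM_POINTS, 3))
normals = normals / jnp.linalg.norm(normals, axis=-1, keepdims=True)
# Fixed set of light directions on the sphere used for the Monte Carlo estimate.
dirs = jax.random.normal(jax.random.PRNGKey(1), (NUM_DIRS, 3))
dirs = dirs / jnp.linalg.norm(dirs, axis=-1, keepdims=True)

def render(params):
    """Radiance = albedo * mean_d [ V_frame(d) * L(d) * max(n . d, 0) ]."""
    albedo = jax.nn.sigmoid(params['albedo'])      # (NUM_POINTS,) in [0, 1]
    light = jax.nn.softplus(params['light'])       # (NUM_DIRS,) nonnegative radiance
    # Soft per-frame, per-direction visibility standing in for the unseen
    # occluder's shape (e.g. the photographer's silhouette).
    vis = jax.nn.sigmoid(params['occluder'])       # (NUM_FRAMES, NUM_DIRS)
    cos = jnp.maximum(normals @ dirs.T, 0.0)       # (NUM_POINTS, NUM_DIRS)
    shading = jnp.einsum('fd,d,pd->fp', vis, light, cos) / NUM_DIRS
    return albedo[None, :] * shading               # (NUM_FRAMES, NUM_POINTS)

def loss(params, observed):
    return jnp.mean((render(params) - observed) ** 2)

# Synthetic "observations" from hidden ground-truth factors, then joint recovery.
true_params = {
    'albedo': jax.random.normal(jax.random.PRNGKey(2), (NUM_POINTS,)),
    'light': jax.random.normal(jax.random.PRNGKey(3), (NUM_DIRS,)),
    'occluder': jax.random.normal(jax.random.PRNGKey(4), (NUM_FRAMES, NUM_DIRS)),
}
observed = render(true_params)

params = {k: jnp.zeros_like(v) for k, v in true_params.items()}
grad_fn = jax.jit(jax.value_and_grad(loss))
lr = 0.3
for step in range(1000):
    value, grads = grad_fn(params, observed)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
print('final reconstruction loss:', float(value))
```

The sketch illustrates why the shadows help: because each frame's occluder masks a different set of light directions, the per-frame shading differences constrain the illumination at a finer angular resolution than the heavily blurred Lambertian response alone would allow. The actual method described in the paper uses differentiable Monte Carlo ray tracing with richer material, illumination, and occluder-shape representations.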
Video
Model
Results