Our method takes as input a collection of images (100 in our experiments) with known cameras, and outputs the volumetric density and normals, materials (BRDFs), and far-field illumination (environment map) of the scene.
Experiment configuration is handled by Hydra, which controls the initialization parameters for all of the modules. See configs/model for the available options. For example, setting the BRDF activation looks like appending
model.arch.model.brdf.activation="sigmoid"
to the command line.
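To illustrate what such an override does, here is a minimal sketch of how a Hydra-style dotted override updates a nested config; Hydra itself performs this (plus type checking, interpolation, and more) internally, and the default value shown is a placeholder, not the repo's actual default.

```python
def apply_override(config: dict, override: str) -> dict:
    """Apply a single 'a.b.c=value' style override to a nested dict."""
    key_path, _, value = override.partition("=")
    keys = key_path.split(".")
    node = config
    for key in keys[:-1]:
        # Descend, creating intermediate dicts as needed.
        node = node.setdefault(key, {})
    node[keys[-1]] = value.strip('"')
    return config

# Placeholder default value for illustration only.
config = {"model": {"arch": {"model": {"brdf": {"activation": "softplus"}}}}}
apply_override(config, 'model.arch.model.brdf.activation="sigmoid"')
```

After applying the override, `config["model"]["arch"]["model"]["brdf"]["activation"]` is `"sigmoid"`.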
To relight a dataset, you first need to convert the environment map .exr file to a PyTorch checkpoint {envmap}.th.
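The repo's actual conversion command is not reproduced here, so the following is only a hedged sketch of the idea: load the environment map as an HxWx3 float32 array and save it with torch.save. The checkpoint layout (a dict with an "envmap" key) is an assumption, not the repo's documented format.

```python
import numpy as np
import torch

def envmap_to_checkpoint(envmap: np.ndarray, path: str) -> None:
    """Save an HxWx3 float32 environment map as a PyTorch checkpoint (.th)."""
    assert envmap.ndim == 3 and envmap.shape[2] == 3
    tensor = torch.from_numpy(np.ascontiguousarray(envmap, dtype=np.float32))
    # The "envmap" key is a guess at the expected checkpoint layout.
    torch.save({"envmap": tensor}, path)

# In a real pipeline you would first read the .exr (e.g. with imageio or
# OpenEXR bindings -- an assumption); synthetic data is used here instead.
envmap = np.random.rand(16, 32, 3).astype(np.float32)
envmap_to_checkpoint(envmap, "envmap.th")
restored = torch.load("envmap.th")["envmap"]
```

The round trip preserves the map exactly, since no dtype conversion beyond float32 occurs.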
Note that metric computation is currently broken in this code, so the notebooks reval_lpips.ipynb and reeval_norm_err.ipynb must be run to obtain correct metrics. tabularize.ipynb can then be used to create the tables, and the other notebooks provide additional visualizations.
You can also download our relighting experiments from here.
Other datasets
Other dataset configurations are available in configs/dataset. Configurations for real-world datasets are included and work as well.