NeRS: Neural Reflectance Surfaces for Sparse-View 3D Reconstruction in the Wild


Jason Y. Zhang
Gengshan Yang
Shubham Tulsiani*
Deva Ramanan*


Carnegie Mellon University


NeurIPS 2021





Reconstructed cars from the Multi-view Marketplace Cars dataset. Given several (8-16) unposed images of the same instance, NeRS outputs a textured 3D reconstruction along with the recovered illumination parameters.

We demonstrate the generality of NeRS on assorted objects.


Abstract

Recent history has seen a tremendous growth of work exploring implicit representations of geometry and radiance, popularized through Neural Radiance Fields (NeRF). Such works are fundamentally based on an (implicit) volumetric representation of occupancy, allowing them to model diverse scene structure, including translucent objects and atmospheric obscurants. But because the vast majority of real-world scenes are composed of well-defined surfaces, we introduce a surface analog of such implicit models called Neural Reflectance Surfaces (NeRS). NeRS learns a neural shape representation of a closed surface that is diffeomorphic to a sphere, guaranteeing watertight reconstructions. Even more importantly, surface parameterizations allow NeRS to learn (neural) bidirectional surface reflectance functions (BRDFs) that factorize view-dependent appearance into environmental illumination, diffuse color (albedo), and specular "shininess." Finally, rather than illustrating our results on synthetic scenes or controlled in-the-lab capture, we assemble a novel dataset of multi-view images from online marketplaces for selling goods. Such "in-the-wild" multi-view image sets pose a number of challenges, including a small number of views with unknown/rough camera estimates. We demonstrate that surface-based neural reconstructions enable learning from such data, outperforming volumetric neural rendering-based reconstructions. We hope that NeRS serves as a first step toward building scalable, high-quality libraries of real-world shape, materials, and illumination.
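To make the two core ideas in the abstract concrete, the minimal PyTorch sketch below illustrates (1) a shape network that predicts a residual deformation of the unit sphere, so the output surface stays closed and watertight, and (2) a Phong-style shading term that factorizes appearance into diffuse color (albedo), environmental illumination, and specular shininess. This is an illustrative sketch, not the authors' implementation (see the GitHub code for that); the class and function names here are hypothetical, and a single directional light stands in for the environment map used in the paper.

# Illustrative sketch only; names are hypothetical, not from the NeRS codebase.
import torch
import torch.nn as nn

class SphereDeformation(nn.Module):
    """Maps points u on the unit sphere (|u| = 1) to surface points in R^3.
    Because the map deforms a sphere, the resulting surface is closed."""
    def __init__(self, hidden=256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, u):  # u: (N, 3) unit vectors on the sphere
        return u + self.mlp(u)  # residual deformation of the sphere

def phong_shading(albedo, normal, light_dir, view_dir, light_rgb, shininess):
    """Phong-style factorization: diffuse (albedo * illumination) + specular.
    All direction tensors are (N, 3) and assumed to be unit-normalized.
    A single directional light is used here for brevity; the paper instead
    learns an environment map."""
    n_dot_l = (normal * light_dir).sum(-1, keepdim=True).clamp(min=0)
    diffuse = albedo * light_rgb * n_dot_l
    # Reflect the light direction about the normal: r = 2(n.l)n - l.
    reflect = 2 * (normal * light_dir).sum(-1, keepdim=True) * normal - light_dir
    r_dot_v = (reflect * view_dir).sum(-1, keepdim=True).clamp(min=0)
    specular = light_rgb * r_dot_v ** shininess
    return diffuse + specular

# Example usage: deform 1000 sphere samples into surface points.
# u = torch.nn.functional.normalize(torch.randn(1000, 3), dim=-1)
# verts = SphereDeformation()(u)

In the actual method, both the deformation and the texture are neural fields over the sphere's (u, v) parameterization, and the shading parameters are optimized jointly with the shape from the sparse input views.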




Paper


NeRS: Neural Reflectance Surfaces for Sparse-View 3D Reconstruction in the Wild

Jason Y. Zhang, Gengshan Yang, Shubham Tulsiani*, and Deva Ramanan*
@inproceedings{zhang2021ners,
  title={{NeRS}: Neural Reflectance Surfaces for Sparse-view 3D Reconstruction in the Wild},
  author={Zhang, Jason Y. and Yang, Gengshan and Tulsiani, Shubham and Ramanan, Deva},
  booktitle={Conference on Neural Information Processing Systems},
  year={2021}
}




Video




Code

Model overview figure
[GitHub]


Data


[Multi-view Marketplace Cars (on Google Drive)]
Directions for downloading and using the data are in the GitHub README.


Reconstructing Ukrainian Churches

Check out Michael Hasey's thesis analyzing the architecture of Ukrainian churches by reconstructing over 300 NeRS models from online images!





Acknowledgements

This work was supported in part by the NSF GRFP (Grant No. DGE1745016), Singapore DSTA, and the CMU Argo AI Center for Autonomous Vehicle Research. Webpage template.