RelPose: Predicting Probabilistic Relative Rotation for Single Objects in the Wild


Jason Y. Zhang
Deva Ramanan
Shubham Tulsiani


Carnegie Mellon University


Probabilistic Camera Rotation Estimation for Generic Objects. Left: Given two images of the same object, we predict a conditional distribution of relative camera viewpoint (rotation) that effectively handles symmetries and pose ambiguities. Right: Given a set of images, our approach outputs a configuration of camera rotations.



Abstract

We describe a data-driven method for inferring the camera viewpoints given multiple images of an arbitrary object. This task is a core component of classic geometric pipelines such as SfM and SLAM, and also serves as a vital pre-processing requirement for contemporary neural approaches (e.g. NeRF) to object reconstruction and view synthesis. In contrast to existing correspondence-driven methods that do not perform well given sparse views, we propose a top-down prediction based approach for estimating camera viewpoints. Our key technical insight is the use of an energy-based formulation for representing distributions over relative camera rotations, thus allowing us to explicitly represent multiple camera modes arising from object symmetries or views. Leveraging these relative predictions, we jointly estimate a consistent set of camera rotations from multiple images. We show that our approach outperforms state-of-the-art SfM and SLAM methods given sparse images on both seen and unseen categories. Further, our probabilistic approach significantly outperforms directly regressing relative poses, suggesting that modeling multimodality is important for coherent joint reconstruction. We demonstrate that our system can be a stepping stone toward in-the-wild reconstruction from multi-view datasets.
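The energy-based formulation described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: `energy_fn` is a hypothetical placeholder for the learned pairwise network, and the continuous distribution over SO(3) is approximated by normalizing exponentiated energies over a finite set of candidate rotations, in the spirit of Implicit-PDF-style models.

```python
import numpy as np

def relative_rotation_distribution(energy_fn, feat1, feat2, rotation_samples):
    """Approximate p(R | I1, I2) by a softmax of energies over sampled rotations.

    energy_fn        : callable (feat1, feat2, R) -> scalar score; a stand-in
                       for a learned network (hypothetical placeholder here).
    rotation_samples : sequence of 3x3 rotation matrices covering SO(3).
    Returns a probability vector over rotation_samples; multiple high-probability
    modes can arise naturally, e.g. from object symmetries.
    """
    energies = np.array([energy_fn(feat1, feat2, R) for R in rotation_samples])
    energies -= energies.max()        # subtract max for numerical stability
    unnormalized = np.exp(energies)
    return unnormalized / unnormalized.sum()
```

Because the output is an explicit distribution rather than a single regressed rotation, a symmetric object simply yields several comparable peaks instead of an averaged (and wrong) point estimate.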




Paper



RelPose: Predicting Probabilistic Relative Rotation for Single Objects in the Wild

Jason Y. Zhang, Deva Ramanan, and Shubham Tulsiani
@InProceedings{zhang2022relpose,
    title = {{RelPose}: Predicting Probabilistic Relative Rotation for Single Objects in the Wild},
    author = {Zhang, Jason Y. and Ramanan, Deva and Tulsiani, Shubham},
    booktitle = {European Conference on Computer Vision},
    year = {2022},
}




Video


Learned Pairwise Distributions

Interpreting Distributions of Relative Rotations. Here, we render new viewpoints produced by applying a relative rotation to the camera for Image 1. Inspired by Implicit-PDF, we visualize rotation matrices by projecting them onto a 2-sphere, where the x-axis represents yaw, the y-axis represents pitch, and color represents roll. The plots below show the distribution of relative rotations; each circle corresponds to a query rotation, with size proportional to its probability. Rotations with negligible probability are omitted.
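The projection above hinges on decomposing each query rotation into yaw, pitch, and roll. A minimal sketch of that decomposition, assuming the Z-Y-X Euler convention (the specific convention is an assumption of this sketch, not stated on this page):

```python
import numpy as np

def rotation_to_yaw_pitch_roll(R):
    """Decompose a 3x3 rotation matrix into (yaw, pitch, roll) angles under
    the Z-Y-X Euler convention (assumed here for illustration).

    For the visualization: (yaw, pitch) place a point on the 2-sphere and
    roll selects the marker color; marker size would encode probability.
    """
    yaw = np.arctan2(R[1, 0], R[0, 0])
    pitch = np.arcsin(np.clip(-R[2, 0], -1.0, 1.0))  # clip guards asin domain
    roll = np.arctan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll
```

Plotting each sampled rotation at its (yaw, pitch) location with color given by roll and area scaled by its probability reproduces the style of the figures below.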


Acknowledgements

This work was supported in part by the NSF GRFP (Grant No. DGE1745016), Singapore DSTA, and the CMU Argo AI Center for Autonomous Vehicle Research.