An interactive approach for functional prototype recovery from a single RGBD image

Abstract

Inferring the functionality of an object from a single RGBD image is difficult for two reasons: lack of semantic information about the object, and missing data due to occlusion. In this paper, we present an interactive framework to recover a 3D functional prototype from a single RGBD image. Rather than precisely reconstructing the object geometry, we focus on recovering the object's functionality along with its geometry. Our system allows users to scribble on the image to create initial rough proxies for the parts. After the user annotates high-level relations between parts, our system automatically and jointly optimizes the joint parameters (axis and position) and the part geometry parameters (size, orientation, and position). Such prototype recovery enables a better understanding of the underlying image geometry and allows for further physically plausible manipulation. We demonstrate our framework on various indoor objects with simple or hybrid functions.
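To make the parameterization described in the abstract concrete, the sketch below models a functional prototype as box-like part proxies plus joints, each joint defined by an axis and a position, as the abstract states. All type and field names here are hypothetical illustrations, not the authors' actual implementation.

```python
import math
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class PartProxy:
    """Rough proxy for one object part (hypothetical representation)."""
    size: Vec3         # extents along the part's local axes
    orientation: Vec3  # e.g. Euler angles in radians
    position: Vec3     # part center in world coordinates

@dataclass
class Joint:
    """A joint between two parts, parameterized by axis and position."""
    kind: str       # e.g. "hinge" or "slider" (assumed joint types)
    axis: Vec3      # unit direction of rotation or translation
    position: Vec3  # a point on the joint axis
    parent: int     # index of the parent part
    child: int      # index of the child part

def normalize(v: Vec3) -> Vec3:
    """Scale a vector to unit length, as a joint axis should be."""
    n = math.sqrt(sum(c * c for c in v))
    return (v[0] / n, v[1] / n, v[2] / n)

# Example: a two-part prototype (e.g. a laptop lid on a base)
# connected by a single hinge joint along the shared edge.
base = PartProxy(size=(0.3, 0.02, 0.2), orientation=(0.0, 0.0, 0.0),
                 position=(0.0, 0.0, 0.0))
lid = PartProxy(size=(0.3, 0.02, 0.2), orientation=(1.2, 0.0, 0.0),
                position=(0.0, 0.1, -0.1))
hinge = Joint(kind="hinge", axis=normalize((1.0, 0.0, 0.0)),
              position=(0.0, 0.01, -0.1), parent=0, child=1)
```

In a joint optimization, the axis/position fields of each `Joint` and the size/orientation/position fields of each `PartProxy` would together form the parameter vector being fitted to the observed RGBD data.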

DOI: 10.1007/s41095-016-0032-x


Cite this paper

@article{Rong2016AnIA,
  title   = {An interactive approach for functional prototype recovery from a single RGBD image},
  author  = {Yuliang Rong and Youyi Zheng and Tianjia Shao and Yin Yang and Kun Zhou},
  journal = {Computational Visual Media},
  year    = {2016},
  volume  = {2},
  pages   = {87--96}
}