Reconfigurable tasks in belief-space planning


We propose a task representation for use in a belief-space planning framework. The representation is based on specialized object models that enable estimation of an abstract state of a robot with respect to an object. Each manipulation task is represented using a partition over these states defined by the set of known object models. Solutions to such tasks are constructed in a belief-space planner using visual and/or manual interactions with objects that condense belief in a target subset of the task partition. This partition integrates belief over states into a task belief without altering the original belief representation. As a result, sequences of tasks can be addressed that inherit the complete estimate of state over the entire history of observations. Demonstrations of the technique are presented in simulation and on a real robot. Results show that using this task representation and the belief-space planner, the robot is able to recognize objects, find target objects, and manipulate a set of objects to obtain a desired state.
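The central mechanism the abstract describes, aggregating a belief over abstract object-relative states into a task belief via a partition, can be sketched as follows. This is an illustrative sketch under assumed names and data structures (`task_belief`, dict-based belief and partition), not the authors' implementation.

```python
def task_belief(state_belief, partition):
    """Aggregate a belief over abstract states into a belief over task regions.

    state_belief: dict mapping state id -> probability (sums to 1)
    partition:    dict mapping state id -> task region label

    The underlying state belief is left untouched; the task belief is a
    derived view, which is what lets later tasks reuse the full estimate.
    """
    regions = {}
    for state, p in state_belief.items():
        region = partition[state]
        regions[region] = regions.get(region, 0.0) + p
    return regions

# Example: three abstract states grouped into two task regions.
belief = {"s0": 0.5, "s1": 0.25, "s2": 0.25}
partition = {"s0": "goal", "s1": "goal", "s2": "other"}
print(task_belief(belief, partition))  # {'goal': 0.75, 'other': 0.25}
```

Because the partition only reads the state belief, redefining the partition (a new task) requires no change to the belief representation itself, consistent with the paper's claim that sequences of tasks inherit the complete estimate of state.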

DOI: 10.1109/HUMANOIDS.2016.7803431

Cite this paper

@inproceedings{Ruiken2016ReconfigurableTI,
  title     = {Reconfigurable tasks in belief-space planning},
  author    = {Dirk Ruiken and Tiffany Q. Liu and Takeshi Takahashi and Roderic A. Grupen},
  booktitle = {Humanoids},
  year      = {2016}
}