Corpus ID: 7170651

A Robotic System for Reaching in Dense Clutter that Integrates Model Predictive Control, Learning, Haptic Mapping, and Planning

@inproceedings{Bhattacharjee2014ARS,
  title={A Robotic System for Reaching in Dense Clutter that Integrates Model Predictive Control, Learning, Haptic Mapping, and Planning},
  author={Tapomayukh Bhattacharjee and Phillip M. Grice and Ariel Kapusta and Marc D. Killpack and Daehyung Park and Charles C. Kemp},
  year={2014}
}
We present a system that enables a robot to reach locations in dense clutter using only haptic sensing. Our system integrates model predictive control [1], learned initial conditions [2], tactile recognition of object types [3], haptic mapping, and geometric planning to efficiently reach locations using whole-arm tactile sensing [4]. We motivate our work, present a system architecture, summarize each component of the system, and present results from our evaluation of the system reaching to…
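As a rough illustration of how such a pipeline could be put together, here is a minimal, hypothetical Python sketch. Every class and function name is an illustrative placeholder, not the authors' implementation; the MPC step, learned initial-condition selection, and tactile recognition are stubbed out.

# Hypothetical orchestration of the components named in the abstract.
# All names below are placeholders, not the authors' actual code.
import numpy as np

class HapticMap:
    """Stores contact locations and inferred object types (e.g., rigid/soft)."""
    def __init__(self):
        self.contacts = []                     # list of (position, label) pairs

    def add(self, position, label):
        self.contacts.append((np.asarray(position), label))

def select_initial_condition(goal, clutter_category):
    """Placeholder for the learned initial-condition selection [2]."""
    return np.zeros(7)                         # e.g., a 7-DOF starting configuration

def mpc_step(q, goal, haptic_map, force_limit=5.0):
    """Placeholder for one step of the contact-regulating MPC [1].
    Returns the next joint configuration and any new (position, force) contacts."""
    q_next = q + 0.01 * np.random.randn(7)     # stand-in for the real update
    new_contacts = []                          # taxel contacts would be reported here
    return q_next, new_contacts

def reach(goal, clutter_category, max_steps=500):
    q = select_initial_condition(goal, clutter_category)
    hmap = HapticMap()
    for _ in range(max_steps):
        q, contacts = mpc_step(q, goal, hmap)
        for pos, force in contacts:
            label = "rigid" if force > 2.0 else "soft"   # stand-in for tactile recognition [3]
            hmap.add(pos, label)
        # If progress stalls, a geometric planner would replan a path through
        # free or soft regions of the haptic map (not shown here).
    return q, hmap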

Citations

Interleaving planning and control for efficient haptically-guided reaching in unknown environments

A haptically guided method that interleaves planning and control with a haptic mapping framework, using a previously published contact-regulating controller based on model predictive control to reach in an initially unknown environment with only haptic sensing.

Model predictive control for fast reaching in clutter

This paper presents a multi-time-step MPC formulation that enables a robot to rapidly reach a target position in dense clutter, while regulating whole-body contact forces to be below a given threshold, and conducts trials using a real 7 degree-of-freedom humanoid robot arm with whole-arm tactile sensing.
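In generic form, a multi-time-step MPC problem of this shape can be written as the following optimization; the symbols and costs here are illustrative and do not reproduce the paper's exact objective, dynamics model, or constraints:

\begin{aligned}
\min_{u_0,\ldots,u_{H-1}} \quad & \lVert x_H - x_{\mathrm{goal}} \rVert^2 \;+\; \lambda \sum_{t=0}^{H-1} \lVert u_t \rVert^2 \\
\text{subject to} \quad & q_{t+1} = f(q_t, u_t), \qquad x_t = \mathrm{FK}(q_t), \\
& \hat{f}_{c,t} \le f_{\max} \quad \text{for every sensed contact } c, \\
& \lvert \dot{q}_t \rvert \le \dot{q}_{\max},
\end{aligned}

where q_t are the joint angles, u_t the commanded inputs over a horizon of H steps, x_t the end-effector position from forward kinematics, \hat{f}_{c,t} the force predicted at sensed contact c (for example, from a locally linear contact-stiffness model), and f_max the whole-body contact force threshold.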

Adaptive motion control in uncertain environments using tactile feedback

This paper shows the performance of a tactile feedback controller in joint-space, which is not bound to the null space of the manipulator, and extends the tactile feedback control framework to hierarchical multi-space controllers with adaptive prioritization.
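As background for this summary, the standard null-space projection that such tactile controllers extend can be sketched as a small numpy example; this is a generic illustration, not the paper's joint-space or adaptively prioritized controllers.

# Task-priority control with a tactile correction projected into the
# null space of the primary task. Illustrative only.
import numpy as np

def prioritized_command(J_task, dx_task, dq_tactile):
    """J_task: task Jacobian (m x n); dx_task: desired task-space velocity (m,);
    dq_tactile: joint-space correction from tactile feedback (n,)."""
    J_pinv = np.linalg.pinv(J_task)
    N = np.eye(J_task.shape[1]) - J_pinv @ J_task    # null-space projector
    return J_pinv @ dx_task + N @ dq_tactile

# Example: 6-DOF task on a 7-DOF arm
J = np.random.randn(6, 7)
dq = prioritized_command(J, np.array([0.0, 0.0, 0.01, 0.0, 0.0, 0.0]), np.zeros(7))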

Combining tactile sensing and vision for rapid haptic mapping

An iterative algorithm is presented that enables a robot to infer dense haptic labels across visible surfaces when given a color-plus-depth (RGB-D) image along with a sequence of sparse haptic labels representative of what could be obtained via tactile sensing.
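A minimal sketch of the underlying idea is to spread sparse haptic labels over RGB-D pixels by nearest-neighbor assignment in a color-plus-depth feature space; this is a generic stand-in, not the paper's iterative algorithm.

# Spread sparse haptic labels over RGB-D pixels by nearest-neighbor
# assignment in feature space. Generic illustration only.
import numpy as np

def propagate_labels(features, labeled_idx, labels):
    """features: (N, d) per-pixel color+depth features;
    labeled_idx: indices of haptically sensed pixels; labels: their labels."""
    anchors = features[labeled_idx]                                    # (k, d)
    d2 = ((features[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)  # (N, k)
    return labels[np.argmin(d2, axis=1)]                              # dense label map

# Toy usage: 5 pixels with 4-D features, two of them haptically labeled
feats = np.random.rand(5, 4)
dense = propagate_labels(feats, np.array([0, 3]), np.array(["rigid", "soft"]))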

Motion Planning for Manipulators in Unknown Environments with Contact Sensing Uncertainty

This work presents the Collision Hypothesis Sets representation for computing a belief of occupancy from observations, and introduces a planning and control architecture that uses this representation to navigate through unknown environments.
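The belief-update idea behind such representations can be illustrated with a generic log-odds occupancy update from binary contact observations; the paper's Collision Hypothesis Sets representation is more sophisticated and is not reproduced here.

# Generic log-odds occupancy update from contact / no-contact observations.
import numpy as np

def update_occupancy(log_odds, cell, contact, p_hit=0.7, p_miss=0.4):
    """log_odds: per-voxel belief; cell: index of the voxel swept or touched;
    contact: True if a contact was sensed there."""
    p = p_hit if contact else p_miss
    log_odds[cell] += np.log(p / (1.0 - p))
    return log_odds

belief = np.zeros(100)                              # 100 voxels, prior occupancy 0.5
belief = update_occupancy(belief, 42, contact=True)
occupancy_prob = 1.0 / (1.0 + np.exp(-belief))      # back to probabilities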

Learning to navigate cloth using haptics

A controller that allows an arm-like manipulator to navigate deformable cloth garments in simulation through the use of haptic information is presented, and successful navigation of a robotic arm through a variety of garments is demonstrated.

A CRF that combines touch and vision for haptic mapping

This work presents an algorithm that uses touch and vision to efficiently produce a dense haptic map, and shows that the algorithm can use a convolutional neural network for material recognition from Bell et al., which the authors modified and fine-tuned.

Inferring Object Properties with a Tactile-Sensing Array Given Varying Joint Stiffness and Velocity

This paper develops an idealized physics-based lumped-element model of a robot with a compliant joint making contact with an object, and finds that, in contrast to 1-NN, LSTMs and multivariate HMMs successfully generalized to new robot motions with distinct velocities and joint stiffnesses.
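One plausible lumped-element form of such a model, written here purely as an illustration (the paper's exact parameterization is not reproduced), couples a compliant joint to a contacted object:

\begin{aligned}
\tau &= k_j\,(q_{\mathrm{des}} - q) - b_j\,\dot{q}, \\
f_c &= \begin{cases} k_o\,\delta + b_o\,\dot{\delta}, & \delta > 0, \\ 0, & \delta \le 0, \end{cases} \\
I\,\ddot{q} &= \tau - r\, f_c,
\end{aligned}

where k_j and b_j are the joint stiffness and damping, q_des the commanded joint angle, \delta the indentation of the object by the link, k_o and b_o the object's contact stiffness and damping, I the link inertia, and r the moment arm of the contact about the joint.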

Contact Localization for Robot Arms in Motion without Torque Sensing

This work proposes using Domain Randomization to train a neural network to localize contacts on robot arms in motion without joint torque observations, and uses a novel cylindrical projection encoding of the robot arm surface, which allows the network to use convolution layers to process input features and transposed convolution layers to predict contacts.
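A cylindrical projection of this kind can be sketched as a mapping from arm-surface points to a 2-D image indexed by axial position and circumferential angle; the bin sizes and frames below are illustrative assumptions, not the paper's encoding.

# Project arm-surface points into a (rows x cols) cylindrical image so a CNN
# can process whole-arm surface features. Illustrative assumptions throughout.
import numpy as np

def cylindrical_image(points, link_length, n_rows=16, n_cols=32):
    """points: (N, 3) surface points in the link frame, link axis along +z.
    Returns an (n_rows, n_cols) count image."""
    z = np.clip(points[:, 2] / link_length, 0.0, 1.0 - 1e-9)
    theta = (np.arctan2(points[:, 1], points[:, 0]) + np.pi) / (2.0 * np.pi)
    rows = (z * n_rows).astype(int)
    cols = np.minimum((theta * n_cols).astype(int), n_cols - 1)
    img = np.zeros((n_rows, n_cols))
    np.add.at(img, (rows, cols), 1.0)
    return img

img = cylindrical_image(np.random.randn(50, 3) * 0.05 + [0.0, 0.0, 0.2], 0.4)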

Inferring Object Properties from Incidental Contact with a Tactile Sensing Forearm

It is demonstrated that data-driven methods can be used to infer mechanical properties of objects from incidental contact with a robot's forearm and multivariate HMMs achieved high cross-validation accuracy and successfully generalized to new robot motions with distinct velocities and joint stiffnesses.
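The classification scheme alluded to here, one HMM per object class with the label chosen by maximum log-likelihood over a contact time series, can be sketched as follows. The hmmlearn package, the feature dimensions, and the state count are illustrative assumptions, not the paper's setup.

# One Gaussian HMM per object class; classify a new contact sequence by
# the model with the highest log-likelihood. Illustration only.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_models(sequences_by_class, n_states=4):
    models = {}
    for label, seqs in sequences_by_class.items():   # seqs: list of (T_i, d) arrays
        X = np.vstack(seqs)
        lengths = [len(s) for s in seqs]
        m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        m.fit(X, lengths)
        models[label] = m
    return models

def classify(models, seq):
    """seq: (T, d) array of per-timestep contact features."""
    return max(models, key=lambda label: models[label].score(seq))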

References

Showing 1-10 of 43 references.

Reaching in clutter with whole-arm tactile sensing

It is demonstrated that robots can use whole-arm tactile sensing to perceive clutter and maneuver within it, while keeping contact forces low, and a novel controller is presented that only requires haptic sensing, handles multiple contacts, and does not need an explicit model of the environment prior to contact.

A Vision-Based System for Grasping Novel Objects in Cluttered Environments

This paper tests the vision-based robotic grasping system using the STAIR (STanford AI Robots) platforms on two experiments: grasping novel objects and unloading items from a dishwasher.

HERB: a home exploring robotic butler

New algorithms for searching for objects, learning to navigate in cluttered dynamic indoor scenes, recognizing and registering objects accurately in high clutter using vision, manipulating doors and other constrained objects using caging grasps, grasp planning and execution in clutter, and manipulation on pose and torque constraint manifolds are presented.

Fast reaching in clutter while regulating forces using model predictive control

The work in this paper explicitly models robot arm dynamics and uses model predictive control with whole-arm tactile sensing to improve speed and force control, and describes a constraint that regulates joint velocities in order to mitigate unexpected impact forces while reaching to a goal.

Arm Teleoperation in Clutter Using Virtual Constraints from Real Sensor Data

We introduce CAT, a constraint-aware teleoperation method that can track continuously updating 6-DOF end-effector goals while avoiding environment collisions, self-collisions, and joint limits…

Learning to reach into the unknown: Selecting initial conditions when reaching in clutter

A data-driven approach that greatly improves a robot's success at reaching to a goal location in the unknown interior of an environment based on observable external properties, such as the category of the clutter and the locations of openings into the clutter.
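A minimal sketch of this kind of data-driven selection, assuming a small library of candidate initial configurations and a generic scikit-learn classifier (both placeholders, not the paper's learner or features):

# Learn a mapping from observable clutter properties to the index of the
# initial configuration most likely to succeed. Placeholder data and learner.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Training data: each row describes the scene (e.g., clutter category, opening
# locations); label = index of the initial configuration that succeeded.
X_train = np.random.rand(200, 5)             # placeholder scene features
y_train = np.random.randint(0, 4, 200)       # placeholder best-configuration index

selector = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
best_config_idx = selector.predict(np.random.rand(1, 5))[0]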

Methods for collision-free arm teleoperation in clutter using constraints from 3D sensor data

We introduce CAT, a constraint-aware teleoperation method that can track continuously updating 6-DOF end-effector goals while avoiding environment collisions, self-collisions, and joint limits…

Rapid categorization of object properties from incidental contact with a tactile sensing robot arm

It is demonstrated that data-driven methods can be used to rapidly categorize objects encountered through incidental contact on a robot arm, and a taxel-by-taxel classification approach can successfully categorize simultaneous contacts with multiple objects and can also identify outlier objects in the environment based on the prior associated with an object's likelihood in the given environment.

Navigation in three-dimensional cluttered environments for mobile manipulation

This work presents a fast, integrated approach to solve path planning in 3D using a combination of an efficient octree-based representation of the 3D world and an anytime search-based motion planner to improve planning speed.

Global Manipulation Planning in Robot Joint Space With Task Constraints

M. Stilman, IEEE Transactions on Robotics, 2010
This paper describes a representation of constrained motion for joint-space planners and develops two simple and efficient methods for constrained sampling of joint configurations: tangent-space sampling (TS) and first-order retraction (FR).
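First-order retraction can be sketched as a Jacobian pseudoinverse projection of a sampled configuration onto the constraint manifold; the constraint in the example below is a toy placeholder, not from the paper.

# First-order retraction: iterate q <- q - J(q)^+ c(q) until the constraint
# error c(q) is small. The constraint here is a toy example.
import numpy as np

def retract(q, constraint, jacobian, tol=1e-6, max_iters=50):
    for _ in range(max_iters):
        err = constraint(q)
        if np.linalg.norm(err) < tol:
            return q
        q = q - np.linalg.pinv(jacobian(q)) @ err
    return q

# Toy constraint: keep the sum of the first two joints fixed at 1.0
c = lambda q: np.array([q[0] + q[1] - 1.0])
J = lambda q: np.array([[1.0, 1.0] + [0.0] * (len(q) - 2)])
q_on_manifold = retract(np.random.rand(7), c, J)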