Corpus ID: 53076136

Tactile-Visual Integration for Task-Aware Grasping

@inproceedings{Zhang2018TactileVisualIF,
  title={Tactile-Visual Integration for Task-Aware Grasping},
  author={Mabel M. Zhang and Renaud Detry and Kostas Daniilidis},
  year={2018}
}
Tactile sensing is beneficial for complex manipulation tasks beyond pick and place, as the existing literature shows. While most existing work on perception for manipulation focuses on vision, a pre-contact modality, manipulation cannot start until contact is validated and the related conditions are assessed. A few recent studies have integrated vision and touch for object perception and grasping; however, they do not tackle the spatial correspondence problem between modalities, which we believe gives…


Tactile Perception And Visuotactile Integration For Robotic Exploration
TLDR
This work proposes and evaluates a learning-based method for visuotactile integration for grasping, and investigates making the tactile modality, local and slow by nature, more efficient for the task by predicting the most cost-effective moves using active exploration.
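
The active-exploration idea can be pictured with a minimal sketch: maintain a set of candidate touch locations and greedily pick the move whose predicted information gain per unit of motion cost is highest. The names here (`predict_info_gain`, `move_cost`) are hypothetical placeholders standing in for the learned predictors, not the paper's API, and the random scores are illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a learned predictor; returns a random score
# here purely so the sketch runs end to end.
def predict_info_gain(candidate, belief):
    """Predicted reduction in uncertainty if we touch at `candidate`."""
    return rng.random()

def move_cost(current, candidate):
    """Tactile sensing is local and slow, so travel distance dominates cost."""
    return np.linalg.norm(candidate - current) + 1e-6

def next_touch(current, candidates, belief):
    """Greedy active exploration: maximize info gain per unit of motion cost."""
    scores = [predict_info_gain(c, belief) / move_cost(current, c)
              for c in candidates]
    return candidates[int(np.argmax(scores))]

candidates = rng.uniform(-1, 1, size=(20, 3))   # candidate contact points (m)
print(next_touch(np.zeros(3), candidates, belief=None))
```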

References

Showing 1-10 of 39 references
Grasping and Manipulation of Unknown Objects Based on Visual and Tactile Feedback
TLDR
This work introduces a framework for tactile servoing that can realize specific tactile interaction patterns, for example to establish and maintain contact (grasping) or to explore and manipulate objects.
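
As a rough illustration of tactile servoing (a sketch under assumed interfaces, not the authors' controller), one can regulate simple contact features, such as total pressure and the contact centroid on a tactile array, toward a desired pattern with a proportional law:

```python
import numpy as np

def contact_features(tactile_image):
    """Reduce a pressure image to (total force, centroid x, centroid y)."""
    total = tactile_image.sum()
    if total <= 0:                       # no contact yet
        return np.zeros(3)
    ys, xs = np.indices(tactile_image.shape)
    cx = (xs * tactile_image).sum() / total
    cy = (ys * tactile_image).sum() / total
    return np.array([total, cx, cy])

def tactile_servo_step(tactile_image, desired, gain=0.1):
    """Proportional controller on tactile features; returns a small
    corrective fingertip motion (push along normal, slide tangentially)."""
    error = desired - contact_features(tactile_image)
    return gain * error

# One step: press harder and slide until the contact centroid reaches (8, 8).
frame = np.zeros((16, 16)); frame[4:6, 4:6] = 0.5
print(tactile_servo_step(frame, desired=np.array([4.0, 8.0, 8.0])))
```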
A visuo-tactile control framework for manipulation and exploration of unknown objects
We present a novel hierarchical control framework that unifies our previous work on tactile servoing with visual servoing approaches to allow for robust manipulation and exploration of unknown objects.
Grasp Pose Detection in Point Clouds
TLDR
A series of robotic experiments is reported that averages a 93% end-to-end grasp success rate for novel objects presented in dense clutter, an improvement driven by better grasp detection performance.
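
The common pattern behind such detectors, sketched here only generically, is to sample many grasp candidates from the point cloud and keep the highest-scoring ones; `score_grasp` stands in for the learned grasp-quality classifier and the sampler is a deliberate simplification.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_candidates(cloud, n=100):
    """Hypothetical sampler: a random cloud point plus a random approach yaw."""
    idx = rng.integers(0, len(cloud), size=n)
    yaw = rng.uniform(-np.pi, np.pi, size=n)
    return list(zip(cloud[idx], yaw))

def score_grasp(point, yaw):
    """Placeholder for a learned grasp-quality classifier."""
    return rng.random()

def detect_grasps(cloud, keep=5):
    """Sample candidates, then return the `keep` best-scoring ones."""
    cands = sample_candidates(cloud)
    cands.sort(key=lambda c: score_grasp(*c), reverse=True)
    return cands[:keep]

cloud = rng.uniform(-0.1, 0.1, size=(500, 3))   # toy point cloud (m)
print(len(detect_grasps(cloud)))
```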
Grasp adjustment on novel objects using tactile experience from similar local geometry
  • H. Dang, P. Allen
  • 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems
TLDR
This work extends prior work to grasp novel objects by exploiting local geometric similarity: a series of shape primitives is selected to parameterize potential local geometries that novel objects may share, and a tactile experience database is built that stores information about stable grasps on these local geometries.
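
One way to picture such a tactile experience database is a nearest-neighbor store mapping local-geometry descriptors to previously stable grasp adjustments. This is a sketch: the two-dimensional descriptors and the string-valued grasp records below are invented simplifications, not the paper's representation.

```python
import numpy as np

class TactileExperienceDB:
    """Stores (local-geometry descriptor, stable grasp adjustment) pairs and
    retrieves the adjustment whose local geometry best matches a new object."""

    def __init__(self):
        self.descriptors, self.grasps = [], []

    def add(self, descriptor, grasp_adjustment):
        self.descriptors.append(np.asarray(descriptor, float))
        self.grasps.append(grasp_adjustment)

    def lookup(self, descriptor):
        """Return the stored grasp with the nearest descriptor (Euclidean)."""
        d = np.asarray(descriptor, float)
        dists = [np.linalg.norm(d - x) for x in self.descriptors]
        return self.grasps[int(np.argmin(dists))]

db = TactileExperienceDB()
db.add([1.0, 0.2], "close fingers 5 mm, rotate wrist +10 deg")  # cylinder-like patch
db.add([0.1, 0.9], "shift palm 8 mm toward flat face")          # box-like patch
print(db.lookup([0.9, 0.3]))   # novel object with cylinder-like local geometry
```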
Fusing visual and tactile sensing for 3-D object reconstruction while grasping
TLDR
This work proposes an optimal estimation approach for the fusion of visual and tactile data under an object-symmetry constraint; the fusion is formulated as a state estimation problem and solved with an iterative extended Kalman filter.
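
The fusion step can be written as sequential Kalman measurement updates, one per modality. The sketch below uses fixed linear measurement models for brevity, whereas the paper iterates the linearization of nonlinear models; the state, measurement matrices, and all numbers are invented for illustration.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """One measurement update of a (linearized) Kalman filter."""
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)             # corrected state estimate
    P = (np.eye(len(x)) - K @ H) @ P    # corrected covariance
    return x, P

# Toy object-pose state: position of a reference point on the object.
x, P = np.zeros(3), np.eye(3)

H_vis = np.eye(3)                        # vision observes the full position...
x, P = kalman_update(x, P, np.array([0.10, 0.02, 0.30]), H_vis, 0.02 * np.eye(3))

H_tac = np.array([[0.0, 0.0, 1.0]])      # ...touch pins down one coordinate tightly
x, P = kalman_update(x, P, np.array([0.295]), H_tac, np.array([[1e-4]]))
print(x)
```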
The Feeling of Success: Does Touch Sensing Help Predict Grasp Outcomes?
TLDR
This work investigates whether touch sensing aids in predicting grasp outcomes within a multimodal sensing framework that combines vision and touch, evaluating visuo-tactile deep neural network models that directly predict grasp outcomes from either modality individually and from both modalities together.
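
A two-branch fusion network of this flavor can be sketched in a few lines of PyTorch. This is not the paper's architecture: the layer sizes, the 64x64 image, and the 19-dimensional tactile vector (for example, one reading per taxel) are arbitrary choices for illustration.

```python
import torch
import torch.nn as nn

class VisuoTactileNet(nn.Module):
    """Two-branch model: a small CNN embeds the image, an MLP embeds the
    tactile reading; the fused features predict grasp success probability."""

    def __init__(self, tactile_dim=19):
        super().__init__()
        self.vision = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())          # -> 32-d embedding
        self.tactile = nn.Sequential(
            nn.Linear(tactile_dim, 32), nn.ReLU())          # -> 32-d embedding
        self.head = nn.Sequential(
            nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, image, tactile):
        fused = torch.cat([self.vision(image), self.tactile(tactile)], dim=1)
        return torch.sigmoid(self.head(fused))              # P(grasp succeeds)

net = VisuoTactileNet()
p = net(torch.randn(2, 3, 64, 64), torch.randn(2, 19))
print(p.shape)   # torch.Size([2, 1])
```

Dropping either branch (feeding only `self.vision` or only `self.tactile` features to a per-modality head) gives the single-modality baselines the study compares against.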
Deep learning for detecting robotic grasps
TLDR
This work presents a two-step cascaded system with two deep networks, where the top detections from the first are re-evaluated by the second; this method improves performance on an RGB-D robotic grasping dataset and can be used to successfully execute grasps on two different robotic platforms.
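
The cascade structure itself is simple to sketch: rank all candidates with a cheap scorer, then re-evaluate only the shortlist with an expensive one. `fast_net` and `accurate_net` below are random-score placeholders for the two deep networks, and the candidate encoding is invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Placeholders for the two networks: the first is small and fast, the
# second larger and more accurate. Random scores stand in here.
def fast_net(candidate):
    return rng.random()

def accurate_net(candidate):
    return rng.random()

def cascaded_detect(candidates, top_k=10):
    """Two-step cascade: shortlist with the cheap network, then pick the
    best of the shortlist according to the expensive network."""
    shortlist = sorted(candidates, key=fast_net, reverse=True)[:top_k]
    return max(shortlist, key=accurate_net)

candidates = [rng.uniform(0, 1, size=5) for _ in range(200)]  # e.g. grasp rectangles
print(cascaded_detect(candidates))
```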
Learning hand-eye coordination for robotic grasping with deep learning and large-scale data collection
TLDR
The approach achieves effective real-time control, successfully grasps novel objects, and corrects mistakes by continuous servoing; it also illustrates that data from different robots can be combined to learn more reliable and effective grasping.
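
The servoing loop can be caricatured as repeatedly sampling candidate motor commands and executing whichever one a learned critic scores highest; re-planning from each new camera frame is what allows mistakes to be corrected mid-grasp. `grasp_success_critic` is a hypothetical stand-in for the learned success predictor.

```python
import numpy as np

rng = np.random.default_rng(3)

def grasp_success_critic(image, command):
    """Placeholder for a learned P(success | image, command) model."""
    return rng.random()

def servo_step(image, n_samples=64, scale=0.02):
    """Sample small Cartesian gripper displacements (m) and pick the one
    the critic rates most likely to lead to a successful grasp."""
    commands = rng.normal(0.0, scale, size=(n_samples, 3))
    scores = [grasp_success_critic(image, c) for c in commands]
    return commands[int(np.argmax(scores))]

# Each control cycle re-plans from the current camera frame.
frame = rng.random((64, 64, 3))
print(servo_step(frame))
```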
Task-relevant grasp selection: A joint solution to planning grasps and manipulative motion trajectories
TLDR
A manipulation capability index is defined as a function of both the task execution waypoints and the object grasping contact points. It is shown how this index can be combined with a likelihood function computed by a probabilistic model of grasp selection, enabling the planning of grasps that have a high likelihood of being stable but that also maximise the robot's capability to deliver a desired post-grasp task trajectory.
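
The joint objective reduces to scoring each grasp by the product of its stability likelihood and its capability index over the task waypoints, then taking the maximizer. Both scoring functions below are random-score placeholders for the paper's probabilistic model and capability index, and the grasp parameterization is invented.

```python
import numpy as np

rng = np.random.default_rng(4)

def stability_likelihood(grasp):
    """Placeholder for the probabilistic grasp-stability model."""
    return rng.random()

def capability_index(grasp, waypoints):
    """Placeholder capability index: how well this grasp lets the arm
    deliver the post-grasp trajectory through the task waypoints."""
    return rng.random()

def select_grasp(grasps, waypoints):
    """Joint objective: a stable grasp AND a deliverable task trajectory."""
    return max(grasps, key=lambda g: stability_likelihood(g)
                                     * capability_index(g, waypoints))

grasps = [rng.uniform(-1, 1, size=6) for _ in range(50)]   # contact-point params
waypoints = [np.array([0.4, 0.0, 0.3]), np.array([0.4, 0.2, 0.3])]
print(select_grasp(grasps, waypoints))
```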