Attention-Based Robot Learning of Haptic Interaction

@inproceedings{Moringen2020AttentionBasedRL,
  title={Attention-Based Robot Learning of Haptic Interaction},
  author={Alexandra Moringen and Sascha Fleer and Guillaume Walck and Helge J. Ritter},
  booktitle={EuroHaptics},
  year={2020}
}
Haptic interaction, involved in almost any physical interaction humans perform with their environment, is a highly sophisticated and, to a large extent, computationally unmodelled process. Unlike humans, who seamlessly handle a complex mixture of haptic features and profit from their integration over space and time, even the most advanced robots remain strongly constrained in performing contact-rich interaction tasks. In this work, we approach this problem by demonstrating the success of…
