Dynamic Modeling of Hand-Object Interactions via Tactile Sensing

@article{Zhang2021DynamicMO,
  title={Dynamic Modeling of Hand-Object Interactions via Tactile Sensing},
  author={Qiang Zhang and Yunzhu Li and Yiyue Luo and Wan Shou and Michael Foshey and Junchi Yan and Joshua B. Tenenbaum and Wojciech Matusik and Antonio Torralba},
  journal={2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year={2021},
  pages={2874-2881}
}
Tactile sensing is critical for humans to perform everyday tasks. While significant progress has been made in analyzing object grasping from vision, it remains unclear how we can utilize tactile sensing to reason about and model the dynamics of hand-object interactions. In this work, we employ a high-resolution tactile glove to perform four different interactive activities on a diverse set of objects. We propose a framework for predicting the 3D locations of both the hand and the… 
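
As a rough illustration of the prediction task described in the abstract, the sketch below maps a single tactile-glove frame to 3D keypoint locations for the hand and the object. This is a minimal PyTorch sketch under stated assumptions, not the authors' architecture: the encoder layout, the 32x32 glove resolution, and the keypoint counts (21 hand joints, 8 object corners) are all illustrative guesses.

# Minimal sketch (an assumption, not the paper's model): a small CNN
# encodes one tactile-glove pressure frame, and an MLP head regresses
# 3D keypoint locations for the hand and the object. The 32x32 glove
# resolution and keypoint counts are illustrative, not from the paper.
import torch
import torch.nn as nn

class TactileToPose(nn.Module):
    def __init__(self, n_hand_kp=21, n_obj_kp=8):
        super().__init__()
        self.n_hand_kp, self.n_obj_kp = n_hand_kp, n_obj_kp
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # -> (batch, 128)
        )
        self.head = nn.Sequential(
            nn.Linear(128, 256), nn.ReLU(),
            nn.Linear(256, (n_hand_kp + n_obj_kp) * 3),  # xyz per keypoint
        )

    def forward(self, tactile_frame):
        # tactile_frame: (batch, 1, H, W) pressure map from the glove
        feats = self.encoder(tactile_frame)
        kp = self.head(feats).view(-1, self.n_hand_kp + self.n_obj_kp, 3)
        return kp[:, :self.n_hand_kp], kp[:, self.n_hand_kp:]

# Example: one 32x32 frame -> (1, 21, 3) hand and (1, 8, 3) object keypoints
model = TactileToPose()
hand_xyz, obj_xyz = model(torch.randn(1, 1, 32, 32))

Since the paper models the dynamics of interactions, a temporal model over a window of consecutive frames would be the natural extension of this single-frame sketch.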

References

SHOWING 1-10 OF 70 REFERENCES
Learning the signatures of the human grasp using a scalable tactile glove
TLDR
Tactile patterns obtained from a scalable sensor-embedded glove and deep convolutional neural networks help to explain how the human hand can identify and grasp individual objects and estimate their weights.
Manipulation by Feel: Touch-Based Control with Deep Predictive Models
TLDR
This paper proposes deep tactile MPC, a framework for learning to perform tactile servoing from raw tactile sensor inputs, without manual supervision, and shows that this method enables a robot equipped with a GelSight-style tactile sensor to manipulate a ball, analog stick, and 20-sided die.
Interpreting and Predicting Tactile Signals via a Physics-Based and Data-Driven Framework
TLDR
This research extends robotic tactile sensing beyond reduced-order models through the automated creation of a precise tactile dataset for the BioTac over diverse physical interactions, and through neural-network-based mappings from raw BioTac signals to low-dimensional experimental data and, more importantly, to high-density FE deformation fields.
Grip Stabilization of Novel Objects Using Slip Prediction
TLDR
This article formulates a supervised-learning problem to predict the future occurrence of slip from high-dimensional tactile information provided by a BioTac sensor, and demonstrates how different input features, slip prediction time horizons, and available tactile information channels impact prediction accuracy.
ContactDB: Analyzing and Predicting Grasp Contact via Thermal Imaging
TLDR
This work presents ContactDB, a novel dataset of contact maps for household objects that captures the rich hand-object contact that occurs during grasping, enabled by the use of a thermal camera.
Towards force sensing from vision: Observing hand-object interactions to infer manipulation forces
TLDR
A novel, non-intrusive approach is presented for estimating contact forces during hand-object interactions, relying solely on visual input provided by a single RGB-D camera, showing that force sensing from vision (FSV) is indeed feasible.
Learning human–environment interactions using conformal tactile textiles
TLDR
A textile-based tactile learning platform is presented that can record, monitor, and learn human–environment interactions; the artificial-intelligence-powered sensing textiles are shown to classify humans’ sitting poses, motions, and other interactions with the environment.
iCLAP: shape recognition by combining proprioception and touch sensing
TLDR
A novel method named Iterative Closest Labeled Point (iCLAP) is presented to fundamentally link kinesthetic cues and tactile patterns, and its extensions for recognizing object shapes are also introduced.
Making Sense of Vision and Touch: Self-Supervised Learning of Multimodal Representations for Contact-Rich Tasks
TLDR
This work uses self-supervision to learn a compact and multimodal representation of sensory inputs, which can then be used to improve the sample efficiency of policy learning in deep reinforcement learning algorithms.
Intelligent Carpet: Inferring 3D Human Pose from Tactile Signals
TLDR
A low-cost, high-density, large-scale intelligent carpet is built that enables seamless real-time recording of human-floor tactile interactions, and a deep neural network model is designed and implemented to infer 3D human poses using only the tactile information.