Active Clothing Material Perception Using Tactile Sensing and Deep Learning

@article{Yuan2018ActiveCM,
  title={Active Clothing Material Perception Using Tactile Sensing and Deep Learning},
  author={Wenzhen Yuan and Yuchen Mo and Shaoxiong Wang and Edward H. Adelson},
  journal={2018 IEEE International Conference on Robotics and Automation (ICRA)},
  year={2018},
  pages={1-8}
}
  • Wenzhen Yuan, Yuchen Mo, Shaoxiong Wang, Edward H. Adelson
  • Published 2 November 2017
  • Computer Science, Engineering
  • 2018 IEEE International Conference on Robotics and Automation (ICRA)
Humans represent and discriminate objects within the same category by their properties, and an intelligent robot should be able to do the same. In this paper, we build a robot system that autonomously perceives object properties through touch. We work on the common object category of clothing. The robot moves under the guidance of an external Kinect sensor, squeezes the clothes with a GelSight tactile sensor, and then recognizes the 11 properties of the clothing according to the…
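The abstract describes a pipeline in which GelSight tactile images, collected while the robot squeezes a garment, are fed to a deep network that predicts multiple clothing properties. The paper's actual network is not shown on this page, so the sketch below is only a minimal illustration of one way such a multi-property classifier could be set up in PyTorch; the ResNet-18 backbone, the ClothingPropertyNet name, and the per-property class counts are assumptions, not the authors' implementation.

import torch
import torch.nn as nn
from torchvision import models

class ClothingPropertyNet(nn.Module):
    """Hypothetical multi-output classifier over GelSight tactile images."""
    def __init__(self, classes_per_property):
        super().__init__()
        # Reuse an ImageNet-pretrained backbone as a tactile-image feature extractor.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.features = nn.Sequential(*list(backbone.children())[:-1])  # -> (B, 512, 1, 1)
        # One classification head per clothing property (e.g. thickness, fuzziness, softness).
        self.heads = nn.ModuleList([nn.Linear(512, n) for n in classes_per_property])

    def forward(self, x):
        f = self.features(x).flatten(1)            # (B, 512)
        return [head(f) for head in self.heads]    # one logit tensor per property

# 11 properties, each discretized into a few levels (the class counts are assumed).
net = ClothingPropertyNet([4, 3, 3, 3, 4, 3, 3, 3, 3, 2, 2])
logits = net(torch.randn(8, 3, 224, 224))          # a batch of 8 tactile frames
labels = [torch.randint(0, l.shape[1], (8,)) for l in logits]
loss = sum(nn.functional.cross_entropy(l, y) for l, y in zip(logits, labels))

Summing one cross-entropy term per property is simply one plausible way to train all heads jointly from a single tactile feature vector.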
Morphology-Specific Convolutional Neural Networks for Tactile Object Recognition with a Multi-Fingered Hand
TLDR
This paper explores the possibility of using a morphology-specific convolutional neural network (MS-CNN) for distributed tactile sensors on multi-fingered hands, and achieves an object recognition rate of over 95% with 20 objects.
Towards Effective Tactile Identification of Textures using a Hybrid Touch Approach
TLDR
This paper develops three machine-learning methods within a framework to discriminate between surface textures and shows that a good initial estimate can be obtained via touch data, which can be further refined via sliding; combining both touch and sliding data results in 98% classification accuracy over unseen test data.
Teaching Cameras to Feel: Estimating Tactile Physical Properties of Surfaces From Images
TLDR
This work introduces the challenging task of estimating a set of tactile physical properties from visual information and develops a cross-modal framework comprising an adversarial objective and a novel visuo-tactile joint classification loss.
Texture Recognition Based on Perception Data from a Bionic Tactile Sensor
TLDR
A bionic tactile sensor is used to collect vibration data while sliding against materials of interest; three machine learning algorithms and a convolutional neural network trained on these data demonstrate high accuracy and excellent robustness.
Vision-Based Tactile Sensor Mechanism for the Estimation of Contact Position and Force Distribution Using Deep Learning
TLDR
A vision-based tactile sensor system is described that uses the sensor's image information, together with input loads under various motions, to train a neural network to estimate tactile contact position, area, and force distribution.
Soft Robotic Finger Embedded with Visual Sensor for Bending Perception
TLDR
A novel soft robotic finger embedded with a visual sensor is proposed for perception; it consists of a colored soft inner chamber, an outer structure, and an endoscope camera, and a bending perception algorithm based on image preprocessing and deep learning is proposed.
Spatio-temporal Attention Model for Tactile Texture Recognition
TLDR
A novel Spatio-Temporal Attention Model (STAM) for tactile texture recognition, the first of its kind to the best of the authors' knowledge, is proposed; it can be applied to facilitate robot tasks such as grasping and manipulation.
Lifelong Visual-Tactile Cross-Modal Learning for Robotic Material Perception
  • W. Zheng, H. Liu, F. Sun
  • Computer Science, Medicine
  • IEEE Transactions on Neural Networks and Learning Systems
  • 2021
TLDR
A visual-tactile cross-modal learning framework for robotic material perception is proposed and addressed in the lifelong learning setting, which makes it possible to incrementally improve the robot's cross-modal material perception ability.
Object Recognition Through Active Sensing Using a Multi-Fingered Robot Hand with 3D Tactile Sensors
This paper investigates tactile object recognition with relatively densely distributed force vector measurements and evaluates what kind of tactile information is beneficial for object recognition.
Variable In-Hand Manipulations for Tactile-Driven Robot Hand via CNN-LSTM
TLDR
Two fingers of the Allegro hand are used, and each fingertip is anthropomorphically shaped and equipped not only with 6-axis force-torque (F/T) sensors, but also with uSkin tactile sensors, which provide 24 tri-axial measurements per fingertip.

References

SHOWING 1-10 OF 30 REFERENCES
Shape-independent hardness estimation using deep learning and a GelSight tactile sensor
TLDR
This work introduces a novel method for hardness estimation based on the GelSight tactile sensor, and shows that the neural network model can estimate the hardness of objects with different shapes and hardness values ranging from 8 to 87 on the Shore 00 scale.
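This reference maps GelSight data to a scalar hardness value independently of object shape. The exact model is not described on this page; the sketch below is only a hypothetical illustration of regressing Shore 00 hardness from a short sequence of GelSight frames recorded during a press, and the CNN-plus-LSTM layout and all layer sizes are assumptions.

import torch
import torch.nn as nn

class HardnessRegressor(nn.Module):
    """Hypothetical regressor: GelSight press sequence -> scalar hardness estimate."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.cnn = nn.Sequential(                             # per-frame feature extractor
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim), nn.ReLU(),
        )
        self.rnn = nn.LSTM(feat_dim, 64, batch_first=True)    # aggregate the press over time
        self.out = nn.Linear(64, 1)                           # scalar hardness output

    def forward(self, frames):                                # frames: (B, T, 3, H, W)
        b, t = frames.shape[:2]
        feats = self.cnn(frames.flatten(0, 1)).view(b, t, -1)
        _, (h, _) = self.rnn(feats)
        return self.out(h[-1]).squeeze(-1)                    # (B,) hardness on the Shore 00 scale

pred = HardnessRegressor()(torch.randn(2, 10, 3, 128, 128))   # two 10-frame press sequences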
Connecting Look and Feel: Associating the Visual and Tactile Properties of Physical Materials
TLDR
This work captures color and depth images of draped fabrics along with tactile data from a high-resolution touch sensor and seeks to associate the information from vision and touch by jointly training CNNs across the three modalities.
Tactile-Object Recognition From Appearance Information
TLDR
This paper presents a bag-of-features framework that uses several tactile-image descriptors, some adapted from the vision domain and others novel, to estimate a probability distribution over object identity as an unknown object is explored.
Sensing and Recognizing Surface Textures Using a GelSight Sensor
  • R. Li, E. Adelson
  • Computer Science
  • 2013 IEEE Conference on Computer Vision and Pattern Recognition
  • 2013
TLDR
A simple yet effective texture recognition system based on local binary patterns, enhanced with a multi-scale pyramid and a Hellinger distance metric; the results suggest that the GelSight sensor can be useful for material recognition by robots.
Robot-Aided Cloth Classification Using Depth Information and CNNs
TLDR
This system uses a robot arm to extract a garment and show it to a depth camera; using only depth images of a partial view of the garment as input, a deep convolutional neural network is trained to classify different types of garments.
Multi-sensorial and explorative recognition of garments and their material properties in unconstrained environment
TLDR
This work uses a robot equipped with RGB-D, tactile, and photometric stereo sensors that interacts with the garment through a combination of different basic actions to recognize the manipulated garment's type, fabric pattern, and material.
Classification of clothing using interactive perception
TLDR
A system is presented for automatically extracting and classifying items in a pile of laundry using only visual sensors; on average, classification rates using robot interaction are 59% higher than those without interaction.
Tactile identification of objects using Bayesian exploration
TLDR
The exploration algorithm was augmented with reinforcement learning whereby its internal representations of objects evolved according to its cumulative experience with them, allowing the algorithm to compensate for drift in the performance of the anthropomorphic robot hand and the ambient conditions of testing, improving accuracy while reducing the number of exploratory movements required to identify an object.
Improved GelSight tactile sensor for measuring geometry and slip
TLDR
A new design of GelSight for a robot gripper is described, using a Lambertian membrane and a new illumination system, which gives greatly improved geometric accuracy while retaining the compact size.
Tactile sensing in dexterous robot hands - Review
TLDR
The current state of the art in manipulation and grasping applications that involve an artificial sense of touch is reviewed, covering algorithms and tactile feedback-based control systems that exploit signals from the sensors.