Connecting Look and Feel: Associating the Visual and Tactile Properties of Physical Materials

@article{Yuan2017ConnectingLA,
  title={Connecting Look and Feel: Associating the Visual and Tactile Properties of Physical Materials},
  author={Wenzhen Yuan and Shaoxiong Wang and Siyuan Dong and Edward H. Adelson},
  journal={2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2017},
  pages={4494-4502}
}
For machines to interact with the physical world, they must understand the physical properties of objects and materials they encounter. We use fabrics as an example of a deformable material with a rich set of mechanical properties. A thin flexible fabric, when draped, tends to look different from a heavy stiff fabric. It also feels different when touched. Using a collection of 118 fabric samples, we captured color and depth images of draped fabrics along with tactile data from a high-resolution…
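The task here is naturally posed as joint embedding: encode a visual observation and a tactile observation of the same fabric so that their representations land close together. Below is a minimal PyTorch sketch of that idea, not the authors' implementation; the encoder architecture, embedding size, and the InfoNCE-style contrastive loss are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Small CNN mapping an image (visual or tactile) to an
    L2-normalized embedding vector. Illustrative architecture."""
    def __init__(self, in_channels, embed_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, embed_dim)

    def forward(self, x):
        h = self.conv(x).flatten(1)
        return F.normalize(self.fc(h), dim=1)

vision_enc = Encoder(in_channels=3)  # color/depth image of a draped fabric
touch_enc = Encoder(in_channels=3)   # tactile image from the touch sensor

def contrastive_loss(v, t, temperature=0.1):
    """InfoNCE-style loss: the (visual, tactile) pair from the same
    fabric should be closer than mismatched pairs within the batch."""
    logits = v @ t.T / temperature
    labels = torch.arange(v.size(0), device=v.device)
    return F.cross_entropy(logits, labels)

v = vision_enc(torch.randn(8, 3, 64, 64))
t = touch_enc(torch.randn(8, 3, 64, 64))
loss = contrastive_loss(v, t)

With one encoder per modality trained end to end on matching pairs, cross-modal retrieval reduces to nearest-neighbor search in the shared space.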
Teaching Cameras to Feel: Estimating Tactile Physical Properties of Surfaces From Images
This work introduces the challenging task of estimating a set of tactile physical properties from visual information and develops a cross-modal framework comprising an adversarial objective and a novel visuo-tactile joint classification loss.
Active Clothing Material Perception Using Tactile Sensing and Deep Learning
This work proposes a new framework for an active tactile perception system that combines vision and touch, with the potential to enable robots to help humans with varied clothing-related housework.
Deep Visuo-Tactile Learning: Estimation of Tactile Properties from Images
A model is proposed to estimate the degree of tactile properties (e.g., the level of slipperiness or roughness) from visual perception alone; it extends an encoder-decoder network in which the latent variables are visual and tactile features.
Deep Visuo-Tactile Learning: Estimation of Material Properties from Images
A method for deep visuo-tactile learning in which an encoder-decoder network is trained with an intermediate layer in an unsupervised manner, with images as input and tactile sequences as output; the results show that features are indeed expressed continuously and that the method can handle unknown objects in its feature space.
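As a rough picture of such an encoder-decoder, the sketch below compresses an image to a latent "material feature" vector and decodes it into a fixed-length tactile sequence. All shapes, layer sizes, and the reconstruction loss are illustrative assumptions, not the paper's architecture.

import torch
import torch.nn as nn

class VisuoTactileAE(nn.Module):
    """Illustrative encoder-decoder: image in, tactile sequence out."""
    def __init__(self, latent_dim=32, seq_len=100):
        super().__init__()
        # Encoder: image -> latent vector (the shared visuo-tactile feature)
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, latent_dim),
        )
        # Decoder: latent vector -> 1-D tactile sequence (e.g., force over time)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, seq_len),
        )

    def forward(self, image):
        z = self.encoder(image)
        return self.decoder(z), z

model = VisuoTactileAE()
image = torch.randn(4, 3, 64, 64)    # batch of object images
tactile_true = torch.randn(4, 100)   # paired tactile recordings
tactile_pred, latent = model(image)
loss = nn.functional.mse_loss(tactile_pred, tactile_true)

Training with images as input and tactile sequences as the reconstruction target forces the intermediate latent layer to encode whatever visual cues predict touch, which is what makes its feature space usable for unknown objects.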
Deep Visuo-Tactile Learning: Estimation of Tactile Properties from Images (Extended Abstract)
A model is proposed to estimate the degree of tactile properties (e.g., the level of slipperiness or roughness) from visual perception alone; it extends an encoder-decoder network in which the latent variables are visual and tactile features.
ViTac: Feature Sharing Between Vision and Tactile Sensing for Cloth Texture Recognition
This paper proposes a new fusion method named Deep Maximum Covariance Analysis (DMCA) to learn a joint latent space for sharing features between vision and tactile sensing, and finds that the perception performance of either vision or tactile sensing can be improved by employing the shared representation space, compared to learning from unimodal data.
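DMCA is not an off-the-shelf routine, but the classical maximum covariance analysis it deepens reduces to an SVD of the cross-covariance between the two feature sets. The sketch below shows that linear core on hypothetical pre-extracted visual and tactile features; DMCA replaces the raw inputs with learned deep features.

import numpy as np

def max_covariance_analysis(X, Y, k=10):
    """X: (n, dx) visual features; Y: (n, dy) tactile features.
    Projects both onto the k direction pairs of maximal cross-covariance."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    C = Xc.T @ Yc / (len(X) - 1)        # cross-covariance matrix, (dx, dy)
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    return Xc @ U[:, :k], Yc @ Vt[:k].T  # coordinates in the shared space

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))   # e.g., CNN features of cloth images
Y = rng.normal(size=(200, 32))   # e.g., features of tactile readings
Zx, Zy = max_covariance_analysis(X, Y, k=8)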
From Pixels to Percepts: Highly Robust Edge Perception and Contour Following Using Deep Learning and an Optical Biomimetic Tactile Sensor
This letter applies deep learning to an optical biomimetic tactile sensor, the TacTip, which images an array of papillae inside its sensing surface analogous to structures within human skin, showing that a deep convolutional neural network can give reliable edge perception and, thus, a robust policy for planning contact points to move around object contours.
Connecting Touch and Vision via Cross-Modal Prediction
This work investigates the cross-modal connection between vision and touch with a new conditional adversarial model that incorporates the scale and location information of the touch and demonstrates that the model can produce realistic visual images from tactile data and vice versa.
Dynamic Modeling of Hand-Object Interactions via Tactile Sensing
This work takes a step toward dynamics modeling of hand-object interactions from dense tactile sensing, which opens the door for future applications in activity learning, human-computer interaction, and imitation learning for robotics.
Learning Intuitive Physics with Multimodal Generative Models
This paper presents a perception framework that fuses visual and tactile feedback to make predictions about the expected motion of objects in dynamic scenes, using a novel See-Through-your-Skin sensor that provides high resolution multimodal sensing of contact surfaces.

References

Showing 1-10 of 29 references.
Can you see what you feel? Color and folding properties affect visual-tactile material discrimination of fabrics.
Using a visual-tactile matching task, it is shown that humans use both folding and color information when matching the visual and tactile properties of fabrics.
Estimating the Material Properties of Fabric from Video
A framework is presented that automatically analyzes videos of fabrics moving under various unknown wind forces and recovers two key material properties of the fabric: stiffness and area weight.
Joint embeddings of shapes and images via CNN image purification
A joint embedding space populated by both 3D shapes and 2D images of objects is constructed, where the distances between embedded entities reflect similarity between the underlying objects; this facilitates comparison between entities of either form and allows for cross-modality retrieval.
Retrographic sensing for the measurement of surface texture and shape
We describe a novel device that can be used as a 2.5D “scanner” for acquiring surface texture and shape. The device consists of a slab of clear elastomer covered with a reflective skin. When an…
Visual perception of materials and their properties
A general theory of material perception is suggested, in which the visual system does not actually estimate physical parameters of materials and objects; rather, the brain is remarkably adept at building 'statistical generative models' that capture the natural degrees of variation in appearance between samples.
Sensing and Recognizing Surface Textures Using a GelSight Sensor
  • R. Li, E. Adelson
  • Computer Science
  • 2013 IEEE Conference on Computer Vision and Pattern Recognition
  • 2013
A simple yet effective texture recognition system based on local binary patterns, enhanced by a multi-scale pyramid and a Hellinger distance metric; the results suggest that the GelSight sensor can be useful for material recognition by robots.
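The single-scale core of that pipeline can be sketched with standard tools: compute uniform local binary pattern histograms and compare them with the Hellinger distance. The snippet assumes scikit-image's local_binary_pattern and omits the multi-scale pyramid; it is illustrative rather than the authors' code.

import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray, P=8, R=1.0):
    """L1-normalized histogram of uniform LBP codes for a grayscale image."""
    codes = local_binary_pattern(gray, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2))
    return hist / hist.sum()

def hellinger(h1, h2):
    """Hellinger distance between two normalized histograms (0 = identical)."""
    return np.sqrt(0.5 * np.sum((np.sqrt(h1) - np.sqrt(h2)) ** 2))

# Compare two (here random, stand-in) tactile texture images:
a = np.random.rand(128, 128)
b = np.random.rand(128, 128)
print(hellinger(lbp_histogram(a), lbp_histogram(b)))

Classification then amounts to nearest-neighbor matching of histograms under this distance; the multi-scale pyramid extends this by combining histograms computed at several image scales.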
A Two-Level Generative Model for Cloth Representation and Shape from Shading
  • Feng Han, Song-Chun Zhu
  • Mathematics, Computer Science
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2007
This study is an attempt to revisit Marr's idea of computing the 2½D sketch from the primal sketch, based on a two-level generative sketch representation that is applicable to general shape-from-X problems.
Estimating object hardness with a GelSight touch sensor
A novel method of hardness sensing that does not require accurate control of contact conditions is described, with which a robot is able to more easily infer the hardness of touched objects, thereby improving its object recognition as well as its manipulation strategy.
On seeing stuff: the perception of materials by humans and machines
  • E. Adelson
  • Computer Science, Engineering
  • IS&T/SPIE Electronic Imaging
  • 2001
The perception of objects is a well-developed field, but the perception of materials has been studied rather little. This is surprising given how important materials are for humans, and how important…
Capturing and animating occluded cloth
The shape of moving cloth is captured using a custom set of color markers printed on the surface of the cloth, with a novel data-driven hole-filling technique to fill occluded regions.