TactileSGNet: A Spiking Graph Neural Network for Event-based Tactile Object Recognition

@article{Gu2020TactileSGNetAS,
  title={TactileSGNet: A Spiking Graph Neural Network for Event-based Tactile Object Recognition},
  author={Fuqiang Gu and Weicong Sng and Tasbolat Taunyazov and Harold Soh},
  journal={2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year={2020},
  pages={9876-9882}
}
  • Published 1 August 2020
  • Computer Science, Engineering
Tactile perception is crucial for a variety of robot tasks including grasping and in-hand manipulation. New advances in flexible, event-driven, electronic skins may soon endow robots with touch perception capabilities similar to humans. These electronic skins respond asynchronously to changes (e.g., in pressure, temperature), and can be laid out irregularly on the robot’s body or end-effector. However, these unique features may render current deep learning approaches such as convolutional… 
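Because the taxels of such an electronic skin can be laid out irregularly, a natural alternative to a fixed convolutional grid is to connect taxels into a graph. A minimal sketch of one such construction (a k-nearest-neighbour graph over hypothetical 2-D taxel coordinates; the paper's exact graph-building method may differ):

```python
import math

def knn_graph(coords, k=2):
    """Build an edge set connecting each taxel to its k nearest neighbours.

    coords: list of (x, y) taxel positions, irregularly placed on the skin.
    Returns a set of undirected edges (i, j) with i < j.
    """
    edges = set()
    for i, (xi, yi) in enumerate(coords):
        # Sort all other taxels by Euclidean distance to taxel i.
        dists = sorted(
            (math.hypot(xi - xj, yi - yj), j)
            for j, (xj, yj) in enumerate(coords) if j != i
        )
        for _, j in dists[:k]:
            edges.add((min(i, j), max(i, j)))
    return edges

# Hypothetical 4-taxel layout (illustrative coordinates only).
taxels = [(0.0, 0.0), (1.0, 0.1), (0.2, 1.0), (2.0, 2.0)]
print(knn_graph(taxels, k=2))
```

The resulting edge set can then serve as the neighbourhood structure for graph convolutions over per-taxel event features.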
Exploiting Spiking Dynamics with Spatial-temporal Feature Normalization in Graph Learning
TLDR
A general spike-based modeling framework that enables the direct training of SNNs for graph learning through spatial-temporal unfolding of the spiking data flows of node features; it incorporates graph convolution filters into the spiking dynamics and formalizes a synergistic learning paradigm.
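The "spatial-temporal unfolding" above refers to unrolling the spiking neuron dynamics over discrete time steps. A minimal sketch of the underlying leaky integrate-and-fire (LIF) dynamic, with illustrative decay and threshold constants rather than any paper's specific values:

```python
def lif_unroll(inputs, decay=0.9, threshold=1.0):
    """Leaky integrate-and-fire neuron unrolled over discrete time steps.

    The membrane potential leaks by `decay` each step, integrates the
    input current, emits a spike when it crosses `threshold`, and is
    hard-reset to zero after firing.
    """
    v, spikes = 0.0, []
    for x in inputs:
        v = decay * v + x            # leak, then integrate input
        if v >= threshold:
            spikes.append(1)
            v = 0.0                  # hard reset after a spike
        else:
            spikes.append(0)
    return spikes
```

Training frameworks of this kind unfold this loop across time (and across graph neighbourhoods for the spatial dimension) so gradients can flow through the full spike train.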
What Robot do I Need? Fast Co-Adaptation of Morphology and Control using Graph Neural Networks
TLDR
Evaluations in simulation show that the new method can co-adapt agents within a limited number of production cycles by efficiently combining design optimization with offline reinforcement learning, allowing direct application to real-world co-adaptation tasks in future work.
EventDrop: data augmentation for event-based learning
TLDR
This paper introduces EventDrop, a new method for augmenting asynchronous event data that improves the generalization of deep models by increasing the diversity of training data through dropping events selected with various strategies.
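The simplest of the dropping strategies described above is random dropping. A sketch under the usual event-camera convention of (timestamp, x, y, polarity) tuples (the drop ratio and seed here are illustrative; the paper also describes dropping by time window and by spatial region, not shown):

```python
import random

def event_drop(events, drop_ratio=0.2, seed=0):
    """Randomly drop a fraction of events (one EventDrop strategy).

    events: list of (t, x, y, polarity) tuples from an event sensor.
    Returns a thinned copy of the event stream.
    """
    rng = random.Random(seed)  # seeded for reproducibility
    return [e for e in events if rng.random() >= drop_ratio]

# Hypothetical stream of 100 events at one pixel.
events = [(t, 0, 0, 1) for t in range(100)]
kept = event_drop(events, drop_ratio=0.2)
print(len(kept))  # roughly 80 of the 100 events survive
```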

References

Showing 1-10 of 39 references
Event-Driven Visual-Tactile Sensing and Learning for Robots
TLDR
This work contributes an event-driven visual-tactile perception system, comprising a novel biologically-inspired tactile sensor and multi-modal spike-based learning, and the Visual-Tactile Spiking Neural Network (VT-SNN), which enables fast perception when coupled with event sensors.
TactileGCN: A Graph Convolutional Network for Predicting Grasp Stability with Tactile Sensors
TLDR
This work explores an alternative way of exploiting tactile information to predict grasp stability by leveraging graph-like representations of tactile data, which preserve the actual spatial arrangement of the sensor’s taxels and their locality.
More Than a Feeling: Learning to Grasp and Regrasp Using Vision and Touch
TLDR
An end-to-end action-conditional model that learns regrasping policies from raw visuo-tactile data and outperforms a variety of baselines at estimating grasp adjustment outcomes, selecting efficient grasp adjustments for quick grasping, and reducing the amount of force applied at the fingers, while maintaining competitive performance.
Towards Effective Tactile Identification of Textures using a Hybrid Touch Approach
TLDR
This paper develops three machine-learning methods within a framework to discriminate between surface textures and shows that a good initial estimate can be obtained via touch data, which can be further refined via sliding; combining both touch and sliding data results in 98% classification accuracy over unseen test data.
Training Deep Spiking Neural Networks Using Backpropagation
TLDR
A novel technique is introduced that treats the membrane potentials of spiking neurons as differentiable signals, with the discontinuities at spike times treated as noise. This enables an error-backpropagation mechanism for deep SNNs that follows the same principles as conventional deep networks but operates directly on spike signals and membrane potentials.
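The core trick in this family of methods is a surrogate gradient: the forward pass uses the true (non-differentiable) spike function, while the backward pass substitutes a smooth bump around the threshold. A sketch using a triangular surrogate, which is one common choice rather than this paper's exact formulation:

```python
def spike_forward(v, threshold=1.0):
    """Forward pass: the true spike function, a non-differentiable step.

    Fires (returns 1.0) iff the membrane potential reaches the threshold.
    """
    return 1.0 if v >= threshold else 0.0

def spike_surrogate_grad(v, threshold=1.0, width=0.5):
    """Backward pass: a triangular surrogate derivative.

    A smooth bump centred on the threshold stands in for the step's
    undefined gradient; `width` controls how far from the threshold
    gradient information still flows.
    """
    return max(0.0, 1.0 - abs(v - threshold) / width) / width
```

During training, `spike_forward` is used to propagate activity, while `spike_surrogate_grad` replaces its derivative in the chain rule, letting error signals pass through spiking layers.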
SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks
TLDR
SuperSpike is derived, a nonlinear voltage-based three-factor learning rule capable of training multilayer networks of deterministic integrate-and-fire neurons to perform nonlinear computations on spatiotemporal spike patterns.
Deep Learning With Spiking Neurons: Opportunities and Challenges
TLDR
This review addresses the opportunities that deep spiking networks offer and investigates in detail the challenges associated with training SNNs in a way that makes them competitive with conventional deep learning, but simultaneously allows for efficient mapping to hardware.
Unsupervised learning of digit recognition using spike-timing-dependent plasticity
TLDR
An SNN for digit recognition is presented which is based on mechanisms with increased biological plausibility: conductance-based instead of current-based synapses, spike-timing-dependent plasticity with time-dependent weight change, lateral inhibition, and an adaptive spiking threshold.
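Spike-timing-dependent plasticity, the learning rule named above, adjusts a synapse based on the relative timing of pre- and postsynaptic spikes. A sketch of the classic pair-based form with illustrative constants (the cited work uses a more elaborate, conductance-based variant):

```python
import math

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight change for one spike pair.

    If the presynaptic spike precedes the postsynaptic one (dt > 0),
    the synapse is potentiated; otherwise it is depressed. The effect
    decays exponentially with the timing difference (time constant tau, ms).
    Constants are illustrative, not taken from the cited paper.
    """
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * math.exp(-dt / tau)    # pre before post: strengthen
    return -a_minus * math.exp(dt / tau)       # post before pre: weaken
```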
SLAYER: Spike Layer Error Reassignment in Time
TLDR
A new general backpropagation mechanism for learning synaptic weights and axonal delays is introduced; it overcomes the non-differentiability of the spike function and uses a temporal credit-assignment policy to backpropagate error to preceding layers.
A neuro-inspired artificial peripheral nervous system for scalable electronic skins
TLDR
The Asynchronously Coded Electronic Skin (ACES) is introduced—a neuromimetic architecture that enables simultaneous transmission of thermotactile information while maintaining exceptionally low readout latencies, even with array sizes beyond 10,000 sensors.