Corpus ID: 198179864

Multisensory Learning Framework for Robot Drumming

@article{Barsky2019MultisensoryLF,
  title={Multisensory Learning Framework for Robot Drumming},
  author={A. Barsky and C. Zito and H. Mori and T. Ogata and J. L. Wyatt},
  journal={ArXiv},
  year={2019},
  volume={abs/1907.09775}
}
Interest in sensorimotor learning is currently at fever pitch, thanks to the latest advances in deep learning. In this paper, we present an open-source framework for collecting large-scale, time-synchronised synthetic data from highly disparate sensory modalities, such as audio, video, and proprioception, for learning robot manipulation tasks. We demonstrate the learning of non-linear sensorimotor mappings for a humanoid drumming robot that generates novel motion sequences from…
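
The data-collection idea sketched in the abstract hinges on tagging every sample from every modality with a shared clock so that audio, video, and proprioception streams can be aligned offline. The following minimal Python sketch illustrates that idea under stated assumptions; the class name, sensor stubs, sample contents, and nearest-neighbour alignment strategy are illustrative and are not the authors' actual framework API.

import time
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class MultimodalRecorder:
    """Buffers samples from disparate modalities, each tagged with a common clock."""
    buffers: Dict[str, List[Tuple[float, object]]] = field(default_factory=dict)

    def record(self, modality: str, sample: object) -> None:
        # Tag every sample with a shared monotonic timestamp so the streams
        # can be time-synchronised after collection.
        self.buffers.setdefault(modality, []).append((time.monotonic(), sample))

    def aligned(self, reference: str) -> List[Dict[str, object]]:
        # For each sample of the reference modality, pick the nearest-in-time
        # sample from every other modality (simple nearest-neighbour alignment).
        frames = []
        for t_ref, ref_sample in self.buffers.get(reference, []):
            frame = {"t": t_ref, reference: ref_sample}
            for modality, samples in self.buffers.items():
                if modality == reference or not samples:
                    continue
                frame[modality] = min(samples, key=lambda s: abs(s[0] - t_ref))[1]
            frames.append(frame)
        return frames

# Usage: interleave stubbed sensor reads, then retrieve video-aligned frames.
if __name__ == "__main__":
    rec = MultimodalRecorder()
    for step in range(5):
        rec.record("proprioception", {"joint_angles": [0.1 * step] * 7})
        rec.record("audio", {"rms": 0.01 * step})
        if step % 2 == 0:
            rec.record("video", {"frame_id": step})
        time.sleep(0.01)
    for frame in rec.aligned("video"):
        print(frame)

Aligning on the lowest-rate stream (here, video) keeps one complete multimodal frame per reference sample; a real pipeline would typically interpolate or resample rather than take the nearest neighbour.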