VisuoSpatial Foresight for Multi-Step, Multi-Task Fabric Manipulation

by Ryan Hoque, Daniel Seita, A. Balakrishna, Aditya Ganapathi, A. Tanwani, N. Jamali, K. Yamane, Soshi Iba, and Ken Goldberg
  • Published 2020
  • Computer Science, Engineering
  • ArXiv
  • Robotic fabric manipulation has applications in cloth and cable management, senior care, surgery and more. Existing fabric manipulation techniques, however, are designed for specific tasks, making it difficult to generalize across different but related tasks. We address this problem by extending the recently proposed Visual Foresight framework to learn fabric dynamics, which can be efficiently reused to accomplish a variety of different fabric manipulation tasks with a single goal-conditioned…
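The Visual Foresight framework that the abstract extends plans by sampling candidate action sequences, rolling them through a learned visual dynamics model, and scoring each rollout by its distance to a goal image. A minimal sketch of that planning loop using the cross-entropy method (CEM) is below; the `predict` function and the low-dimensional state stand in for the paper's learned video prediction model and image observations, and all names and hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def predict(obs, action):
    # Stand-in for a learned visual dynamics model. The real system
    # predicts future images; here we use toy additive dynamics so the
    # example is self-contained and runnable.
    return obs + action

def cem_plan(obs, goal, horizon=3, pop=64, elites=8, iters=10, dim=2, seed=0):
    """Cross-entropy method over action sequences, Visual-Foresight style:
    sample sequences, roll them through the model, score by distance of
    the predicted final observation to the goal, then refit the sampling
    distribution to the best (elite) sequences."""
    rng = np.random.default_rng(seed)
    mu = np.zeros((horizon, dim))       # mean of the action distribution
    sigma = np.ones((horizon, dim))     # std dev of the action distribution
    for _ in range(iters):
        cands = rng.normal(mu, sigma, size=(pop, horizon, dim))
        costs = []
        for seq in cands:
            o = obs
            for a in seq:               # roll the sequence through the model
                o = predict(o, a)
            costs.append(np.linalg.norm(o - goal))
        elite = cands[np.argsort(costs)[:elites]]
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6
    return mu                           # planned action sequence

obs = np.zeros(2)
goal = np.array([1.0, -2.0])
plan = cem_plan(obs, goal)
```

In the actual system the cost would compare predicted images against a goal image (the "goal-conditioned" part of the abstract), and only the first planned action is executed before replanning.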



    Publications referenced by this paper:
    • Deep Imitation Learning of Sequential Fabric Smoothing Policies (10 citations)
    • Learning to Manipulate Deformable Objects without Demonstrations (11 citations)
    • Deep Transfer Learning of Pick Points on Fabric for Robot Bed-Making (16 citations)
    • Cloth Manipulation Using Random-Forest-Based Imitation Learning (5 citations)
    • Dynamic Cloth Manipulation with Deep Reinforcement Learning (5 citations)
    • Domain randomization for transferring deep neural networks from simulation to the real world (806 citations)
    • Few-Shot Goal Inference for Visuomotor Learning and Planning (23 citations)
    • A geometric approach to robotic laundry folding (163 citations)
    • Deep visual foresight for planning robot motion (357 citations; highly influential)
    • Manipulating Highly Deformable Materials Using a Visual Feedback Dictionary (13 citations)