Corpus ID: 1731857

Embed to Control: A Locally Linear Latent Dynamics Model for Control from Raw Images

@inproceedings{Watter2015EmbedTC,
  title={Embed to Control: A Locally Linear Latent Dynamics Model for Control from Raw Images},
  author={Manuel Watter and Jost Tobias Springenberg and Joschka Boedecker and Martin A. Riedmiller},
  booktitle={NIPS},
  year={2015}
}
We introduce Embed to Control (E2C), a method for model learning and control of non-linear dynamical systems from raw pixel images. E2C consists of a deep generative model, belonging to the family of variational autoencoders, that learns to generate image trajectories from a latent space in which the dynamics is constrained to be locally linear. Our model is derived directly from an optimal control formulation in latent space, supports long-term prediction of image sequences and exhibits strong…
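The locally linear latent dynamics described in the abstract amount to a transition of the form z_{t+1} = A_t z_t + B_t u_t + o_t, where A_t, B_t and o_t are predicted by a neural network conditioned on the current latent state. A minimal NumPy sketch of this transition, with the matrices supplied explicitly and illustrative toy values (dimensions and numbers are assumptions, not from the paper):

```python
import numpy as np

def locally_linear_step(z_t, u_t, A_t, B_t, o_t):
    """One locally linear latent transition: z_{t+1} = A_t @ z_t + B_t @ u_t + o_t.

    In E2C these quantities are the output of a learned network;
    here they are passed in directly to keep the sketch self-contained.
    """
    return A_t @ z_t + B_t @ u_t + o_t

# Toy example: 2-D latent state, 1-D control (illustrative values only).
z = np.array([1.0, 0.0])        # current latent state z_t
u = np.array([0.5])             # control input u_t
A = np.eye(2)                   # local dynamics matrix A_t
B = np.array([[0.0], [1.0]])    # control matrix B_t
o = np.zeros(2)                 # offset o_t
z_next = locally_linear_step(z, u, A, B, o)
print(z_next)  # → [1.  0.5]
```

Because each transition is linear in z_t and u_t, standard linear-quadratic control machinery can be applied step by step in the latent space, which is what the paper's optimal control formulation exploits.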
Citations

Dynamic Variational Autoencoders for Visual Process Modeling (A. Sagel, Hao Shen; ICASSP 2020)
Robust Locally-Linear Controllable Embedding
Prediction, Consistency, Curvature: Representation Learning for Locally-Linear Control
Predictive Coding for Locally-Linear Control
No Representation without Transformation
Video Extrapolation with an Invertible Linear Embedding
Extracting Latent State Representations with Linear Dynamics from Rich Observations

References

Showing 1–10 of 54 references
From Pixels to Torques: Policy Learning with Deep Dynamical Models
NICE: Non-linear Independent Components Estimation
Deep AutoRegressive Networks
DRAW: A Recurrent Neural Network For Image Generation
Deep auto-encoder neural networks in reinforcement learning
Auto-Encoding Variational Bayes
Learning of Non-Parametric Control Policies with High-Dimensional State Features
Learning Stochastic Recurrent Networks