Corpus ID: 243985811

SyMetric: Measuring the Quality of Learnt Hamiltonian Dynamics Inferred from Vision

@inproceedings{Higgins2021SyMetricMT,
  title={SyMetric: Measuring the Quality of Learnt Hamiltonian Dynamics Inferred from Vision},
  author={Irina Higgins and Peter Wirnsberger and Andrew Jaegle and Aleksandar Botev},
  booktitle={NeurIPS},
  year={2021}
}
A recently proposed class of models attempts to learn latent dynamics from high-dimensional observations, like images, using priors informed by Hamiltonian mechanics. While these models have important potential applications in areas like robotics or autonomous driving, there is currently no good way to evaluate their performance: existing methods primarily rely on image reconstruction quality, which does not always reflect the quality of the learnt latent dynamics. In this work, we empirically… 
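The Hamiltonian prior the abstract refers to can be made concrete with a minimal sketch (illustrative only, not the paper's model): Hamilton's equations for a unit-mass harmonic oscillator, H(q, p) = p²/2 + q²/2, integrated with a symplectic leapfrog step — the kind of structure-preserving dynamics these models impose on their latent space.

```python
import numpy as np

# Gradients of H(q, p) = p^2/2 + q^2/2 (unit-mass harmonic oscillator).
def dH_dq(q):
    return q          # derivative of the potential energy q^2/2

def dH_dp(p):
    return p          # derivative of the kinetic energy p^2/2

def leapfrog(q, p, dt, steps):
    """Advance (q, p) with the symplectic leapfrog (kick-drift-kick) scheme."""
    for _ in range(steps):
        p = p - 0.5 * dt * dH_dq(q)   # half kick
        q = q + dt * dH_dp(p)         # drift
        p = p - 0.5 * dt * dH_dq(q)   # half kick
    return q, p

q, p = leapfrog(1.0, 0.0, dt=0.01, steps=1000)
energy = 0.5 * p**2 + 0.5 * q**2
print(energy)  # remains close to the initial energy 0.5
```

Because the integrator is symplectic, the energy drifts only at O(dt²) over long rollouts — exactly the conservation property that image-reconstruction metrics fail to check.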

Citations

Continuous MDP Homomorphisms and Homomorphic Policy Gradient

TLDR
It is rigorously proved that performing HPG on the abstract MDP is equivalent to performing the deterministic policy gradient (DPG) on the actual MDP, and that continuous MDP homomorphisms preserve value functions, which in turn enables their use for policy evaluation.

Symmetry-Based Representations for Artificial and Biological General Intelligence

TLDR
It is argued that symmetry transformations are a fundamental principle that can guide the search for what makes a good representation, and may be an important general framework that determines the structure of the universe, constrains the nature of natural tasks and consequently shapes both biological and artificial intelligence.

References

SHOWING 1-10 OF 56 REFERENCES

Which priors matter? Benchmarking models for learning latent dynamics

TLDR
This work introduces a suite of 17 datasets with visual observations based on physical systems exhibiting a wide range of dynamics, and finds that the use of continuous and time-reversible dynamics benefits models of all classes.

Unsupervised Learning of Lagrangian Dynamics from Images for Prediction and Control

TLDR
A new unsupervised neural network model is introduced that learns Lagrangian dynamics from images, with interpretability that benefits prediction and control and enables long-term prediction of dynamics in the image space and synthesis of energy-based controllers.

Symplectic ODE-Net: Learning Hamiltonian Dynamics with Control

TLDR
This paper introduces Symplectic ODE-Net, a deep learning framework that infers the dynamics of a physical system, governed by an ordinary differential equation (ODE), from observed state trajectories, and proposes a parametrization that enforces the Hamiltonian formalism even when the generalized coordinate data is embedded in a high-dimensional space or only velocity data, rather than generalized momenta, is available.

Deep Lagrangian Networks: Using Physics as Model Prior for Deep Learning

TLDR
The proposed DeLaN network efficiently learns the equations of motion of a mechanical system with a deep network while ensuring physical plausibility, exhibits substantially improved and more robust extrapolation to novel trajectories, and learns online in real time.

Variational Integrator Graph Networks for Learning Energy-Conserving Dynamical Systems

TLDR
It is demonstrated, in an extensive ablation study, that the proposed unifying framework outperforms existing methods in data-efficient learning and predictive accuracy, across both single- and many-body problems studied in the recent literature.

Hamiltonian Generative Networks

TLDR
This work introduces the Hamiltonian Generative Network (HGN), the first approach capable of consistently learning Hamiltonian dynamics from high-dimensional observations (such as images) without restrictive domain assumptions, and demonstrates how a simple modification of the network architecture turns HGN into a powerful normalising flow model, called Neural Hamiltonian Flow (NHF), that uses Hamiltonian dynamics to model expressive densities.

Nonseparable Symplectic Neural Networks

TLDR
A novel neural network architecture, Nonseparable Symplectic Neural Networks (NSSNNs), is proposed to uncover and embed the symplectic structure of a nonseparable Hamiltonian system from limited observation data; by rigorously enforcing symplectomorphism, the approach yields long-term, accurate, and robust predictions for large-scale Hamiltonian systems.

beta-VAE: Learning Basic Visual Concepts with a Constrained Variational Framework

Learning an interpretable factorised representation of the independent data generative factors of the world without supervision is an important precursor for the development of artificial…

Variational Integrator Networks for Physically Structured Embeddings

TLDR
This work proposes variational integrator networks, a class of neural network architectures designed to preserve the geometric structure of physical systems, which can accurately learn dynamical systems both from noisy observations in phase space and from image pixels within which the unknown dynamics are embedded.

Lagrangian Neural Networks

TLDR
Lagrangian Neural Networks (LNNs) are proposed, which parameterize arbitrary Lagrangians using neural networks; they do not require canonical coordinates and thus perform well in situations where canonical momenta are unknown or difficult to compute.
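The Lagrangian route the LNN paper takes can be sketched numerically (an illustration, not the LNN architecture itself): given a scalar Lagrangian, the Euler–Lagrange equation yields the acceleration, here for a unit pendulum L(q, q̇) = q̇²/2 + cos(q), with finite-difference derivatives standing in for the network's automatic differentiation.

```python
import numpy as np

def lagrangian(q, qdot):
    """Unit pendulum: kinetic energy qdot^2/2 minus potential energy -cos(q)."""
    return 0.5 * qdot**2 + np.cos(q)

def acceleration(q, qdot, h=1e-4):
    """Euler-Lagrange: qddot = (d2L/dqdot2)^-1 (dL/dq - qdot * d2L/dq dqdot),
    with derivatives taken by central finite differences."""
    dL_dq = (lagrangian(q + h, qdot) - lagrangian(q - h, qdot)) / (2 * h)
    d2L_dqdot2 = (lagrangian(q, qdot + h) - 2 * lagrangian(q, qdot)
                  + lagrangian(q, qdot - h)) / h**2
    d2L_dq_dqdot = (lagrangian(q + h, qdot + h) - lagrangian(q + h, qdot - h)
                    - lagrangian(q - h, qdot + h)
                    + lagrangian(q - h, qdot - h)) / (4 * h**2)
    return (dL_dq - qdot * d2L_dq_dqdot) / d2L_dqdot2

# For this Lagrangian the closed form is qddot = -sin(q).
print(acceleration(0.3, 1.2))  # approximately -sin(0.3)
```

Note that the derivation never touches canonical momenta, which is the point the TLDR makes: the Lagrangian formulation works directly in generalized coordinates and velocities.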