Corpus ID: 240070284

Disentangled generative models for robust dynamical system prediction

Stathi Fotiadis, Shunlong Hu, Mario Lino, Chris D. Cantwell, Anil Anthony Bharath
Deep neural networks have attracted increasing interest for dynamical system prediction, but out-of-distribution generalization and long-term stability remain challenging. In this work, we treat the domain parameters of dynamical systems as factors of variation of the data-generating process. By leveraging ideas from supervised disentanglement and causal factorization, we aim to separate the domain parameters from the dynamics in the latent space of generative models. In our experiments…

Learning Latent Dynamics for Planning from Pixels

The Deep Planning Network (PlaNet) is proposed, a purely model-based agent that learns the environment dynamics from images and chooses actions through fast online planning in latent space using a latent dynamics model with both deterministic and stochastic transition components.

Joint Parameter Discovery and Generative Modeling of Dynamic Systems

A neural framework is proposed for estimating physical parameters in a manner consistent with the underlying physics; it uses a deep latent variable model to disentangle the system's physical parameters from canonical coordinate observations and returns a Hamiltonian parameterization that generalizes well with respect to the discovered physical parameters.

beta-VAE: Learning Basic Visual Concepts with a Constrained Variational Framework

Learning an interpretable factorised representation of the independent data generative factors of the world without supervision is an important precursor for the development of artificial intelligence that is able to learn and reason in the same way that humans do.
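The "constrained variational framework" in the title refers to the standard evidence lower bound with a reweighted KL term; a minimal sketch of the well-known β-VAE objective (not taken from this summary) is:

```latex
\mathcal{L}(\theta,\phi;\mathbf{x}) =
  \mathbb{E}_{q_\phi(\mathbf{z}\mid\mathbf{x})}\!\left[\log p_\theta(\mathbf{x}\mid\mathbf{z})\right]
  - \beta\, D_{\mathrm{KL}}\!\left(q_\phi(\mathbf{z}\mid\mathbf{x}) \,\|\, p(\mathbf{z})\right),
  \qquad \beta > 1.
```

Setting β = 1 recovers the standard VAE; β > 1 strengthens the pressure toward a factorised (disentangled) latent code at some cost in reconstruction quality.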

Comparing recurrent and convolutional neural networks for predicting wave propagation

This work improves on the long-term prediction over previous methods while keeping the inference time at a fraction of numerical simulations, and shows that convolutional networks perform at least as well as recurrent networks in this task.

Clockwork Variational Autoencoders

This work introduces the Clockwork VAE (CW-VAE), a video prediction model that leverages a hierarchy of latent sequences, where higher levels tick at slower intervals, and confirms that slower levels learn to represent objects that change more slowly in the video, while faster levels learn to represent faster-changing objects.

Machine Learning for Fluid Mechanics

Fundamental ML methodologies are outlined and their uses for understanding, modeling, optimizing, and controlling fluid flows are discussed and the strengths and limitations of these methods are addressed.

Challenging Common Assumptions in the Unsupervised Learning of Disentangled Representations

This paper theoretically shows that the unsupervised learning of disentangled representations is fundamentally impossible without inductive biases on both the models and the data, and trains more than 12000 models covering most prominent methods and evaluation metrics on seven different data sets.

Auto-Encoding Variational Bayes

A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
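The core device that makes this algorithm trainable by backpropagation is the reparameterization of the latent sample, paired with a closed-form KL term for a Gaussian posterior. A minimal per-dimension sketch (function names are mine, not from the paper):

```python
import math
import random

def reparameterize(mu, log_var, rng=random):
    # z = mu + sigma * eps with eps ~ N(0, 1): the noise is sampled outside
    # the parameters, so gradients can flow through mu and log_var.
    sigma = math.exp(0.5 * log_var)
    eps = rng.gauss(0.0, 1.0)
    return mu + sigma * eps

def kl_to_standard_normal(mu, log_var):
    # Closed-form KL(q(z|x) || N(0, 1)) for one dimension of a diagonal
    # Gaussian posterior; summed over dimensions in practice.
    return 0.5 * (math.exp(log_var) + mu * mu - 1.0 - log_var)
```

In a full VAE these two pieces combine into the negative ELBO: reconstruction loss on the decoded sample plus the KL penalty.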

WeatherBench Probability: Medium-range weather forecasts with probabilistic machine learning methods

This work presents an extension to the WeatherBench, a benchmark dataset for medium-range, data-driven weather prediction, that adds a set of commonly used probabilistic verification metrics: the spread-skill ratio, the continuous ranked probability score (CRPS) and rank histograms.
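Of the metrics listed, the CRPS has a simple empirical estimator for an ensemble forecast, CRPS = E|X − y| − ½·E|X − X′|; a minimal sketch for a scalar observation (my own illustrative code, not from the benchmark):

```python
def crps_ensemble(members, obs):
    # Empirical CRPS for an ensemble forecast of a scalar observation:
    # mean absolute error of members against the observation, minus half
    # the mean absolute difference between all member pairs.
    m = len(members)
    skill = sum(abs(x - obs) for x in members) / m
    spread = sum(abs(x - y) for x in members for y in members) / (2 * m * m)
    return skill - spread
```

For a single deterministic forecast the CRPS reduces to the absolute error, which is why it is often described as a probabilistic generalization of MAE.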

Skilful precipitation nowcasting using deep generative models of radar

High-resolution forecasts of rainfall and hydrometeors zero to two hours in the future, known as precipitation nowcasting, are crucial for weather-dependent decision-making and must provide accurate predictions across multiple spatial and temporal scales.