Corpus ID: 227334317

Variational Autoencoders for Learning Nonlinear Dynamics of Physical Systems

Ryan Lopez and Paul J. Atzberger
We develop data-driven methods for incorporating physical information into priors to learn parsimonious representations of nonlinear systems arising from parameterized PDEs and mechanics. Our approach is based on Variational Autoencoders (VAEs) for learning nonlinear state space models from observations. We develop ways to incorporate geometric and topological priors through general manifold latent space representations. We investigate the performance of our methods for learning low dimensional…
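The core idea in the abstract, encoding observations into a low-dimensional latent state, advancing that state with a learned dynamics map, and decoding it back, can be sketched in a few lines. The sketch below is a minimal NumPy illustration under assumed linear maps and dimensions; it is not the paper's architecture, and all names (`W_enc`, `W_dyn`, `W_dec`) are illustrative.

```python
# Minimal sketch of a VAE-style state space model: observations x_t are
# encoded to a latent state z_t, a learned map advances the latent state,
# and a decoder reconstructs the next observation.  Linear maps stand in
# for the neural networks a real implementation would train.
import numpy as np

rng = np.random.default_rng(0)
d_obs, d_lat = 8, 2  # observation and latent dimensions (assumed)

# Randomly initialized stand-ins for learned parameters.
W_enc = rng.normal(scale=0.1, size=(2 * d_lat, d_obs))  # -> (mu, log-variance)
W_dyn = rng.normal(scale=0.1, size=(d_lat, d_lat))      # latent dynamics map
W_dec = rng.normal(scale=0.1, size=(d_obs, d_lat))      # decoder

def encode(x):
    h = W_enc @ x
    return h[:d_lat], h[d_lat:]            # mean, log-variance

def elbo_terms(x_t, x_next):
    mu, logvar = encode(x_t)
    eps = rng.normal(size=d_lat)
    z_t = mu + np.exp(0.5 * logvar) * eps  # reparameterization trick
    z_next = W_dyn @ z_t                   # advance the latent state
    x_hat = W_dec @ z_next                 # reconstruct the next observation
    recon = np.sum((x_next - x_hat) ** 2)  # Gaussian reconstruction term
    kl = 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)  # KL to N(0, I)
    return recon, kl

x_t, x_next = rng.normal(size=d_obs), rng.normal(size=d_obs)
recon, kl = elbo_terms(x_t, x_next)
loss = recon + kl  # negative ELBO (up to constants), minimized in training
```

A manifold latent space, as in the paper, would replace the Euclidean `z_t` with a point constrained to a chosen geometry (e.g. a sphere or torus), but the encode/advance/decode loop is the same.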


$\Phi$-DVAE: Learning Physically Interpretable Representations with Nonlinear Filtering

Incorporating unstructured data into physical models is a challenging problem that is emerging in data assimilation. Traditional approaches focus on well-defined observation operators whose functional…

GD-VAEs: Geometric Dynamic Variational Autoencoders for Learning Nonlinear Dynamics and Dimension Reductions

The performance of the methods, referred to as GD-VAEs, is investigated on tasks for learning low dimensional representations of the nonlinear Burgers equation, constrained mechanical systems, and spatial fields of reaction-diffusion systems.

Disentangling Generative Factors of Physical Fields Using Variational Autoencoders

This work explores the use of variational autoencoders for non-linear dimension reduction with the specific aim of disentangling the low-dimensional latent variables to identify independent physical parameters that generated the data.

MLMOD Package: Machine Learning Methods for Data-Driven Modeling in LAMMPS

A prototype C++ package is discussed for incorporating into LAMMPS simulations models obtained from machine learning methods, using general model classes including Neural Networks, Gaussian Process Regression, Kernel Models, and other approaches.

Model reduction of dynamical systems on nonlinear manifolds using deep convolutional autoencoders

Deep learning for universal linear embeddings of nonlinear dynamics

It is often advantageous to transform a strongly nonlinear system into a linear one in order to simplify its analysis for prediction and control, so the authors combine dynamical systems with deep learning to identify these hard-to-find transformations.

Variational encoding of complex dynamics

A time-lagged VAE, or variational dynamics encoder (VDE), is used to reduce complex, nonlinear processes to a single embedding with high fidelity to the underlying dynamics; the VDE is shown to capture nontrivial dynamics in a variety of examples.

Physics-informed deep generative models

An implicit variational inference formulation that constrains the generative model output to satisfy given physical laws expressed by partial differential equations provides a regularization mechanism for effectively training deep probabilistic models of physical systems in which the cost of data acquisition is high and training datasets are typically small.
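The PDE constraint described above is often realized as a residual penalty added to the training loss. The following is a hedged NumPy sketch of that idea, not the paper's formulation: it uses the 1-D heat equation u_t = α·u_xx as an assumed physical law, discretized with finite differences, and evaluates the residual on hand-built fields rather than on output of a trained model.

```python
# Sketch of PDE-residual regularization: penalize how much a pair of
# generated fields (u at time t and t+dt) violates the heat equation
# u_t = alpha * u_xx, discretized with finite differences.
import numpy as np

alpha, dx, dt = 0.1, 0.1, 0.01
x = np.arange(0.0, 2.0 * np.pi, dx)

def pde_residual(u_t0, u_t1):
    """Mean-squared residual of u_t - alpha * u_xx on interior points."""
    u_time = (u_t1 - u_t0) / dt                              # forward diff in t
    u_xx = (u_t0[2:] - 2 * u_t0[1:-1] + u_t0[:-2]) / dx**2   # centered in x
    return np.mean((u_time[1:-1] - alpha * u_xx) ** 2)

# A field pair that obeys the heat equation: u(x, t) = exp(-alpha t) sin(x).
u0 = np.sin(x)
u1 = np.exp(-alpha * dt) * np.sin(x)
consistent = pde_residual(u0, u1)   # small (discretization error only)

# A field pair that violates it (wrong time evolution).
u1_bad = np.sin(x) + 0.5 * np.cos(3 * x)
violating = pde_residual(u0, u1_bad)  # large

# In training, the residual would join the usual VAE objective, e.g.:
#   loss = recon + kl + lam * pde_residual(decoded_t0, decoded_t1)
```

The penalty weight (`lam` in the comment) trades off data fit against physical consistency; in a data-scarce regime, as the summary notes, the residual term carries much of the regularization burden.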

Structured Inference Networks for Nonlinear State Space Models

A unified algorithm is introduced to efficiently learn a broad class of linear and non-linear state space models, including variants where the emission and transition distributions are modeled by deep neural networks.

Physics-informed Autoencoders for Lyapunov-stable Fluid Flow Prediction

This work investigates whether it is possible to include physics-informed prior knowledge to improve model quality, focusing on the stability of an equilibrium, one of the most basic properties a dynamical system can have, through the lens of Lyapunov analysis.

Efficient Amortised Bayesian Inference for Hierarchical and Nonlinear Dynamical Systems

A flexible, scalable Bayesian inference framework is presented for nonlinear dynamical systems characterised by distinct and hierarchical variability at the individual, group, and population levels; the method is empirically validated by predicting the dynamic behaviour of bacteria genetically engineered to function as biosensors.

Data-driven recovery of hidden physics in reduced order modeling of fluid flows

A modular hybrid analysis and modeling approach is presented to account for hidden physics in reduced order modeling of parameterized systems relevant to fluid dynamics, providing insights that address a fundamental limitation of physics-based models when the governing equations are incomplete representations of the underlying physical processes.

Variational Autoencoders with Riemannian Brownian Motion Priors

This work assumes a Riemannian structure over the latent space, which constitutes a more principled geometric view of the latent codes, and replaces the standard Gaussian prior with a Riemannian Brownian motion prior; this prior is demonstrated to significantly increase model capacity using only one additional scalar parameter.