# Variational Autoencoders for Learning Nonlinear Dynamics of Physical Systems

@article{Lopez2020VariationalAF, title={Variational Autoencoders for Learning Nonlinear Dynamics of Physical Systems}, author={Ryan Lopez and Paul J. Atzberger}, journal={ArXiv}, year={2020}, volume={abs/2012.03448} }

We develop data-driven methods for incorporating physical information as priors to learn parsimonious representations of nonlinear systems arising from parameterized PDEs and mechanics. Our approach is based on Variational Autoencoders (VAEs) for learning nonlinear state-space models from observations. We develop ways to incorporate geometric and topological priors through general manifold latent-space representations. We investigate the performance of our methods for learning low dimensional…
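As a rough illustration of the state-space structure the abstract describes (a hedged sketch, not the authors' implementation), the encode → evolve-in-latent-space → decode pipeline can be outlined with placeholder linear maps standing in for the learned encoder, decoder, and latent dynamics networks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: high-dimensional observations, low-dimensional latent space.
OBS_DIM, LATENT_DIM = 32, 2

# Placeholder "learned" maps; in a real VAE these are neural networks
# trained by maximizing the evidence lower bound (ELBO).
W_enc = rng.normal(size=(LATENT_DIM, OBS_DIM)) * 0.1   # encoder mean map
W_dec = rng.normal(size=(OBS_DIM, LATENT_DIM))         # decoder map
A_dyn = np.array([[0.99, -0.05],
                  [0.05,  0.99]])                      # latent dynamics (toy rotation/decay)

def encode(x):
    """Map an observation to a latent mean and (fixed, toy) log-variance."""
    mu = W_enc @ x
    log_var = np.full(LATENT_DIM, -2.0)
    return mu, log_var

def reparameterize(mu, log_var):
    """Sample z = mu + sigma * eps (the VAE reparameterization trick)."""
    eps = rng.normal(size=mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def rollout(x0, steps):
    """Encode an initial observation, evolve in latent space, decode each step."""
    mu, log_var = encode(x0)
    z = reparameterize(mu, log_var)
    traj = []
    for _ in range(steps):
        z = A_dyn @ z              # evolve the latent state
        traj.append(W_dec @ z)     # decode back to observation space
    return np.stack(traj)

x0 = rng.normal(size=OBS_DIM)
traj = rollout(x0, steps=10)
print(traj.shape)  # → (10, 32)
```

The manifold latent-space priors mentioned in the abstract would further constrain `z` to lie on a chosen geometry (e.g. a circle or torus); this sketch uses an unconstrained Euclidean latent space for brevity.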

## 4 Citations

### $\Phi$-DVAE: Learning Physically Interpretable Representations with Nonlinear Filtering

- Computer Science
- 2022

Incorporating unstructured data into physical models is a challenging problem that is emerging in data assimilation. Traditional approaches focus on well-defined observation operators whose functional…

### GD-VAEs: Geometric Dynamic Variational Autoencoders for Learning Nonlinear Dynamics and Dimension Reductions

- Computer Science
- 2022

The performance of the methods referred to as GD-VAEs is investigated on tasks for learning low dimensional representations of the nonlinear Burgers equations, constrained mechanical systems, and spatial fields of reaction-diffusion systems.

### Disentangling Generative Factors of Physical Fields Using Variational Autoencoders

- Computer Science
- Frontiers in Physics
- 2022

This work explores the use of variational autoencoders for non-linear dimension reduction with the specific aim of disentangling the low-dimensional latent variables to identify independent physical parameters that generated the data.

### MLMOD Package: Machine Learning Methods for Data-Driven Modeling in LAMMPS

- Computer Science
- ArXiv
- 2021

A prototype C++ package is discussed for incorporating into simulations models obtained from machine learning methods, using general model classes including Neural Networks, Gaussian Process Regression, Kernel Models, and other approaches.

## References

Showing 1–10 of 83 references

### Model reduction of dynamical systems on nonlinear manifolds using deep convolutional autoencoders

- Computer Science
- J. Comput. Phys.
- 2020

### Deep learning for universal linear embeddings of nonlinear dynamics

- Computer Science
- Nature Communications
- 2018

It is often advantageous to transform a strongly nonlinear system into a linear one in order to simplify its analysis for prediction and control, so the authors combine dynamical systems with deep learning to identify these hard-to-find transformations.

### Variational encoding of complex dynamics.

- Computer Science
- Physical Review E
- 2018

This work introduces a time-lagged VAE, or variational dynamics encoder (VDE), to reduce complex, nonlinear processes to a single embedding with high fidelity to the underlying dynamics, and shows how the VDE captures nontrivial dynamics in a variety of examples.

### Physics-informed deep generative models

- Computer Science
- ArXiv
- 2018

An implicit variational inference formulation that constrains the generative model output to satisfy given physical laws expressed by partial differential equations provides a regularization mechanism for effectively training deep probabilistic models of physical systems in which the cost of data acquisition is high and training data sets are typically small.

### Structured Inference Networks for Nonlinear State Space Models

- Computer Science
- AAAI
- 2017

A unified algorithm is introduced to efficiently learn a broad class of linear and non-linear state space models, including variants where the emission and transition distributions are modeled by deep neural networks.

### Deep learning of dynamics and signal-noise decomposition with time-stepping constraints

- Computer Science
- J. Comput. Phys.
- 2019

### Physics-informed Autoencoders for Lyapunov-stable Fluid Flow Prediction

- Computer Science
- ArXiv
- 2019

This work investigates whether it is possible to include physics-informed prior knowledge to improve model quality, focusing on the stability of an equilibrium, one of the most basic properties a dynamic system can have, through the lens of Lyapunov analysis.

### Efficient Amortised Bayesian Inference for Hierarchical and Nonlinear Dynamical Systems

- Computer Science
- ICML
- 2019

This work presents a flexible, scalable Bayesian inference framework for nonlinear dynamical systems characterised by distinct and hierarchical variability at the individual, group, and population levels, and empirically validates the method by predicting the dynamic behaviour of bacteria that were genetically engineered to function as biosensors.

### Data-driven recovery of hidden physics in reduced order modeling of fluid flows

- Computer Science
- Physics of Fluids
- 2020

A modular hybrid analysis and modeling approach accounts for hidden physics in reduced order modeling of parameterized systems relevant to fluid dynamics, addressing a fundamental limitation of physics-based models when the governing equations are too incomplete to represent the underlying physical processes.

### Variational Autoencoders with Riemannian Brownian Motion Priors

- Computer Science
- ICML
- 2020

This work assumes a Riemannian structure over the latent space, which constitutes a more principled geometric view of the latent codes, and replaces the standard Gaussian prior with a Riemannian Brownian motion prior, demonstrating that this prior significantly increases model capacity using only one additional scalar parameter.