• Corpus ID: 239015913

Physics-guided Deep Markov Models for Learning Nonlinear Dynamical Systems with Uncertainty

  • Wei Liu, Zhi-Lu Lai, Kiran Bacsa, Eleni N. Chatzi
  • Published 16 October 2021
  • Computer Science, Physics, Mathematics
  • ArXiv
In this paper, we propose a probabilistic physics-guided framework, termed the Physics-guided Deep Markov Model (PgDMM). The framework targets the inference of the characteristics and latent structure of nonlinear dynamical systems from measurement data, a setting in which exact inference of latent variables is typically intractable. One recently emerged option is to leverage variational inference to perform approximate inference. In such a scheme, transition and emission… 
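The generative structure the abstract describes, a Markovian latent state with learned transition and emission distributions, trained via a variational (ELBO) objective, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the MLP parameterization, the fixed Gaussian noise scales, and all dimensions and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, W2, b2):
    # One-hidden-layer network with tanh nonlinearity, used as the
    # mean function of a Gaussian transition or emission distribution.
    return np.tanh(x @ W1 + b1) @ W2 + b2

# Illustrative dimensions: 2-D latent state, 3-D observation, 8 hidden units.
dz, dx, dh = 2, 3, 8
params = {
    "trans": (rng.normal(size=(dz, dh)) * 0.1, np.zeros(dh),
              rng.normal(size=(dh, dz)) * 0.1, np.zeros(dz)),
    "emit":  (rng.normal(size=(dz, dh)) * 0.1, np.zeros(dh),
              rng.normal(size=(dh, dx)) * 0.1, np.zeros(dx)),
}
NOISE = 0.05  # shared transition/emission noise scale (assumed, for brevity)

def sample_trajectory(T):
    """Roll the Markovian transition forward: z_t ~ p(z_t | z_{t-1}), x_t ~ p(x_t | z_t)."""
    z = np.zeros(dz)
    zs, xs = [], []
    for _ in range(T):
        z = mlp(z, *params["trans"]) + NOISE * rng.normal(size=dz)
        x = mlp(z, *params["emit"]) + NOISE * rng.normal(size=dx)
        zs.append(z)
        xs.append(x)
    return np.array(zs), np.array(xs)

def gaussian_logpdf(x, mean, var):
    return -0.5 * np.sum((x - mean) ** 2 / var + np.log(2 * np.pi * var))

def elbo_single_sample(xs, q_means, q_vars):
    """One-sample Monte Carlo estimate of the evidence lower bound,
    given a diagonal-Gaussian approximate posterior q(z_t | x)."""
    z_samples = q_means + np.sqrt(q_vars) * rng.normal(size=q_means.shape)
    logp = 0.0
    z_prev = np.zeros(dz)
    for t in range(len(xs)):
        trans_mean = mlp(z_prev, *params["trans"])
        logp += gaussian_logpdf(z_samples[t], trans_mean, NOISE**2)               # log p(z_t | z_{t-1})
        logp += gaussian_logpdf(xs[t], mlp(z_samples[t], *params["emit"]), NOISE**2)  # log p(x_t | z_t)
        logp -= gaussian_logpdf(z_samples[t], q_means[t], q_vars[t])              # - log q(z_t | x)
        z_prev = z_samples[t]
    return logp
```

In a physics-guided variant, the learned transition mean would be combined with (or constrained by) a known physics-based model of the dynamics; here only the purely learned deep Markov structure is shown.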


Structured Inference Networks for Nonlinear State Space Models
A unified algorithm is introduced to efficiently learn a broad class of linear and non-linear state space models, including variants where the emission and transition distributions are modeled by deep neural networks.
Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
We introduce physics-informed neural networks: neural networks that are trained to solve supervised learning tasks while respecting any given laws of physics described by general nonlinear partial differential equations.
Deep Variational Bayes Filters: Unsupervised Learning of State Space Models from Raw Data
Deep Variational Bayes Filters is introduced, a new method for unsupervised learning and identification of latent Markovian state space models that can overcome intractable inference distributions via variational inference and enables realistic long-term prediction.
Stochastic Backpropagation and Approximate Inference in Deep Generative Models
We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning.
Auto-Encoding Variational Bayes
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
Deep Kalman Filters
A unified algorithm is introduced to efficiently learn a broad spectrum of Kalman filters; the efficacy of temporal generative models for counterfactual inference is investigated, and the "Healing MNIST" dataset is introduced, in which long-term structure, noise, and actions are applied to sequences of digits.
A Disentangled Recognition and Nonlinear Dynamics Model for Unsupervised Learning
The Kalman variational auto-encoder is introduced, a framework for unsupervised learning of sequential data that disentangles two latent representations: an object's representation, coming from a recognition model, and a latent state describing its dynamics.
Structural identification with physics-informed neural ordinary differential equations
This paper exploits a new direction of structural identification by means of Neural Ordinary Differential Equations (Neural ODEs), particularly constrained by domain knowledge, such as… 
Learning Stochastic Recurrent Networks
The proposed model is a generalisation of deterministic recurrent neural networks with latent variables, resulting in Stochastic Recurrent Networks (STORNs), and is evaluated on four polyphonic musical data sets and motion capture data.
Probabilistic Graphical Models - Principles and Techniques
The framework of probabilistic graphical models, presented in this book, provides a general approach for causal reasoning and decision making under uncertainty, allowing interpretable models to be constructed and then manipulated by reasoning algorithms.