Corpus ID: 236493602

Learning the temporal evolution of multivariate densities via normalizing flows

@article{Lu2021LearningTT,
  title={Learning the temporal evolution of multivariate densities via normalizing flows},
  author={Yubin Lu and Romit Maulik and Ting Gao and Felix Dietrich and Ioannis G. Kevrekidis and Jinqiao Duan},
  journal={ArXiv},
  year={2021},
  volume={abs/2107.13735}
}
  • Yubin Lu, Romit Maulik, Ting Gao, Felix Dietrich, Ioannis G. Kevrekidis, Jinqiao Duan
  • Published 2021
  • Computer Science, Mathematics
  • ArXiv
In this work, we propose a method to learn probability distributions using sample path data from stochastic differential equations. Specifically, we consider temporally evolving probability distributions (e.g., those produced by integrating local or nonlocal Fokker-Planck equations). We analyze this evolution through machine learning assisted construction of a time-dependent mapping that takes a reference distribution (say, a Gaussian) to each and every instance of our evolving distribution. If…
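As a rough illustration of the idea described in the abstract (and not the authors' implementation), the sketch below conditions an affine-coupling normalizing flow on time t and trains it by maximum likelihood on snapshot samples drawn from SDE paths, so that the flow maps a reference Gaussian to the distribution observed at each time. All class names, layer sizes, and the `snapshots` data format are assumptions made for the example.

```python
# Hypothetical sketch of a time-conditioned normalizing flow (PyTorch assumed).
# Not the paper's released code; sizes and data layout are illustrative only.
import math
import torch
import torch.nn as nn


class TimeConditionedCoupling(nn.Module):
    """Affine coupling layer whose scale and shift also depend on time t."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        # Conditioner sees the first half of x plus the scalar time t.
        self.net = nn.Sequential(
            nn.Linear(self.d + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x, t):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, b = self.net(torch.cat([x1, t], dim=1)).chunk(2, dim=1)
        s = torch.tanh(s)                      # keep log-scales bounded
        y2 = x2 * torch.exp(s) + b
        return torch.cat([x1, y2], dim=1), s.sum(dim=1)


class TemporalFlow(nn.Module):
    """Stack of couplings; evaluates log p(x | t) against a standard Gaussian base."""

    def __init__(self, dim, n_layers=4):
        super().__init__()
        self.dim = dim
        self.layers = nn.ModuleList(
            [TimeConditionedCoupling(dim) for _ in range(n_layers)]
        )

    def log_prob(self, x, t):
        z = x
        log_det = torch.zeros(x.shape[0], device=x.device)
        for layer in self.layers:
            z, ld = layer(z, t)
            z = z.flip(dims=[1])               # permute halves between layers
            log_det = log_det + ld
        log_base = -0.5 * (z ** 2).sum(dim=1) - 0.5 * self.dim * math.log(2 * math.pi)
        return log_base + log_det


def train(flow, snapshots, epochs=200, lr=1e-3):
    """Maximum-likelihood training on (samples, time) snapshot pairs.

    `snapshots` is assumed to be a list of (x, t) tensors, with x of shape
    (batch, dim) and t of shape (batch, 1), prepared from SDE sample paths.
    """
    opt = torch.optim.Adam(flow.parameters(), lr=lr)
    for _ in range(epochs):
        for x, t in snapshots:
            loss = -flow.log_prob(x, t).mean()
            opt.zero_grad()
            loss.backward()
            opt.step()
```

Pushing reference Gaussian samples through the inverse of the trained couplings at a fixed t would yield approximate samples from the density at that time; the inverse pass is omitted here for brevity.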
Extracting Stochastic Governing Laws by Nonlocal Kramers-Moyal Formulas
  • Yubin Lu, Yang Li, Jinqiao Duan
  • Mathematics
  • 2021
With the rapid development of computational techniques and scientific tools, great progress has been made in data-driven analysis for extracting governing laws of dynamical systems from data. Despite the…
Extracting stochastic dynamical systems with α-stable Lévy noise from data
  • Yang Li, Yubin Lu, Shengyuan Xu, Jinqiao Duan
  • Mathematics, Computer Science
  • 2021
TLDR
This work proposes a data-driven method to extract stochastic dynamical systems with α-stable Lévy noise from short-burst data, based on the properties of α-stable distributions, and approximates the drift coefficient by combining nonlocal Kramers-Moyal formulas with normalizing flows.

References

Showing 1-10 of 65 references
Generative Ensemble-Regression: Learning Stochastic Dynamics from Discrete Particle Ensemble Observations
TLDR
A new method is proposed for inferring the governing stochastic ordinary differential equations by observing particle ensembles at discrete and sparse time instants, i.e., multiple "snapshots", in analogy to the classic "point-regression", where the dynamics are inferred by performing regression in Euclidean space.
Solving Inverse Stochastic Problems from Discrete Particle Observations Using the Fokker-Planck Equation and Physics-informed Neural Networks
TLDR
A general framework based on physics-informed neural networks (PINNs) is proposed that introduces a new loss function using the Kullback-Leibler divergence to connect the stochastic samples with the Fokker-Planck (FP) equation, in order to simultaneously learn the equation and infer the multi-dimensional PDF at all times.
Neural Stochastic Differential Equations: Deep Latent Gaussian Models in the Diffusion Limit
TLDR
This work develops a variational inference framework for deep latent Gaussian models via stochastic automatic differentiation in Wiener space, where the variational approximations to the posterior are obtained by a Girsanov (mean-shift) transformation of the standard Wiener process and the computation of gradients is based on the theory of stochastic flows.
Generator estimation of Markov jump processes
TLDR
The purpose of this paper is to compile a catalogue of existing approaches to estimating the generator of a continuous-time Markov jump process from incomplete data, to compare their strengths and weaknesses, and to test their performance in a series of numerical examples.
Learning effective stochastic differential equations from microscopic simulations: combining stochastic numerics and deep learning
TLDR
This work identifies effective stochastic differential equations for coarse observables of fine-grained particle- or agent-based simulations and approximates the drift and diffusivity functions of these effective SDEs with neural networks, which can be thought of as effective stochastic ResNets.
Nonparametric estimation of stochastic differential equations with sparse Gaussian processes.
TLDR
This paper introduces a nonparametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series using a sparse Gaussian process approximation.
Estimating Long-Term Behavior of Flows without Trajectory Integration: The Infinitesimal Generator Approach
TLDR
This paper proposes to directly work with the infinitesimal generator instead of the operator, completely avoiding trajectory integration in the construction of an approximate transfer operator.
Temporal Normalizing Flows
TLDR
This paper extends the concept of normalizing flows to so-called temporal normalizing flows (tNFs) to estimate time-dependent distributions, leveraging the full spatio-temporal information present in the dataset.
Neural Jump Stochastic Differential Equations
TLDR
This work introduces Neural Jump Stochastic Differential Equations, which provide a data-driven approach to learning continuous and discrete dynamic behavior, i.e., hybrid systems that both flow and jump.
Density Estimation by Dual Ascent of the Log-Likelihood
A methodology is developed to assign, from an observed sample, a joint probability distribution to a set of continuous variables. The proposed algorithm performs this assignment by mapping the…