Corpus ID: 235446859

Learning effective stochastic differential equations from microscopic simulations: combining stochastic numerics and deep learning

@article{Dietrich2021LearningES,
  title={Learning effective stochastic differential equations from microscopic simulations: combining stochastic numerics and deep learning},
  author={Felix Dietrich and Alexei Makeev and George Kevrekidis and Nikolaos Evangelou and Tom S. Bertalan and Sebastian Reich and Ioannis G. Kevrekidis},
  journal={ArXiv},
  year={2021},
  volume={abs/2106.09004}
}
We identify effective stochastic differential equations (SDE) for coarse observables of fine-grained particle- or agent-based simulations; these SDE then provide coarse surrogate models of the fine-scale dynamics. We approximate the drift and diffusivity functions in these effective SDE through neural networks, which can be thought of as effective stochastic ResNets. The loss function is inspired by, and embodies, the structure of established stochastic numerical integrators (here, Euler-Maruyama…
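The loss described in the abstract can be made concrete through the transition density implied by the Euler-Maruyama scheme: for a step of size h, x_{n+1} given x_n is approximately Gaussian with mean x_n + f(x_n) h and variance sigma(x_n)^2 h. Below is a minimal sketch (not the authors' code) of such a likelihood-based loss for one-dimensional observables; the names drift_net, log_diff_net, and data_loader are illustrative assumptions.

import torch
import torch.nn as nn

def mlp():
    # Small feed-forward network; the architecture is an illustrative choice.
    return nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))

drift_net = mlp()      # approximates the drift f(x)
log_diff_net = mlp()   # approximates log sigma(x); exponentiation keeps sigma positive

def euler_maruyama_nll(x_now, x_next, h):
    # Negative log-likelihood of snapshot pairs under the Euler-Maruyama transition density.
    mean = x_now + drift_net(x_now) * h
    var = torch.exp(log_diff_net(x_now)) ** 2 * h
    return 0.5 * ((x_next - mean) ** 2 / var + torch.log(2 * torch.pi * var)).mean()

optimizer = torch.optim.Adam(
    list(drift_net.parameters()) + list(log_diff_net.parameters()), lr=1e-3)
# for x_now, x_next, h in data_loader:   # snapshot pairs from the fine-grained simulation
#     optimizer.zero_grad()
#     euler_maruyama_nll(x_now, x_next, h).backward()
#     optimizer.step()

In this reading, one learned integrator step x_{n+1} = x_n + f(x_n) h + sigma(x_n) sqrt(h) xi is a (stochastic) residual update, which is the sense in which the networks act as effective stochastic ResNets.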
Citations

Learning the temporal evolution of multivariate densities via normalizing flows
This work proposes a method to learn probability distributions using sample path data from stochastic differential equations and demonstrates that this approach can learn solutions to non-local Fokker-Planck equations, such as those arising in systems driven by both Brownian and Lévy noise.
Extracting Stochastic Governing Laws by Nonlocal Kramers-Moyal Formulas
  • Yubin Lu, Yang Li, Jinqiao Duan
  • Mathematics
  • 2021
With the rapid development of computational techniques and scientific tools, great progress has been made in data-driven analysis for extracting governing laws of dynamical systems from data. Despite the…
GFINNs: GENERIC Formalism Informed Neural Networks for Deterministic and Stochastic Dynamical Systems
  • Zhen Zhang, Yeonjong Shin, G. Karniadakis
  • Computer Science, Mathematics
  • ArXiv
  • 2021
We propose the GENERIC formalism informed neural networks (GFINNs) that obey the symmetric degeneracy conditions of the GENERIC formalism. GFINNs comprise two modules, each of which contains two…

References

Showing 1-10 of 56 references
Generative Ensemble-Regression: Learning Stochastic Dynamics from Discrete Particle Ensemble Observations
A new method is proposed for inferring the governing stochastic ordinary differential equations by observing particle ensembles at discrete and sparse time instants, i.e., multiple "snapshots," in analogy to the classic "point-regression," where the dynamics are inferred by performing regression in Euclidean space.
RODE-Net: Learning Ordinary Differential Equations with Randomness from Data
Numerical results show that the proposed RODE-Net can accurately estimate the distribution of model parameters from simulated data and make reliable predictions.
Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators
A new deep neural network called DeepONet can learn various mathematical operators with small generalization error, including explicit operators, such as integrals and fractional Laplacians, as well as implicit operators that represent deterministic and stochastic differential equations.
Neural Ordinary Differential Equations
This work shows how to scalably backpropagate through any ODE solver, without access to its internal operations, which allows end-to-end training of ODEs within larger models.
On learning Hamiltonian systems from data.
This work proposes to explore a particular type of underlying structure in the data: Hamiltonian systems, where an "energy" is conserved, and extracts an underlying phase space as well as the generating Hamiltonian from a collection of movies of a pendulum.
Learning stochastic differential equations with Gaussian processes without gradient matching
A novel paradigm for learning non-parametric drift and diffusion functions for stochastic differential equation (SDE) systems learns to simulate path distributions that match observations with non-uniform time increments and arbitrary sparseness.
Numerical methods for strong solutions of stochastic differential equations: an overview
  • K. Burrage, P. Burrage, T. Tian
  • Mathematics
  • Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences
  • 2004
This paper gives a review of recent progress in the design of numerical methods for computing the trajectories (sample paths) of solutions to stochastic differential equations. We give a brief survey…
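For concreteness, the simplest strong scheme covered by such reviews, and the one underlying the loss in the main paper, is Euler-Maruyama. The following is a minimal sketch assuming a scalar SDE dX = f(X) dt + sigma(X) dW, with placeholder callables f and sigma.

import numpy as np

def euler_maruyama(f, sigma, x0, h, n_steps, rng=None):
    # Simulate one sample path of dX = f(X) dt + sigma(X) dW with fixed step size h.
    if rng is None:
        rng = np.random.default_rng()
    path = [float(x0)]
    for _ in range(n_steps):
        x = path[-1]
        dw = rng.normal(scale=np.sqrt(h))   # Brownian increment over one step
        path.append(x + f(x) * h + sigma(x) * dw)
    return np.array(path)

# Example: an Ornstein-Uhlenbeck process dX = -X dt + 0.5 dW.
# path = euler_maruyama(lambda x: -x, lambda x: 0.5, x0=1.0, h=1e-2, n_steps=1000)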
Learning Poisson systems and trajectories of autonomous systems via Poisson neural networks
This work demonstrates through several simulations that PNNs are capable of handling several challenging tasks very accurately, including the motion of a particle in an electromagnetic potential, the nonlinear Schrödinger equation, and pixel observations of the two-body problem.
Auto-Encoding Variational Bayes
A stochastic variational inference and learning algorithm is introduced that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case.
Parsimonious Representation of Nonlinear Dynamical Systems Through Manifold Learning: A Chemotaxis Case Study
Nonlinear manifold learning algorithms, such as diffusion maps, have been fruitfully applied in recent years to the analysis of large and complex data sets. However, such algorithms still encounter…