LEARNING STOCHASTIC DIFFERENTIAL EQUATIONS WITH GAUSSIAN PROCESSES WITHOUT GRADIENT MATCHING

@article{Yildiz2018LEARNINGSD,
  title={LEARNING STOCHASTIC DIFFERENTIAL EQUATIONS WITH GAUSSIAN PROCESSES WITHOUT GRADIENT MATCHING},
  author={Çagatay Yildiz and Markus Heinonen and Jukka Intosalmi and Henrik Mannerstr{\"o}m and Harri L{\"a}hdesm{\"a}ki},
  journal={2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP)},
  year={2018},
  pages={1-6}
}
We introduce a novel paradigm for learning non-parametric drift and diffusion functions for stochastic differential equations (SDEs). The proposed model learns to simulate path distributions that match observations with non-uniform time increments and arbitrary sparseness, in contrast with gradient matching, which does not optimize simulated responses. We formulate sensitivity equations for learning and demonstrate that our general stochastic distribution optimisation leads to robust and…
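To make the abstract's central idea concrete, below is a minimal, hedged sketch (not the authors' implementation) of simulating path distributions for dx = f(x) dt + g(x) dW when the drift f and diffusion g are represented through Gaussian-process posterior means over inducing points. All names (rbf, gp_mean, inducing_x, f_u, g_u) are illustrative assumptions; the actual method additionally learns the inducing values by matching simulated path distributions to observations via sensitivity equations.

```python
# Hedged sketch: Euler-Maruyama simulation of SDE paths with GP-represented
# drift and diffusion. Illustrative only; not the authors' code.
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_mean(x, inducing_x, inducing_y, jitter=1e-6):
    """GP posterior mean at points x, conditioned on inducing values."""
    K = rbf(inducing_x, inducing_x) + jitter * np.eye(len(inducing_x))
    k = rbf(x, inducing_x)
    return k @ np.linalg.solve(K, inducing_y)

def simulate_paths(x0, inducing_x, f_u, g_u, dt=0.01, n_steps=200, n_paths=50, seed=0):
    """Euler-Maruyama simulation of a path distribution from initial state x0."""
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, x0, dtype=float)
    paths = [x.copy()]
    for _ in range(n_steps):
        drift = gp_mean(x, inducing_x, f_u)          # f(x) from the drift GP
        diff = np.abs(gp_mean(x, inducing_x, g_u))   # g(x) >= 0 from the diffusion GP
        x = x + drift * dt + diff * np.sqrt(dt) * rng.standard_normal(n_paths)
        paths.append(x.copy())
    return np.stack(paths)  # shape (n_steps + 1, n_paths)

# Example: Ornstein-Uhlenbeck-like inducing values, then simulate a path distribution
inducing_x = np.linspace(-2.0, 2.0, 10)
paths = simulate_paths(x0=1.0, inducing_x=inducing_x,
                       f_u=-inducing_x, g_u=0.3 * np.ones_like(inducing_x))
```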

Citations

Deep learning with differential Gaussian process flows
TLDR
A novel deep learning paradigm of differential flows that learn stochastic differential equation transformations of inputs prior to a standard classification or regression function is proposed, demonstrating excellent results as compared to deep Gaussian processes and Bayesian neural networks.
Sparse Gaussian Processes for Stochastic Differential Equations
TLDR
An approximate (variational) inference algorithm is derived and a novel parameterization of the approximate distribution over paths using a sparse Markovian Gaussian process is proposed, allowing the use of well-established optimization algorithms such as natural gradient descent for better convergence.
Learning stochastic dynamical systems with neural networks mimicking the Euler-Maruyama scheme
TLDR
A data-driven approach where the parameters of the SDE are represented by a neural network with a built-in SDE integration scheme and the loss function is based on a maximum likelihood criterion, under an order-one Markov Gaussian assumption.
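As a rough illustration of the likelihood criterion summarized above (an assumption-laden sketch, not the cited implementation): under the order-one Markov Gaussian assumption, the Euler-Maruyama discretization makes each observed increment conditionally Gaussian, x_{k+1} | x_k ~ N(x_k + f(x_k) dt_k, g(x_k)^2 dt_k), so training reduces to minimizing a Gaussian negative log-likelihood in the drift f and diffusion g (neural networks in the cited work; plain callables here).

```python
# Hedged sketch of an Euler-Maruyama maximum-likelihood loss; names are illustrative.
import numpy as np

def em_negative_log_likelihood(x, dt, drift_fn, diffusion_fn):
    """Negative log-likelihood of a 1-D trajectory x observed at spacings dt."""
    x_curr, x_next = x[:-1], x[1:]
    mean = x_curr + drift_fn(x_curr) * dt               # Gaussian transition mean
    var = diffusion_fn(x_curr) ** 2 * dt + 1e-12        # transition variance + jitter
    resid = x_next - mean
    return 0.5 * np.sum(resid ** 2 / var + np.log(2.0 * np.pi * var))

# Example with a linear drift and constant diffusion on hypothetical data
x = np.cumsum(np.random.default_rng(0).standard_normal(100) * 0.1)
dt = np.full(len(x) - 1, 0.1)
nll = em_negative_log_likelihood(x, dt, drift_fn=lambda s: -0.5 * s,
                                 diffusion_fn=lambda s: 0.3 * np.ones_like(s))
```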
Learning effective stochastic differential equations from microscopic simulations: linking stochastic numerics to deep learning
TLDR
This approach does not require long trajectories, works on scattered snapshot data, and is designed to naturally handle different time steps per snapshot; it also lends itself naturally to "physics-informed" gray-box identification when approximate coarse models, such as mean-field equations, are available.
Infinite-dimensional optimization and Bayesian nonparametric learning of stochastic differential equations
The paper has two major themes. The first part of the paper establishes certain general results for infinite-dimensional optimization problems on Hilbert spaces. These results cover the classical
Monotonic Gaussian Process Flows
TLDR
A nonparametric model of monotonic functions that allows for interpretable priors and principled quantification of hierarchical uncertainty is derived, and its efficacy is demonstrated by providing competitive results against other probabilistic monotonic models on a number of benchmark functions.
Learning effective stochastic differential equations from microscopic simulations: combining stochastic numerics and deep learning
TLDR
This work identifies effective stochastic differential equations for coarse observables of fine-grained particle- or agent-based simulations and approximates the drift and diffusivity functions of these effective SDEs with neural networks, which can be thought of as effective stochastic ResNets.
A Nonparametric Spatio-temporal SDE Model
  • C. Citro
  • Mathematics, Computer Science
  • 2018
TLDR
The experiments demonstrate that the spatio-temporal model is better able to fit a real-world data set with complex dynamics than the spatial model, and can also reduce the forecasting error.
We propose a nonparametric spatio-temporal stochastic differential equation (SDE) model that can learn the underlying dynamics of arbitrary continuous-time systems without prior knowledge. We
...

References

Approximate Bayes learning of stochastic differential equations.
TLDR
A nonparametric approach for estimating drift and diffusion functions in systems of stochastic differential equations from observations of the state vector is introduced, together with an approximate expectation-maximization algorithm to deal with the unobserved, latent dynamics between sparse observations.
Nonparametric estimation of stochastic differential equations with sparse Gaussian processes.
TLDR
This paper introduces a nonparametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series using a sparse Gaussian process approximation.
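A hedged sketch of the idea behind this kind of estimator, under the simplifying assumption of a densely observed 1-D series: the scaled increments (x_{k+1} - x_k) / dt are noisy evaluations of the drift, which can then be regressed with a Gaussian process. A full sparse approximation would add inducing points; the helper names below are hypothetical and the code is illustrative only.

```python
# Illustrative GP regression of the drift from finite-difference targets.
import numpy as np

def rbf(a, b, lengthscale=0.5, variance=1.0):
    """Squared-exponential kernel between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def estimate_drift(x, dt, x_test, noise_var):
    """GP posterior mean of the drift evaluated at x_test."""
    inputs, targets = x[:-1], (x[1:] - x[:-1]) / dt     # noisy drift evaluations
    K = rbf(inputs, inputs) + noise_var * np.eye(len(inputs))
    k = rbf(x_test, inputs)
    return k @ np.linalg.solve(K, targets)

# Example: recover an Ornstein-Uhlenbeck drift f(x) = -x from a simulated path
rng = np.random.default_rng(0)
x = np.zeros(500)
for k in range(499):
    x[k + 1] = x[k] - x[k] * 0.01 + 0.3 * np.sqrt(0.01) * rng.standard_normal()
# Increment-target noise variance is g^2 / dt under the Euler discretization
drift_hat = estimate_drift(x, dt=0.01, x_test=np.linspace(-1.0, 1.0, 20),
                           noise_var=0.3 ** 2 / 0.01)
```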
Gaussian Process Approximations of Stochastic Differential Equations
TLDR
A novel Gaussian process approximation to the posterior measure over paths for a general class of stochastic differential equations in the presence of observations is presented, and the results are very promising as the variational approximate solution outperforms standard Gaussian process regression for non-Gaussian Markov processes.
Learning unknown ODE models with Gaussian processes
TLDR
This work proposes to learn non-linear, unknown differential functions from state observations using Gaussian process vector fields within the exact ODE formalism, and demonstrates the model's capabilities to infer dynamics from sparse data and to simulate the system forward into the future.
Parameter estimation for differential equations: a generalized smoothing approach
TLDR
A new method that uses noisy measurements on a subset of variables to estimate the parameters defining a system of non‐linear differential equations, based on a modification of data smoothing methods along with a generalization of profiled estimation is described.
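For contrast with the simulation-based approach of the main paper, here is a minimal sketch of the gradient-matching / generalized-smoothing idea, assuming a simple polynomial smoother in place of the original spline basis (all function names are illustrative): smooth the noisy observations, then choose the ODE parameters so that the smoother's derivative matches the model right-hand side.

```python
# Hedged sketch of gradient matching; a polynomial fit stands in for the spline
# basis of the original generalized smoothing method.
import numpy as np

def gradient_matching_loss(theta, t, y, rhs, degree=5):
    """Squared mismatch between the smoother's derivative and rhs(y_hat, theta)."""
    coeffs = np.polyfit(t, y, degree)            # smooth the noisy observations
    y_hat = np.polyval(coeffs, t)                # smoothed states
    dy_hat = np.polyval(np.polyder(coeffs), t)   # derivative of the smoother
    return np.sum((dy_hat - rhs(y_hat, theta)) ** 2)

# Example: score candidate decay rates for dy/dt = -theta * y
t = np.linspace(0.0, 5.0, 50)
y = np.exp(-0.8 * t) + 0.01 * np.random.default_rng(1).standard_normal(50)
losses = {th: gradient_matching_loss(th, t, y, lambda s, th: -th * s)
          for th in (0.5, 0.8, 1.1)}
```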
A Unifying View of Sparse Approximate Gaussian Process Regression
TLDR
A new unifying view, including all existing proper probabilistic sparse approximations for Gaussian process regression, relies on expressing the effective prior which the methods are using, and highlights the relationship between existing methods.
Fitting population dynamic models to time-series data by gradient matching
We describe and test a method for fitting noisy differential equation models to a time series of population counts, motivated by stage-structured models of insect and zooplankton populations. We
Gaussian Processes for Machine Learning
TLDR
The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
Modeling magnetic fields using Gaussian processes
TLDR
The model is a Gaussian process which exploits the divergence- and curl-free properties of the magnetic field by combining well-known model components in a novel manner to produce Bayesian nonparametric maps of magnetized objects.
...