Learning Stochastic Differential Equations with Gaussian Processes without Gradient Matching

@article{Yildiz2018LEARNINGSD,
  title={Learning Stochastic Differential Equations with Gaussian Processes without Gradient Matching},
  author={{\c{C}}a{\u{g}}atay Y{\i}ld{\i}z and Markus Heinonen and Jukka Intosalmi and Henrik Mannerstr{\"o}m and Harri L{\"a}hdesm{\"a}ki},
  journal={2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP)},
  year={2018},
  pages={1-6}
}
We introduce a novel paradigm for learning non-parametric drift and diffusion functions for stochastic differential equations (SDEs). The proposed model learns to simulate path distributions that match observations with non-uniform time increments and arbitrary sparseness, in contrast to gradient matching, which does not optimize simulated responses. We formulate sensitivity equations for learning and demonstrate that our general stochastic distribution optimisation leads to robust and…
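As an illustrative sketch (not the paper's actual method), simulating SDE path distributions on a non-uniform, arbitrarily sparse time grid can be done with the Euler–Maruyama scheme; the function names and the Ornstein–Uhlenbeck drift and diffusion below are hypothetical choices for the example:

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, t_grid, n_paths, rng):
    """Simulate paths of dX = f(X) dt + g(X) dW on a (possibly non-uniform) time grid."""
    x = np.full(n_paths, float(x0))
    paths = [x.copy()]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        dt = t1 - t0
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)  # Brownian increment
        x = x + drift(x) * dt + diffusion(x) * dw
        paths.append(x.copy())
    return np.stack(paths)  # shape: (len(t_grid), n_paths)

rng = np.random.default_rng(0)
# Ornstein-Uhlenbeck example: mean-reverting drift, constant diffusion
paths = euler_maruyama(lambda x: -0.5 * x,
                       lambda x: 0.3 * np.ones_like(x),
                       x0=1.0,
                       t_grid=np.array([0.0, 0.1, 0.25, 0.5, 1.0]),  # non-uniform increments
                       n_paths=1000, rng=rng)
print(paths.shape)  # (5, 1000)
```

A learned model would replace the hand-written drift and diffusion lambdas with non-parametric (e.g. Gaussian process) function estimates and compare the simulated path distribution against the observations.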


Deep learning with differential Gaussian process flows
TLDR: A novel deep learning paradigm of differential flows is proposed that learns stochastic differential equation transformations of inputs prior to a standard classification or regression function, demonstrating excellent results compared to deep Gaussian processes and Bayesian neural networks.
Sparse Gaussian Processes for Stochastic Differential Equations
TLDR: An approximate (variational) inference algorithm is derived and a novel parameterization of the approximate distribution over paths using a sparse Markovian Gaussian process is proposed, allowing the use of well-established optimization algorithms such as natural gradient descent for better convergence.
Learning stochastic dynamical systems with neural networks mimicking the Euler-Maruyama scheme
TLDR: A data-driven approach in which the parameters of the SDE are represented by a neural network with a built-in SDE integration scheme, and the loss function is based on a maximum-likelihood criterion under order-one Markov Gaussian assumptions.
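The maximum-likelihood criterion mentioned above can be sketched as follows: under an order-one Markov Gaussian assumption, each Euler–Maruyama transition X_{t+Δt} | X_t = x is Gaussian with mean x + f(x)Δt and variance g(x)²Δt, so the likelihood of an observed path factorizes over transitions. This is a minimal illustrative sketch, not that paper's implementation; all names are hypothetical:

```python
import numpy as np

def em_neg_log_likelihood(xs, ts, drift, diffusion):
    """Negative log-likelihood of an observed 1-D path under the Euler-Maruyama
    Gaussian transition density N(x + f(x) dt, g(x)^2 dt)."""
    nll = 0.0
    for (x0, x1), (t0, t1) in zip(zip(xs[:-1], xs[1:]), zip(ts[:-1], ts[1:])):
        dt = t1 - t0
        mean = x0 + drift(x0) * dt          # predicted mean of the next state
        var = diffusion(x0) ** 2 * dt       # predicted variance of the next state
        nll += 0.5 * (np.log(2 * np.pi * var) + (x1 - mean) ** 2 / var)
    return nll

# Example: pure Brownian motion (zero drift, unit diffusion)
xs = np.array([0.0, 0.1, -0.2])
ts = np.array([0.0, 0.5, 1.0])
loss = em_neg_log_likelihood(xs, ts, lambda x: 0.0, lambda x: 1.0)
```

In the neural-network setting, `drift` and `diffusion` would be network outputs and `loss` would be minimized by gradient descent with respect to the network weights.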
Infinite-dimensional optimization and Bayesian nonparametric learning of stochastic differential equations
The paper has two major themes. The first part of the paper establishes certain general results for infinite-dimensional optimization problems on Hilbert spaces. These results cover the classical
Monotonic Gaussian Process Flows
TLDR: A nonparametric model of monotonic functions that allows for interpretable priors and principled quantification of hierarchical uncertainty is derived, and its efficacy is demonstrated by competitive results against other probabilistic monotonic models on a number of benchmark functions.
Learning effective stochastic differential equations from microscopic simulations: combining stochastic numerics and deep learning
TLDR: This work identifies effective stochastic differential equations for coarse observables of fine-grained particle- or agent-based simulations, and approximates the drift and diffusivity functions in these effective SDEs with neural networks, which can be thought of as effective stochastic ResNets.
Scalable Inference in SDEs by Direct Matching of the Fokker-Planck-Kolmogorov Equation
TLDR: This work revisits the classical SDE literature and derives direct approximations to the (typically intractable) Fokker–Planck–Kolmogorov equation by matching moments, avoiding the need for sampling schemes.
A Nonparametric Spatio-temporal SDE Model
TLDR: The experiments demonstrate that the spatio-temporal model fits a real-world data set with complex dynamics better than the spatial model, and can also reduce the forecasting error.
Identifying Latent Stochastic Differential Equations
TLDR: This work uses recent results on identifiability of latent variable models to show that the proposed model can recover not only the underlying SDE coefficients but also the original latent variables, up to an isometry, in the limit of infinite data.

References

Showing 1-10 of 26 references
Approximate Bayes learning of stochastic differential equations.
TLDR: A nonparametric approach for estimating drift and diffusion functions in systems of stochastic differential equations from observations of the state vector, and an approximate expectation-maximization algorithm to deal with the unobserved, latent dynamics between sparse observations, are introduced.
Approximate Gaussian process inference for the drift function in stochastic differential equations
TLDR: A nonparametric approach for estimating drift functions in systems of stochastic differential equations from incomplete observations of the state vector is introduced, and an approximate EM algorithm is developed to deal with the unobserved, latent dynamics between observations.
Nonparametric estimation of stochastic differential equations with sparse Gaussian processes.
TLDR: This paper introduces a nonparametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series using a sparse Gaussian process approximation.
Gaussian Process Approximations of Stochastic Differential Equations
TLDR: A novel Gaussian process approximation to the posterior measure over paths for a general class of stochastic differential equations in the presence of observations is presented; the results are very promising, as the variational approximate solution outperforms standard Gaussian process regression for non-Gaussian Markov processes.
Learning unknown ODE models with Gaussian processes
TLDR: This work proposes to learn non-linear, unknown differential functions from state observations using Gaussian process vector fields within the exact ODE formalism, and demonstrates the model's capability to infer dynamics from sparse data and to simulate the system forward into the future.
Gaussian Processes for Big Data
TLDR: Stochastic variational inference for Gaussian process models is introduced, and it is shown how GPs can be variationally decomposed to depend on a set of globally relevant inducing variables which factorize the model in the manner necessary to perform variational inference.
Stochastic differential equations : an introduction with applications
Contents: Some Mathematical Preliminaries; Itô Integrals; The Itô Formula and the Martingale Representation Theorem; Stochastic Differential Equations; The Filtering Problem; Diffusions: Basic…
Gaussian Process Dynamical Models for Human Motion
TLDR: This work marginalizes out the model parameters in closed form by using Gaussian process priors for both the dynamical and the observation mappings, resulting in a nonparametric model for dynamical systems that accounts for uncertainty in the model.
Parameter estimation for differential equations: a generalized smoothing approach
TLDR: A new method is described that uses noisy measurements on a subset of variables to estimate the parameters defining a system of non-linear differential equations, based on a modification of data smoothing methods along with a generalization of profiled estimation.