Corpus ID: 51875354

Learning unknown ODE models with Gaussian processes

@inproceedings{Heinonen2018LearningUO,
  title={Learning unknown ODE models with Gaussian processes},
  author={Markus Heinonen and Çagatay Yildiz and Henrik Mannerstr{\"o}m and Jukka Intosalmi and Harri L{\"a}hdesm{\"a}ki},
  booktitle={ICML},
  year={2018}
}
In conventional ODE modelling, the coefficients of an equation driving the system state forward in time are estimated. However, for many complex systems it is practically impossible to determine the equations or interactions governing the underlying dynamics. In these settings, a parametric ODE model cannot be formulated. Here, we overcome this issue by introducing a novel paradigm of nonparametric ODE modelling that can learn the underlying dynamics of arbitrary continuous-time systems without prior…
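
To make the nonparametric ODE idea in the abstract concrete, here is a minimal, hedged Python sketch of one simple variant: a Gaussian process is fit from observed states to finite-difference derivative estimates (gradient matching), yielding a nonparametric vector field that can be plugged into an ODE solver. The Van der Pol test system, the scipy/scikit-learn calls, and the gradient-matching shortcut are illustrative choices of this sketch, not the paper's own algorithm, which instead learns the GP vector field by matching full simulated trajectories.

```python
# Sketch only: learn an unknown ODE vector field f in dx/dt = f(x)
# nonparametrically with a GP, via crude gradient matching on a noisy
# trajectory. This is NOT the authors' exact method; it illustrates the
# general paradigm of replacing a parametric ODE with a GP vector field.
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def vdp(t, x):                      # ground-truth dynamics (unknown to the model)
    return [x[1], (1 - x[0] ** 2) * x[1] - x[0]]

t_obs = np.linspace(0, 10, 200)
sol = solve_ivp(vdp, (0, 10), [2.0, 0.0], t_eval=t_obs)
X = sol.y.T + 0.05 * np.random.randn(*sol.y.T.shape)   # noisy state observations

# Finite-difference derivative estimates serve as gradient-matching targets.
dX = np.gradient(X, t_obs, axis=0)

# One GP per output dimension maps state x -> estimated dx/dt.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gps = [GaussianProcessRegressor(kernel=kernel).fit(X, dX[:, d]) for d in range(2)]

def f_hat(t, x):                    # learned vector field, usable in any ODE solver
    return [gp.predict(np.asarray(x).reshape(1, -1))[0] for gp in gps]

# Forecast with the learned dynamics from a new initial state.
pred = solve_ivp(f_hat, (0, 10), [1.0, 0.0], t_eval=t_obs)
```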

Citations

Variational Bridge Constructs for Grey Box Modelling with Gaussian Processes
TLDR
This paper introduces a method for inference of heterogeneous dynamical systems where part of the dynamics is known, in the form of an ordinary differential equation (ODE), with some functional input that is unknown, and shows how it can be extended to models with non-Gaussian likelihoods, such as count data.
Bayesian inference of ODEs with Gaussian processes
TLDR
This work proposes a novel Bayesian nonparametric model that uses Gaussian processes to infer posteriors of unknown ODE systems directly from data, and derives sparse variational inference with decoupled functional sampling to represent vector field posteriors.
Learning Interacting Dynamical Systems with Latent Gaussian Process ODEs
TLDR
A new model is proposed that accurately decomposes the independent dynamics of single objects from their interactions by employing latent Gaussian process ordinary differential equations; it is observed that only this model can successfully encapsulate independent dynamics and interaction information in distinct functions.
ODIN: ODE-Informed Regression for Parameter and State Inference in Time-Continuous Dynamical Systems
TLDR
This work introduces a novel generative modeling approach based on constrained Gaussian processes and leverages it to build a computationally and data efficient algorithm for state and parameter inference.
Black-Box Inference for Non-Linear Latent Force Models
TLDR
This paper uses black-box variational inference to jointly estimate the posterior of latent force models, designing a multivariate extension to local inverse autoregressive flows as a flexible approximator of the system.
Graphical modelling in continuous-time: consistency guarantees and algorithms using Neural ODEs
TLDR
A score-based learning algorithm based on penalized Neural Ordinary Differential Equations is proposed that is shown to be applicable to the general setting of irregularly-sampled multivariate time series and to outperform the state of the art across a range of dynamical systems.
Learning ODE Models with Qualitative Structure Using Gaussian Processes
TLDR
This work proposes an approach to learning a vector field of differential equations using sparse Gaussian Processes that allows us to combine data and additional structural information, like Lie Group symmetries and fixed points, and shows that this combination improves extrapolation and long-term behaviour, and reduces computational cost.
Variational multiple shooting for Bayesian ODEs with Gaussian processes
TLDR
A novel Bayesian nonparametric model that uses Gaussian processes to infer posteriors of unknown ODE systems directly from data is proposed and sparse variational inference with decoupled functional sampling is derived to represent vector field posteriors.
Gaussian processes meet NeuralODEs: a Bayesian framework for learning the dynamics of partially observed systems from scarce and noisy data
TLDR
The proposed GP-NODE method takes advantage of differentiable programming to propagate gradient information through ordinary differential equation solvers and performs Bayesian inference with respect to unknown model parameters using Hamiltonian Monte Carlo sampling and Gaussian process priors over the observed system states.
MAGI-X: Manifold-Constrained Gaussian Process Inference for Unknown System Dynamics
TLDR
A fast and accurate data-driven method to learn the unknown dynamics from observation data in a nonparametric fashion, without the need for any domain knowledge, within the MAnifold-constrained Gaussian process Inference (MAGI) framework, which completely circumvents numerical integration.
...

References

SHOWING 1-10 OF 49 REFERENCES
Accelerating Bayesian Inference over Nonlinear Differential Equations with Gaussian Processes
TLDR
This work presents an accelerated sampling procedure which enables Bayesian inference of parameters in nonlinear ordinary and delay differential equations via the novel use of Gaussian processes (GPs).
Learning nonparametric differential equations with operator-valued kernels and gradient matching
TLDR
This work introduces a general framework for nonparametric ODE models using penalized regression in Reproducing Kernel Hilbert Spaces (RKHS) based on operator-valued kernels, and extends the scope of gradient matching approaches to nonparametric ODE models.
Gaussian Processes for Bayesian Estimation in Ordinary Differential Equations
TLDR
A Gaussian process model is proposed that directly links state derivative information with system observations, simplifying previous approaches and improving estimation accuracy in coupled ordinary differential equations.
Gaussian Process Approximations of Stochastic Differential Equations
TLDR
A novel Gaussian process approximation to the posterior measure over paths for a general class of stochastic differential equations in the presence of observations is presented; the results are very promising, as the variational approximate solution outperforms standard Gaussian process regression for non-Gaussian Markov processes.
Nonparametric estimation of stochastic differential equations with sparse Gaussian processes.
TLDR
This paper introduces a nonparametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series using a sparse Gaussian process approximation.
Gaussian Process Dynamical Models
TLDR
This paper marginalizes out the model parameters in closed form, using Gaussian process (GP) priors for both the dynamics and the observation mappings, resulting in a nonparametric model for dynamical systems that accounts for uncertainty in the model.
Variational Dependent Multi-output Gaussian Process Dynamical Systems
TLDR
The proposed model is shown to be superior for modelling dynamical systems thanks to its more reasonable dependence assumption and fully Bayesian learning framework, and it can be flexibly extended to handle regression problems.
Bayesian inference for differential equations
Inferring solutions of differential equations using noisy multi-fidelity data
Variational Gaussian Process State-Space Models
TLDR
This work presents a procedure for efficient variational Bayesian learning of nonlinear state-space models based on sparse Gaussian processes, which makes it possible to straightforwardly trade off model capacity against computational cost whilst avoiding overfitting.
...