• Corpus ID: 88522998

Fast approximate Bayesian inference for stable differential equation models

Authors: Philip Maybank, Ingo Bojak, Richard G. Everitt
Journal: arXiv: Computation
Inference for mechanistic models is challenging because of nonlinear interactions between model parameters and a lack of identifiability. Here we focus on a specific class of mechanistic models, which we term stable differential equations. The dynamics in these models are approximately linear around a stable fixed point of the system. We exploit this property to develop fast approximate methods for posterior inference. We illustrate our approach using simulated data on a mechanistic… 
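The stability property described in the abstract can be sketched in miniature. For a hypothetical one-dimensional SDE with a stable fixed point, linearising the drift around that fixed point gives a Gaussian stationary approximation that can serve as a fast surrogate likelihood. The model, drift function and parameter values below are illustrative assumptions, not the paper's actual neural-population model; only numpy is used.

```python
import numpy as np

# Hypothetical 1-D mechanistic model: dx = f(x; a) dt + sigma dW,
# with drift f(x; a) = a - x**3 - x (purely illustrative).
def drift(x, a):
    return a - x**3 - x

def drift_deriv(x):
    # df/dx = -3 x^2 - 1 < 0 everywhere, so the fixed point is stable.
    return -3.0 * x**2 - 1.0

def fixed_point(a):
    # Solve f(x*; a) = 0 by bisection (drift is strictly decreasing in x).
    lo, hi = -10.0, 10.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if drift(mid, a) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def approx_loglik(x_obs, a, sigma):
    """Linearise around x*(a): dx ~ -theta (x - x*) dt + sigma dW,
    so the stationary law is N(x*, sigma^2 / (2 theta))."""
    xs = fixed_point(a)
    theta = -drift_deriv(xs)          # positive at a stable fixed point
    var = sigma**2 / (2.0 * theta)
    r = x_obs - xs
    return -0.5 * np.sum(np.log(2 * np.pi * var) + r**2 / var)

# Toy stationary data near the fixed point for a_true = 0.5
rng = np.random.default_rng(0)
a_true, sigma = 0.5, 0.3
xs_true = fixed_point(a_true)
theta_true = -drift_deriv(xs_true)
x_obs = xs_true + rng.normal(0.0, sigma / np.sqrt(2 * theta_true), size=500)

# Fast approximate posterior (flat prior) evaluated on a grid of a
grid = np.linspace(-1.0, 2.0, 301)
logp = np.array([approx_loglik(x_obs, a, sigma) for a in grid])
a_hat = grid[np.argmax(logp)]
```

The point of the surrogate is that each likelihood evaluation costs one root-find plus a Gaussian density, instead of a full forward simulation of the nonlinear dynamics.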


MCMC for Bayesian Uncertainty Quantification from Time-Series Data
This paper presents C++ software for Bayesian uncertainty quantification in the parameters of neural population models (NPMs) from approximately stationary data using Markov chain Monte Carlo (MCMC).
Parameter estimation and identifiability in a neural population model for electro-cortical activity
Unbiased fits of a 22-parameter neural population model to EEG spectra from 82 subjects, all exhibiting alpha-oscillations, indicate that inhibition plays a central role in the generation and modulation of the human alpha-rhythm. The eigenvalues of the Fisher information matrix are roughly uniformly spaced over a log scale, indicating that the model is sloppy, like many of the regulatory network models in systems biology.
Spectral density-based and measure-preserving ABC for partially observed diffusion processes. An illustration on Hamiltonian SDEs
The derived summaries are particularly robust to the model simulation; combined with the proposed reliable numerical scheme, they yield accurate ABC inference.
Ensemble MCMC: Accelerating Pseudo-Marginal MCMC for State Space Models using the Ensemble Kalman Filter
This paper exploits the ensemble Kalman filter (EnKF) developed in the data assimilation literature to speed up pMCMC and demonstrates that the new ensemble MCMC (eMCMC) method can significantly reduce the computational cost whilst maintaining reasonable accuracy.
Bootstrapped synthetic likelihood
The use of the bag of little bootstraps is investigated as a means of applying this approach to large datasets, yielding Monte Carlo algorithms that accurately approximate posterior distributions whilst only simulating subsamples of the full data.
27th Annual Computational Neuroscience Meeting (CNS*2018): Part Two
Robust regulation of neuronal dynamics by the Na/K pump. Gennady Cymbalyuk, Christian Erxleben, Angela Wenning-Erxleben, Ronald Calabrese. Georgia State University, Neuroscience Institute, Atlanta, GA, United States.


Inference for SDE Models via Approximate Bayesian Computation
This work considers simulation studies for a pharmacokinetics/pharmacodynamics model and for stochastic chemical reactions and provides a Matlab package that implements the proposed ABC-MCMC algorithm.
Markov chain Monte Carlo approach to parameter estimation in the FitzHugh-Nagumo model.
A Bayesian framework for parameter estimation is proposed, which can handle multidimensional nonlinear diffusions with large time scale separation and is illustrated on simulated data.
Dynamic causal modeling with neural fields
Dynamic causal models of steady-state responses
Annealed Importance Sampling for Neural Mass Models
This paper implements AIS using proposals derived from Langevin Monte Carlo (LMC), which uses local gradient and curvature information for efficient exploration of parameter space, and finds evidence of non-Gaussianity in the posterior distributions of neural mass models.
Online Bayesian Inference in Some Time-Frequency Representations of Non-Stationary Processes
A new framework is introduced that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process, using a likelihood inspired by a local Whittle approximation.
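The Whittle-style likelihood underlying such spectral approaches can be illustrated in its simplest batch (non-local, non-online) form: match the periodogram of the data to a parametric spectral density. The AR(1) model and all numerical choices below are an illustrative assumption, not the paper's online local-Whittle method.

```python
import numpy as np

def ar1_spectral_density(omega, phi, sigma2):
    # Spectral density of an AR(1): f(w) = sigma2 / (2 pi |1 - phi e^{-iw}|^2)
    return sigma2 / (2 * np.pi * np.abs(1 - phi * np.exp(-1j * omega))**2)

def whittle_loglik(x, phi, sigma2):
    n = len(x)
    # Periodogram at Fourier frequencies w_j = 2 pi j / n, j = 1..n//2 - 1
    I = np.abs(np.fft.fft(x))**2 / (2 * np.pi * n)
    j = np.arange(1, n // 2)
    omega = 2 * np.pi * j / n
    f = ar1_spectral_density(omega, phi, sigma2)
    # Whittle approximation: -sum_j [ log f(w_j) + I(w_j) / f(w_j) ]
    return -np.sum(np.log(f) + I[j] / f)

# Simulate an AR(1) series and recover phi by maximising the Whittle likelihood
rng = np.random.default_rng(1)
phi_true, sigma2 = 0.7, 1.0
n = 4096
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal(0.0, np.sqrt(sigma2))

grid = np.linspace(-0.95, 0.95, 191)
ll = np.array([whittle_loglik(x, p, sigma2) for p in grid])
phi_hat = grid[np.argmax(ll)]
```

The attraction in this context is that the likelihood is evaluated in the frequency domain from one FFT of the data, avoiding any time-domain state filtering.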
On Particle Methods for Parameter Estimation in State-Space Models
This paper presents a comprehensive review of particle methods that have been proposed to perform static parameter estimation in state-space models and discusses the advantages and limitations of these methods and illustrates their performance on simple models.
Delayed acceptance particle MCMC for exact inference in stochastic kinetic models
To avoid expensive calculations for proposals that are likely to be rejected, a delayed-acceptance step is used; the method is illustrated by inference for parameters governing a Lotka–Volterra system, a model of gene expression and a simple epidemic process.
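The general delayed-acceptance idea can be sketched with scalar toy densities: screen each proposal with a cheap surrogate, and only evaluate the expensive target for proposals that survive the first stage. The densities below are stand-ins (the surrogate's correction ratio in stage two keeps the exact target invariant); this is not the particle-MCMC version used in the paper.

```python
import numpy as np

def log_target(x):
    # "Expensive" exact log-density (stand-in for e.g. a particle-filter estimate)
    return -0.5 * x**2

def log_surrogate(x):
    # Cheap approximation with a deliberately wrong scale
    return -0.5 * x**2 / 1.5

rng = np.random.default_rng(2)
x, chain, expensive_calls = 0.0, [], 0
for _ in range(20000):
    y = x + rng.normal(0.0, 1.0)        # symmetric random-walk proposal
    # Stage 1: screen with the cheap surrogate only
    if np.log(rng.uniform()) < log_surrogate(y) - log_surrogate(x):
        # Stage 2: correct with the exact target; the surrogate ratio cancels,
        # so the overall acceptance probability targets log_target exactly.
        expensive_calls += 1
        if np.log(rng.uniform()) < (log_target(y) - log_target(x)) \
                                 - (log_surrogate(y) - log_surrogate(x)):
            x = y
    chain.append(x)
chain = np.array(chain)
```

Proposals rejected at stage one never touch the expensive density, which is where the computational saving comes from.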
Riemann manifold Langevin and Hamiltonian Monte Carlo methods
The methodology proposed automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density, and substantial improvements in the time‐normalized effective sample size are reported when compared with alternative sampling approaches.
Importance sampling type correction of Markov chain Monte Carlo and exact approximations
This work uses an importance sampling (IS) type correction of approximate Markov chain Monte Carlo (MCMC) output to provide consistent estimators, proving strong consistency under mild assumptions and providing central limit theorems with expressions for the asymptotic variances.