Likelihood-Free Inference with Deep Gaussian Processes

@article{Aushev2022LikelihoodFreeIW,
  title={Likelihood-Free Inference with Deep Gaussian Processes},
  author={Alexander Aushev and Henri Pesonen and Markus Heinonen and Jukka Corander and Samuel Kaski},
  journal={ArXiv},
  year={2022},
  volume={abs/2006.10571}
}
Warped Gradient-Enhanced Gaussian Process Surrogate Models for Inference with Intractable Likelihoods
TLDR
This article proposes a warped, gradient-enhanced Gaussian process surrogate model for the likelihood function, which jointly models the sample means and variances of the sufficient statistics and uses warping functions to capture covariance nonstationarity in the input parameter space.
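
To make the warping idea concrete, here is a minimal sketch of a nonstationary kernel that computes distances in a warped input space. The Kumaraswamy-CDF-style warp, the hyperparameter names, and all values are assumptions chosen for illustration, not the exact construction used in the paper.

    import numpy as np

    def warp(theta, a=2.0, b=0.5):
        # Monotone, Kumaraswamy-CDF-style warp of inputs in [0, 1]; kernel
        # distances are computed in warped space, giving nonstationarity.
        return 1.0 - (1.0 - theta**a) ** b

    def warped_rbf(t1, t2, a=2.0, b=0.5, ell=0.3, sf2=1.0):
        # RBF kernel evaluated on the warped coordinates.
        u1, u2 = warp(t1, a, b), warp(t2, a, b)
        d2 = ((u1[:, None, :] - u2[None, :, :]) ** 2).sum(-1)
        return sf2 * np.exp(-0.5 * d2 / ell**2)

    # Toy usage on 1-D parameters in [0, 1]:
    t = np.linspace(0, 1, 5)[:, None]
    print(warped_rbf(t, t))
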
Likelihood-Free Inference in State-Space Models with Unknown Dynamics
TLDR
This work improves upon existing LFI methods for the inference task while also accurately learning the transition dynamics, using a multi-output Gaussian process for state inference and a Bayesian neural network to model the transition dynamics for state prediction.
Benchmarking Simulation-Based Inference
TLDR
This work provides a benchmark with inference tasks and suitable performance metrics for ‘likelihood-free’ inference algorithms, with an initial selection of algorithms including recent approaches employing neural networks and classical Approximate Bayesian Computation methods.
Composite Surrogate for Likelihood-Free Bayesian Optimisation in High-Dimensional Settings of Activity-Based Transportation Models (Vladimir Kuzmanovski and Jaakko Hollmen)
TLDR
This study adopts the Bayesian Optimisation for Likelihood-free Inference (BOLFI) method for calibrating the Preday activity-based model to a new urban area, using a composite surrogate model that combines a Random Forest surrogate for modelling the discrepancy with a Gaussian Mixture Model for estimating its density.
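
A minimal sketch of such a composite surrogate, assuming scikit-learn and a toy discrepancy in place of the Preday simulator outputs: a random forest models the discrepancy surface, and a Gaussian mixture estimates the density of low-discrepancy parameters. The thresholding rule is illustrative; the paper's exact coupling of the two components may differ.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    # Toy stand-ins: parameter draws and a discrepancy between simulated and
    # observed outputs (in the paper these come from the Preday model).
    thetas = rng.uniform(-2, 2, size=(300, 5))
    discrepancy = np.linalg.norm(thetas - 0.5, axis=1) + 0.1 * rng.standard_normal(300)

    # Random forest surrogate for the discrepancy surface.
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(thetas, discrepancy)

    # Gaussian mixture estimate of the density of low-discrepancy parameters,
    # serving as an approximate posterior over the calibrated inputs.
    accepted = thetas[rf.predict(thetas) < np.quantile(discrepancy, 0.1)]
    gmm = GaussianMixture(n_components=3, random_state=0).fit(accepted)
    print(gmm.score_samples(thetas[:5]))  # log-density at a few parameters
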

References

Showing 1-10 of 88 references
Inference in Deep Gaussian Processes using Stochastic Gradient Hamiltonian Monte Carlo
TLDR
This work provides evidence for the non-Gaussian nature of the posterior and applies the Stochastic Gradient Hamiltonian Monte Carlo method to generate samples, which results in significantly better predictions at a lower computational cost than its VI counterpart.
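
For reference, a generic SGHMC update (Chen et al., 2014) looks as follows; applying it to deep GP posteriors additionally requires the model's stochastic gradients, which are omitted here. stoch_grad_U stands for a minibatch gradient of the negative log posterior, and the step sizes are placeholders.

    import numpy as np

    def sghmc(theta0, stoch_grad_U, n_steps=1000, eta=1e-4, alpha=0.05, rng=None):
        # SGHMC: a momentum update with friction `alpha` plus injected noise
        # whose scale balances the friction term.
        rng = rng or np.random.default_rng()
        theta = theta0.astype(float).copy()
        v = np.zeros_like(theta)
        samples = []
        for _ in range(n_steps):
            v = (1 - alpha) * v - eta * stoch_grad_U(theta) \
                + np.sqrt(2 * alpha * eta) * rng.standard_normal(theta.shape)
            theta = theta + v
            samples.append(theta.copy())
        return np.array(samples)

    # Toy usage: the gradient of U(theta) = theta**2 / 2 targets a standard normal.
    draws = sghmc(np.zeros(1), lambda th: th, n_steps=5000, eta=1e-2)
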
Doubly Stochastic Variational Inference for Deep Gaussian Processes
TLDR
This work presents a doubly stochastic variational inference algorithm, which does not force independence between layers in Deep Gaussian processes, and provides strong empirical evidence that the inference scheme for DGPs works well in practice in both classification and regression.
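
The key mechanism, sampling each layer's Gaussian marginal conditioned on the previous layer's sample rather than assuming independence between layers, can be sketched as below. The toy mean and variance functions are stand-ins for the sparse variational GP marginals used in the paper.

    import numpy as np

    def dsvi_forward(x, layers, rng):
        # Propagate one sample through the hierarchy: each layer's output is
        # drawn from its Gaussian marginal conditioned on the previous
        # layer's *sampled* value, so independence is never forced.
        h = x
        for mean_fn, var_fn in layers:
            h = mean_fn(h) + np.sqrt(var_fn(h)) * rng.standard_normal(h.shape)
        return h

    # Toy two-layer stand-in; in the paper the marginals come from sparse
    # variational GPs with inducing points.
    layers = [(np.sin, lambda h: 0.01 * np.ones_like(h))] * 2
    rng = np.random.default_rng(0)
    preds = np.stack([dsvi_forward(np.linspace(-2, 2, 50), layers, rng)
                      for _ in range(100)])  # Monte Carlo predictive samples
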
Fast likelihood-free cosmology with neural density estimators and active learning
TLDR
Neural density estimators (NDEs) are used to learn the likelihood function from a set of simulated datasets, with active learning to adaptively acquire simulations in the most relevant regions of parameter space on the fly; the approach is demonstrated on a number of cosmological case studies.
Automatic Posterior Transformation for Likelihood-Free Inference
TLDR
Automatic posterior transformation (APT) is presented, a new sequential neural posterior estimation method for simulation-based inference that can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators.
Deep Gaussian Processes with Importance-Weighted Variational Inference
TLDR
This work proposes a novel importance-weighted objective, which leverages analytic results and provides a mechanism to trade off computation for improved accuracy, and demonstrates that the importance-weighted objective works well in practice and consistently outperforms classical variational inference, especially for deeper models.
Bayesian Learning of Conditional Kernel Mean Embeddings for Automatic Likelihood-Free Inference
TLDR
KELFI is presented, a holistic framework that automatically learns model hyperparameters to improve inference accuracy given a limited simulation budget, and demonstrates improved accuracy and efficiency on challenging inference problems in ecology.
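
As a rough sketch of the kernel-mean-embedding idea behind such surrogates, the following regresses the kernel feature map of simulated summaries onto the parameters and evaluates the embedding at the observed summary. This is a generic conditional mean embedding estimator, not KELFI's exact procedure, and the lengthscales and regulariser are placeholders.

    import numpy as np

    def rbf(a, b, ell):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ell**2)

    def cme_likelihood_surrogate(thetas, sims, s_obs, theta_query,
                                 ell_t=0.5, ell_s=0.5, lam=1e-3):
        # Conditional kernel mean embedding: kernel-ridge weights over the
        # simulated parameters, evaluated against the observed summary.
        n = len(thetas)
        alpha = np.linalg.solve(rbf(thetas, thetas, ell_t) + n * lam * np.eye(n),
                                rbf(thetas, theta_query, ell_t))
        return alpha.T @ rbf(sims, s_obs[None, :], ell_s)[:, 0]
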
Gaussian Process Conditional Density Estimation
TLDR
This work proposes to extend the model's input with latent variables and use Gaussian processes to map this augmented input onto samples from the conditional distribution, and illustrates the effectiveness and wide-reaching applicability of the model on a variety of real-world problems.
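
A toy illustration of the latent-augmentation mechanism, shown here under the GP prior rather than a fitted posterior: augmenting a fixed conditioning input x with latent draws w and passing (x, w) through a single GP function sample turns the latent variation into samples from the induced conditional distribution p(y | x).

    import numpy as np

    rng = np.random.default_rng(0)

    def rbf(a, b, ell=0.7):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ell**2)

    # Many copies of one conditioning input, each paired with a latent draw.
    x = np.full((400, 1), 0.5)
    w = rng.standard_normal((400, 1))
    z = np.concatenate([x, w], axis=1)

    # One GP prior function draw over the augmented inputs.
    K = rbf(z, z) + 1e-6 * np.eye(len(z))
    y = np.linalg.cholesky(K) @ rng.standard_normal(len(z))
    # The empirical spread of y approximates p(y | x = 0.5) under the prior.
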
Bayesian Synthetic Likelihood
TLDR
The accuracy and computational efficiency of the Bayesian version of the synthetic likelihood (BSL) approach are explored in comparison to a competitor known as approximate Bayesian computation (ABC), as is its sensitivity to tuning parameters and assumptions.
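
A minimal sketch of the synthetic likelihood itself: simulate summary statistics at theta, fit a Gaussian to them, and score the observed summaries under it. The jitter term and the toy simulator are assumptions for illustration.

    import numpy as np

    def synthetic_loglik(s_obs, simulate, theta, n_sim=200, rng=None):
        # Fit a Gaussian to n_sim simulated summary-statistic vectors at
        # theta and evaluate the observed summaries under it.
        rng = rng or np.random.default_rng()
        S = np.array([simulate(theta, rng) for _ in range(n_sim)])
        mu = S.mean(axis=0)
        cov = np.cov(S, rowvar=False) + 1e-9 * np.eye(S.shape[1])
        diff = s_obs - mu
        _, logdet = np.linalg.slogdet(cov)
        return -0.5 * (diff @ np.linalg.solve(cov, diff)
                       + logdet + len(s_obs) * np.log(2 * np.pi))

    # Toy usage: summaries are the mean and std of 50 draws from N(theta, 1).
    sim = lambda th, r: (lambda x: np.array([x.mean(), x.std()]))(r.normal(th, 1.0, 50))
    print(synthetic_loglik(np.array([0.1, 1.0]), sim, theta=0.0))
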
Deep Gaussian Processes
TLDR
Deep Gaussian process (GP) models are introduced, and model selection by the variational bound shows that a five-layer hierarchy is justified even when modelling a digit dataset containing only 150 examples.
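
A deep GP prior can be sketched by composing independent GP function draws, feeding each layer's output into the next; the RBF kernel, depth, and jitter below are arbitrary choices for illustration.

    import numpy as np

    def gp_sample(x, ell=1.0, sf2=1.0, jitter=1e-6, rng=None):
        # One zero-mean GP function draw with an RBF kernel on inputs x.
        rng = rng or np.random.default_rng()
        d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
        K = sf2 * np.exp(-0.5 * d2 / ell**2) + jitter * np.eye(len(x))
        return np.linalg.cholesky(K) @ rng.standard_normal(len(x))

    # A depth-3 deep GP prior draw: each layer's output feeds the next layer.
    rng = np.random.default_rng(0)
    h = np.linspace(-3, 3, 200)[:, None]
    for _ in range(3):
        h = gp_sample(h, rng=rng)[:, None]
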
Scalable Bayesian Optimization Using Deep Neural Networks
TLDR
This work shows that performing adaptive basis function regression with a neural network as the parametric form performs competitively with state-of-the-art GP-based approaches but scales linearly with the number of data points rather than cubically, which allows for a previously intractable degree of parallelism.
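
A sketch of the underlying adaptive-basis idea: with fixed features (here random tanh features standing in as a hypothetical trained network's last hidden layer), the Bayesian linear regression posterior is closed form, costing O(n D^2) in the number of points n and feature dimension D rather than the O(n^3) of an exact GP. Names and hyperparameters are illustrative.

    import numpy as np

    def blr_fit(Phi, y, alpha=1.0, beta=25.0):
        # Weight posterior N(m, A^{-1}) for Bayesian linear regression on
        # fixed basis functions Phi, with prior precision alpha and noise
        # precision beta.
        D = Phi.shape[1]
        A = alpha * np.eye(D) + beta * Phi.T @ Phi
        m = beta * np.linalg.solve(A, Phi.T @ y)
        return m, A

    def blr_predict(Phi_star, m, A, beta=25.0):
        mean = Phi_star @ m
        var = 1.0 / beta + np.einsum('nd,nd->n', Phi_star,
                                     np.linalg.solve(A, Phi_star.T).T)
        return mean, var

    # Toy usage with random features standing in for a trained network:
    rng = np.random.default_rng(0)
    X, y = rng.uniform(-1, 1, (200, 3)), rng.standard_normal(200)
    W = rng.standard_normal((3, 50))
    m, A = blr_fit(np.tanh(X @ W), y)
    mu, var = blr_predict(np.tanh(X[:5] @ W), m, A)
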
...