Bayesian Computing with INLA: A Review

@article{Rue2016BayesianCW,
  title={Bayesian Computing with {INLA}: A Review},
  author={H{\aa}vard Rue and Andrea Riebler and Sigrunn H. S{\o}rbye and Janine B. Illian and Daniel P. Simpson and Finn Lindgren},
  journal={Annual Review of Statistics and Its Application},
  volume={4},
  year={2017}
}
The key operation in Bayesian inference is to compute high-dimensional integrals. An old approximate technique is the Laplace method or approximation, which dates back to Pierre-Simon Laplace (1774). This simple idea approximates the integrand with a second-order Taylor expansion around the mode and computes the integral analytically. By developing a nested version of this classical idea, combined with modern numerical techniques for sparse matrices, we obtain the approach of integrated nested…
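The classical Laplace method described in the abstract fits in a few lines of code: expand the log-integrand to second order around its mode and integrate the resulting Gaussian analytically. The sketch below is illustrative, not code from the paper; it checks the approximation against the Gamma-function integral n! = ∫₀^∞ xⁿ e⁻ˣ dx, for which the Laplace method reproduces Stirling's formula.

```python
import math

def laplace_integral(log_f, mode, h=1e-4):
    """Approximate the integral of f(x) dx by a second-order Taylor expansion
    of log f around its mode: f(x*) * sqrt(2*pi / |(log f)''(x*)|)."""
    # central finite difference for the second derivative of log f at the mode
    d2 = (log_f(mode + h) - 2 * log_f(mode) + log_f(mode - h)) / h**2
    return math.exp(log_f(mode)) * math.sqrt(2 * math.pi / abs(d2))

# Example: n! = integral of x^n e^{-x} over (0, inf); the log-integrand
# n*log(x) - x has its mode at x = n, and Laplace gives Stirling's formula.
n = 10
approx = laplace_integral(lambda x: n * math.log(x) - x, mode=n)
exact = math.factorial(n)
print(approx / exact)  # ≈ 0.99: within about 1% already for n = 10
```

The relative error of the basic Laplace approximation here is roughly 1/(12n), which is why the nested, corrected version developed in the paper is needed for accurate posterior marginals.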


Integrated Nested Laplace Approximations (INLA)
Outlines the theory behind INLA, presents the R-INLA package, and describes new developments combining INLA with MCMC for models that cannot be fitted with R-INLA.
Bayesian model averaging with the integrated nested Laplace approximation
Reviews the use of Bayesian model averaging (BMA) with INLA and proposes a new example on spatial econometrics models, showing how INLA can be extended by means of BMA to increase the number of conditional latent GMRF models it can fit.
Multivariate posterior inference for spatial models with the integrated nested Laplace approximation
Describes how to use the integrated nested Laplace approximation within the Metropolis–Hastings algorithm to fit complex spatial models and to estimate the joint posterior distribution of a small number of parameters.
Markov chain Monte Carlo with the Integrated Nested Laplace Approximation
Presents a novel approach that combines INLA and Markov chain Monte Carlo (MCMC); it can be used to fit models with Laplace priors in a Bayesian Lasso model, impute missing covariates in linear models, fit spatial econometrics models with complex nonlinear terms in the linear predictor, and classify data with mixture models.
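The INLA-within-MCMC idea summarized above can be sketched in miniature: run Metropolis–Hastings over a low-dimensional hyperparameter, and at each step call a routine that returns the marginal likelihood with the latent field integrated out. In R-INLA that marginal comes from an INLA fit; in the toy conjugate Gaussian model below (the model, parameter values, and function names are all illustrative stand-ins, not the R-INLA API) it is available in closed form.

```python
import math
import random

def log_marginal(z, y):
    # The "INLA step" in miniature: for y ~ N(mu, 1) with latent
    # mu ~ N(0, exp(-z)), integrating mu out gives y ~ N(0, 1 + exp(-z)).
    var = 1.0 + math.exp(-z)
    return -0.5 * (math.log(2 * math.pi * var) + y * y / var)

def log_prior(z):
    # hyperprior z ~ N(0, 2^2), up to an additive constant
    return -0.5 * z * z / 4.0

def mh_over_hyperparameter(y, n_iter=5000, step=0.8, seed=0):
    """Random-walk Metropolis-Hastings targeting pi(z | y) proportional to
    pi(y | z) * pi(z), with the latent field already integrated out."""
    rng = random.Random(seed)
    z = 0.0
    lp = log_marginal(z, y) + log_prior(z)
    samples = []
    for _ in range(n_iter):
        z_new = z + rng.gauss(0.0, step)          # random-walk proposal
        lp_new = log_marginal(z_new, y) + log_prior(z_new)
        if math.log(rng.random()) < lp_new - lp:  # accept/reject
            z, lp = z_new, lp_new
        samples.append(z)
    return samples

samples = mh_over_hyperparameter(y=0.3)
```

The point of the construction is that MCMC only ever explores the small conditioning set, while everything conditionally Gaussian is handled by the (here analytic, in practice approximate) marginalization.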
Latent Gaussian modeling and INLA: A review with focus on space-time applications
  • T. Opitz
  • Mathematics, Computer Science
  • 2017
Reviews the principal theoretical concepts, model classes and inference tools within the INLA framework, and presents a comprehensive simulation experiment using simulated non-Gaussian space-time count data with a first-order autoregressive dependence structure in time.
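Count data with first-order autoregressive latent dependence, of the kind used in the simulation experiment just described, can be generated in a few lines. The parameter names and values below are illustrative choices, not those of the cited paper:

```python
import math
import random

def simulate_ar1_poisson(n=200, rho=0.8, sigma=0.5, intercept=1.0, seed=1):
    """Poisson counts driven by a latent Gaussian AR(1) process:
    x_t = rho * x_{t-1} + eps_t,  y_t ~ Poisson(exp(intercept + x_t))."""
    rng = random.Random(seed)
    # start from the stationary distribution of the AR(1) process
    x = [rng.gauss(0.0, sigma / math.sqrt(1.0 - rho ** 2))]
    for _ in range(n - 1):
        x.append(rho * x[-1] + rng.gauss(0.0, sigma))

    def poisson(lam):
        # Knuth's multiplication algorithm; fine for the small rates here
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                return k
            k += 1

    y = [poisson(math.exp(intercept + xt)) for xt in x]
    return x, y

x, y = simulate_ar1_poisson()
```

This is exactly the latent Gaussian model structure INLA targets: a Gaussian (here AR(1)) latent field observed through a non-Gaussian (Poisson, log-link) likelihood.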
Practical bounds on the error of Bayesian posterior approximations: A nonasymptotic approach
Develops a flexible new approach to bounding the error of mean and uncertainty estimates from scalable inference algorithms, and demonstrates the usefulness of the Fisher distance approach by deriving bounds on the Wasserstein error of the Laplace approximation and Hilbert coresets.
Hamiltonian Monte Carlo using an embedded Laplace approximation
Latent Gaussian models are a popular class of hierarchical models with applications in many fields. Performing Bayesian inference on such models can be challenging. Markov chain Monte Carlo…
Statistical computation with kernels
Analyzes a well-known algorithm for numerical integration called Bayesian quadrature, providing consistency and contraction rates, and studies two minimum-distance estimators derived from kernel-based statistical divergences which can be used for unnormalised and generative models.
Laplace approximation for fast Bayesian inference in generalized additive models based on penalized regression splines
Generalized additive models (GAMs) are a well-established statistical tool for modeling complex nonlinear relationships between covariates and a response assumed to have a conditional distribution in…
Max-and-Smooth: A Two-Step Approach for Approximate Bayesian Inference in Latent Gaussian Models
Introduces Max-and-Smooth, an approximate Bayesian inference scheme for a flexible class of latent Gaussian models (LGMs) in which one or more of the likelihood parameters are modeled by latent additive Gaussian processes.

References

Showing 1–10 of 133 references.
Laplace Expansions in Markov Chain Monte Carlo Algorithms
Complex hierarchical models lead to a complicated likelihood and then, in a Bayesian analysis, to complicated posterior distributions. To obtain Bayes estimates such as the posterior mean or Bayesian…
Bayesian computing with INLA: New features
The INLA approach for approximate Bayesian inference for latent Gaussian models has been shown to give fast and accurate estimates of posterior marginals and also to be a valuable tool in practice…
Approximate Bayesian Inference for Latent Gaussian Models
This thesis consists of five papers, presented in chronological order. Their content is summarised in this section. Paper I introduces the approximation tool for latent GMRF models and discusses, in…
Spatial Data Analysis with R-INLA with Some Extensions
The integrated nested Laplace approximation (INLA) provides an interesting way of approximating the posterior marginals of a wide range of Bayesian hierarchical models. This approximation is based on…
Improving the INLA approach for approximate Bayesian inference for latent Gaussian models
We introduce a new copula-based correction for generalized linear mixed models (GLMMs) within the integrated nested Laplace approximation (INLA) approach for approximate Bayesian inference for latent…
Bayesian analysis of measurement error models using integrated nested Laplace approximations
To account for measurement error (ME) in explanatory variables, Bayesian approaches provide a flexible framework, as expert knowledge can be incorporated in…
Bayesian bivariate meta-analysis of diagnostic test studies using integrated nested Laplace approximations
Presents a comparison of a new Bayesian deterministic inference approach for latent Gaussian models using integrated nested Laplace approximations (INLA); the results indicate that INLA is more stable and gives generally better coverage probabilities for the pooled estimates and less biased estimates of variance parameters.
Asymptotic normality of posterior distributions for generalized linear mixed models
Establishes the asymptotic normality of the joint posterior distribution of the parameters and the random effects in a GLMM by using Stein's Identity, and illustrates that the approximate normal distribution performs reasonably using both real and simulated data.
Accurate Approximations for Posterior Moments and Marginal Densities
This article describes approximations to the posterior means and variances of positive functions of a real or vector-valued parameter, and to the marginal posterior densities of arbitrary…
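This is the Tierney–Kadane technique: a posterior expectation E[g(θ) | y] is approximated by the ratio of two Laplace approximations, one for the numerator integral ∫ g(θ) exp(L(θ)) dθ and one for the normalizing constant ∫ exp(L(θ)) dθ, where L is the unnormalized log-posterior. A minimal one-dimensional sketch follows; the grid-based mode search and the toy Gamma posterior are illustrative choices, not the paper's own implementation.

```python
import math

def laplace_mode_sd(log_f, lo=1e-3, hi=20.0, n=20001, h=1e-4):
    """Crude mode and curvature of log_f on [lo, hi]: grid argmax,
    then a finite-difference second derivative at the mode."""
    xs = [lo + i * (hi - lo) / (n - 1) for i in range(n)]
    mode = max(xs, key=log_f)
    d2 = (log_f(mode + h) - 2 * log_f(mode) + log_f(mode - h)) / h**2
    return mode, math.sqrt(-1.0 / d2)  # d2 < 0 at an interior mode

def tierney_kadane_mean(log_post, g):
    """Ratio-of-Laplace approximation to E[g(theta) | y] for positive g."""
    m0, s0 = laplace_mode_sd(log_post)                        # denominator
    num = lambda t: log_post(t) + math.log(g(t))              # numerator
    m1, s1 = laplace_mode_sd(num)
    return (s1 / s0) * math.exp(num(m1) - log_post(m0))

# Toy check: unnormalized Gamma(a, b) log-posterior, exact E[theta] = a / b.
a, b = 5.0, 2.0
approx = tierney_kadane_mean(lambda t: (a - 1) * math.log(t) - b * t,
                             lambda t: t)
print(approx)  # ≈ 2.51 versus the exact value 2.5
```

The leading error terms of the two Laplace approximations largely cancel in the ratio, which is why this simple construction attains second-order accuracy and underlies the accuracy of INLA-style approximations.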
On asymptotic validity of approximate likelihood inference
Many statistical models have likelihoods which are intractable: it is impossible or too expensive to compute the likelihood exactly. In such settings, it is common to replace the likelihood with an…