Automatic approximation of the marginal likelihood in non-Gaussian hierarchical models

@article{Skaug2006AutomaticAO,
  title={Automatic approximation of the marginal likelihood in non-Gaussian hierarchical models},
  author={Hans Julius Skaug and David A. Fournier},
  journal={Computational Statistics \& Data Analysis},
  year={2006},
  volume={51},
  pages={699--709}
}
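For readers skimming this page, the quantity the papers below all manipulate is the same one (standard notation, not quoted from the paper itself): with data $y$, random effects $u \in \mathbb{R}^d$, and parameters $\theta$, the marginal likelihood and its Laplace approximation are

```latex
L(\theta) = \int_{\mathbb{R}^d} f(y, u; \theta)\, du
  \;\approx\; (2\pi)^{d/2}\,\bigl|H(\hat u, \theta)\bigr|^{-1/2}\, f(y, \hat u; \theta),
\qquad
\hat u = \arg\max_u \log f(y, u; \theta),
\qquad
H = -\left.\frac{\partial^2 \log f(y, u; \theta)}{\partial u\,\partial u^{\mathsf T}}\right|_{u = \hat u}.
```

The "automatic" in the title refers to obtaining $\hat u$ and $H$ by automatic differentiation of a user-coded joint log-density, so no model-specific derivatives need to be written by hand.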

Citations

Greater Than the Sum of its Parts: Computationally Flexible Bayesian Hierarchical Modeling
We propose a multistage method for making inference at all levels of a Bayesian hierarchical model (BHM), using natural data partitions to increase efficiency by allowing computations to take place in …
A flexible and automated likelihood based framework for inference in stochastic volatility models
AD Model Builder: using automatic differentiation for statistical inference of highly parameterized complex nonlinear models
The basic components and the underlying philosophy of ADMB are described, with an emphasis on functionality found in no other statistical software, and the main advantages are flexibility, speed, precision, stability and built-in methods to quantify uncertainty.
Spherical radial approximation for nested mixed effects models
A new quadrature approximation method is proposed, which is based on the spherical radial integration approach of Monahan and Genz and takes advantage of the hierarchical structure of the integration, and is applied to estimation of GAMMs.
Automatic differentiation and Laplace approximation
TMB is an open-source R package that enables quick implementation of complex nonlinear random-effects (latent-variable) models in a manner similar to the established AD Model Builder package (ADMB).
Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations
This work considers approximate Bayesian inference in a popular subset of structured additive regression models, latent Gaussian models, where the latent field is Gaussian, controlled by a few hyperparameters and with non-Gaussian response variables, and can directly compute very accurate approximations to the posterior marginals.
Automated Likelihood Based Inference for Stochastic Volatility
It is found that the new methods match the statistical efficiency of the existing classical methods and substantially reduce the simulation inefficiency in some existing Bayesian Markov chain Monte Carlo algorithms.
Approximate Posterior Inference for Multiple Testing using a Hierarchical Mixed-effect Poisson Regression Model
We present an approximate posterior inference methodology for a Bayesian hierarchical mixed-effect Poisson regression model. The model serves to address the multiple testing problem in the …
...
...
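Several of the entries above revolve around the same Laplace trick. The following is a minimal, self-contained sketch of the idea (a toy Poisson model with one Gaussian random effect, purely illustrative and not the interface of ADMB, TMB, or any other package listed here), comparing the Laplace approximation of the marginal likelihood against brute-force quadrature:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

# Toy model with one Gaussian random effect:
#   y | u ~ Poisson(exp(beta + u)),  u ~ N(0, sigma^2)
# The marginal likelihood integrates the random effect out:
#   L = integral of f(y | u) * phi(u) du
y, beta, sigma = 3, 0.5, 0.8

def log_joint(u):
    """Joint log-density log f(y, u), up to the constant -log(y!)."""
    lam = np.exp(beta + u)
    log_poisson = y * np.log(lam) - lam
    log_prior = -0.5 * (u / sigma) ** 2 - 0.5 * np.log(2 * np.pi * sigma**2)
    return log_poisson + log_prior

# Reference value: numerical quadrature over the random effect.
exact, _ = quad(lambda u: np.exp(log_joint(u)), -10, 10)

# Laplace approximation: second-order expansion of log_joint at its mode.
res = minimize_scalar(lambda u: -log_joint(u), bounds=(-10, 10), method="bounded")
u_hat = res.x
h = 1e-4  # central finite difference for the (negative) second derivative
hess = -(log_joint(u_hat + h) - 2 * log_joint(u_hat) + log_joint(u_hat - h)) / h**2
laplace = np.sqrt(2 * np.pi / hess) * np.exp(log_joint(u_hat))

print(f"quadrature: {exact:.6f}  laplace: {laplace:.6f}")
```

For this log-concave one-dimensional integrand the two values agree closely; the papers above are about making the same construction work automatically (derivatives via automatic differentiation rather than finite differences) and reliably in high-dimensional, non-Gaussian settings.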

References

Showing 1–10 of 28 references
Automatic Differentiation to Facilitate Maximum Likelihood Estimation in Nonlinear Random Effects Models
Maximum likelihood estimation in random effects models for non-Gaussian data is a computationally challenging task that currently receives much attention. This article shows that the estimation
Maximum Likelihood for Generalized Linear Models with Nested Random Effects via High-Order, Multivariate Laplace Approximation
A comparison with approximations based on penalized quasi-likelihood, Gauss-Hermite quadrature, and adaptive Gauss-Hermite quadrature reveals that, for the hierarchical logistic regression model under the simulated conditions, the sixth-order Laplace approach is remarkably accurate and computationally fast.
Pointwise and functional approximations in Monte Carlo maximum likelihood estimation
The marginal likelihood is compared with the recently proposed hierarchical likelihood, which avoids integration altogether; the methods are applied to fit a latent process model to a set of polio incidence data.
Laplace Importance Sampling for Generalized Linear Mixed Models
It is well known that the standard Laplace approximation of the integrated marginal likelihood function of a random effects model may be invalid if the dimension of the integral increases with the …
Accurate Approximations for Posterior Moments and Marginal Densities
These approximations to the posterior means and variances of positive functions of a real or vector-valued parameter, and to the marginal posterior densities of arbitrary parameters, can also be used to compute approximate predictive densities.
Approximating the marginal likelihood estimate for models with random parameters
B. Bell, Appl. Math. Comput., 2001
Negative binomial loglinear mixed models
The Poisson loglinear model is a common choice for explaining variability in counts. However, in many practical circumstances the restriction that the mean and variance are equal is not realistic.
BUGS for a Bayesian Analysis of Stochastic Volatility Models
The main purpose is to illustrate the ease with which the Bayesian stochastic volatility model can now be studied routinely via BUGS (Bayesian Inference Using Gibbs Sampling), a recently developed, user-friendly, and freely available software package.
Bias correction in generalised linear mixed models with a single component of dispersion
General expressions are derived for the asymptotic biases in three approximate estimators of the regression coefficients and variance component, for small values of the variance component, in …
In All Likelihood: Statistical Modelling and Inference Using Likelihood
This paper presents a meta-modelling framework for estimating the likelihood of random parameters in a discrete-time environment and describes its use in simple and complex models.
...
...