Corpus ID: 232290490

Scalable computation for Bayesian hierarchical models

@inproceedings{Papaspiliopoulos2021ScalableCF,
  title={Scalable computation for Bayesian hierarchical models},
  author={Omiros Papaspiliopoulos and Timoth{\'e}e Stumpf-F{\'e}tizon and Giacomo Zanella},
  year={2021}
}
The article concerns algorithms for learning Bayesian hierarchical models whose computational complexity scales linearly with both the number of observations and the number of parameters in the model. It focuses on crossed random effects and nested multilevel models, which are used ubiquitously in the applied sciences, and illustrates the methodology on two challenging real data analyses, on predicting electoral results and real estate prices respectively. The posterior dependence in both classes…
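As a point of reference for the kind of algorithm the article scales up, here is a minimal (uncollapsed) Gibbs sampler for a two-factor crossed random effects model with known variances. It is a sketch, not the authors' method: its purpose is to make the model and its conditional updates concrete, and naive samplers of exactly this type are what such work shows can mix poorly as the data table grows.

```python
# Minimal plain Gibbs sampler for a two-factor crossed random effects model:
#   y[i, j] = mu + a[i] + b[j] + eps,  eps ~ N(0, s2e),
#   a[i] ~ N(0, s2a),  b[j] ~ N(0, s2b),  flat prior on mu.
# Sketch for a fully observed I x J table; NOT the paper's scalable algorithm.
import numpy as np

def gibbs_crossed(y, s2e=1.0, s2a=1.0, s2b=1.0, n_iter=2000, seed=0):
    rng = np.random.default_rng(seed)
    I, J = y.shape
    mu, a, b = 0.0, np.zeros(I), np.zeros(J)
    draws = np.empty(n_iter)
    for t in range(n_iter):
        # mu | a, b, y  (flat prior on mu)
        r = y - a[:, None] - b[None, :]
        mu = rng.normal(r.mean(), np.sqrt(s2e / y.size))
        # a_i | mu, b, y  (conjugate Gaussian update, one pass per factor)
        prec_a = J / s2e + 1.0 / s2a
        mean_a = (y - mu - b[None, :]).sum(axis=1) / s2e / prec_a
        a = rng.normal(mean_a, np.sqrt(1.0 / prec_a))
        # b_j | mu, a, y
        prec_b = I / s2e + 1.0 / s2b
        mean_b = (y - mu - a[:, None]).sum(axis=0) / s2e / prec_b
        b = rng.normal(mean_b, np.sqrt(1.0 / prec_b))
        draws[t] = mu
    return draws

# Example on synthetic data
rng = np.random.default_rng(1)
I, J = 30, 40
y = 2.0 + rng.normal(0, 1, I)[:, None] + rng.normal(0, 1, J)[None, :] \
    + rng.normal(0, 1, (I, J))
print(gibbs_crossed(y)[1000:].mean())  # posterior mean of mu, near 2.0
```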

Citations

Exact Convergence Analysis for Metropolis-Hastings Independence Samplers in Wasserstein Distances

Under mild assumptions, it is shown that the exact convergence rate of the Metropolis-Hastings independence sampler in total variation is also the exact rate in weaker Wasserstein distances, and that the rate can be upper bounded in Bayesian binary response regression when the sample size and dimension grow together.
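For concreteness, a minimal Metropolis-Hastings independence sampler on a one-dimensional toy target (not the Bayesian binary response regression setting of the paper); the target and proposal below are assumptions, chosen so the importance weight pi/q is bounded and the sampler is uniformly ergodic.

```python
# Metropolis-Hastings independence sampler: proposals are drawn from a
# fixed density q, independent of the current state, and accepted with
# probability min(1, w(y)/w(x)) where w = pi/q.
import numpy as np
from scipy import stats

def mh_independence(log_pi, proposal, n_iter=10000, seed=0):
    rng = np.random.default_rng(seed)
    x = proposal.rvs(random_state=rng)
    lw_x = log_pi(x) - proposal.logpdf(x)        # log importance weight
    chain = np.empty(n_iter)
    for t in range(n_iter):
        y = proposal.rvs(random_state=rng)
        lw_y = log_pi(y) - proposal.logpdf(y)
        if np.log(rng.uniform()) < lw_y - lw_x:  # accept w.p. min(1, w_y/w_x)
            x, lw_x = y, lw_y
        chain[t] = x
    return chain

# Target: standard normal; proposal: heavier-tailed t, so pi/q is bounded.
chain = mh_independence(stats.norm().logpdf, stats.t(df=3))
print(chain.mean(), chain.var())  # near 0 and 1
```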

Convergence rate of a collapsed Gibbs sampler for crossed random effects models

The convergence rate of a collapsed Gibbs sampler for crossed random effects models is analyzed; the analysis applies to a substantially larger range of models than previous works, including models that incorporate a missingness mechanism and unbalanced-level data.

References


Multilevel Linear Models, Gibbs Samplers and Multigrid Decompositions

A multigrid approach is used to derive analytic expressions for the convergence rates of the Gibbs sampler for various widely used model structures, including nested and crossed random effects, whereas most previous work was limited to the two-level nested case.
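As a classical reference point for such rate calculations (not the multigrid derivation itself), recall the two-block Gaussian case:

```latex
% Classical two-block result (e.g. Roberts & Sahu, 1997), stated here
% only to fix ideas; the paper's multigrid analysis covers richer
% nested and crossed structures. For a Gaussian target split into two
% blocks (x_1, x_2), the deterministic-scan Gibbs sampler converges
% geometrically with rate equal to the squared maximal correlation
% between the blocks,
\[
  \text{rate} \;=\; \rho^2,
  \qquad
  \rho \;=\; \max_{f,\,g}\ \operatorname{Corr}\bigl(f(x_1),\, g(x_2)\bigr),
\]
% so in a bivariate normal with correlation \rho the autocorrelation
% of the Gibbs chain decays as \rho^{2t}.
```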

Scalable inference for crossed random effects models

It is demonstrated empirically that the collapsed Gibbs sampler, extended to sample precision hyperparameters, significantly outperforms alternative state-of-the-art algorithms.

Fast and Accurate Estimation of Non-Nested Binomial Hierarchical Models Using Variational Inference

A new mean-field algorithm for estimating logistic hierarchical models with an arbitrary number of non-nested random effects is proposed, together with "marginally augmented variational Bayes" (MAVB), a post-processing step that further improves the initial approximation.
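The following sketch shows the mean-field coordinate-ascent mechanics on the conjugate Normal-Gamma model; it is a standard textbook example (Bishop, PRML Sec. 10.1.3), not the paper's logistic non-nested algorithm or its MAVB post-processing step.

```python
# CAVI for x_i ~ N(mu, 1/tau), mu | tau ~ N(mu0, 1/(lam0*tau)),
# tau ~ Gamma(a0, b0), with factorized approximation q(mu) q(tau).
import numpy as np

def cavi_normal_gamma(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, n_iter=50):
    N, xbar = len(x), x.mean()
    E_tau = a0 / b0                      # initial guess for E_q[tau]
    aN = a0 + (N + 1) / 2                # fixed across iterations
    for _ in range(n_iter):
        # q(mu) = Normal(muN, 1/lamN)
        muN = (lam0 * mu0 + N * xbar) / (lam0 + N)
        lamN = (lam0 + N) * E_tau
        E_mu, E_mu2 = muN, muN**2 + 1.0 / lamN
        # q(tau) = Gamma(aN, bN), using E_q[(x_i - mu)^2], E_q[(mu - mu0)^2]
        bN = b0 + 0.5 * ((x**2).sum() - 2 * N * xbar * E_mu + N * E_mu2
                         + lam0 * (E_mu2 - 2 * mu0 * E_mu + mu0**2))
        E_tau = aN / bN
    return muN, lamN, aN, bN

x = np.random.default_rng(0).normal(3.0, 2.0, size=500)
muN, lamN, aN, bN = cavi_normal_gamma(x)
print("E[mu] =", muN, " E[tau] =", aN / bN)  # roughly 3.0 and 0.25
```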

On MCMC sampling in hierarchical longitudinal models

This paper constructs several MCMC algorithms that minimize the autocorrelation in samples arising from important classes of longitudinal data models. It exploits an identity used by Chib (1995) in the context of Bayes factor computation to show how the parameters in a general linear mixed model may be updated in a single block, improving convergence and producing essentially independent draws from the posterior of the parameters of interest.
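A hedged sketch of the single-block idea: in a Gaussian linear mixed model with known variance components, the fixed and random effects have a joint Gaussian conditional and can be drawn together rather than one at a time. The priors below are illustrative assumptions, not the paper's exact setup.

```python
# Block update of (b, u) in y = X b + Z u + e, e ~ N(0, s2 I), with
# assumed priors b ~ N(0, c2 I) and u ~ N(0, t2 I): the joint
# conditional is N(m, Q^{-1}) with Q = W'W/s2 + prior precision.
import numpy as np

def sample_block(X, Z, y, s2=1.0, t2=1.0, c2=100.0, rng=None):
    rng = rng or np.random.default_rng()
    W = np.hstack([X, Z])                      # joint design matrix
    p, q = X.shape[1], Z.shape[1]
    prior_prec = np.diag(np.r_[np.full(p, 1 / c2), np.full(q, 1 / t2)])
    Q = W.T @ W / s2 + prior_prec              # posterior precision
    L = np.linalg.cholesky(Q)
    m = np.linalg.solve(Q, W.T @ y / s2)       # posterior mean
    # theta = m + L^{-T} z has covariance Q^{-1}, i.e. theta ~ N(m, Q^{-1})
    theta = m + np.linalg.solve(L.T, rng.standard_normal(p + q))
    return theta[:p], theta[p:]                # (fixed, random) effects
```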

Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations

This work considers approximate Bayesian inference in a popular subset of structured additive regression models, latent Gaussian models, in which the latent field is Gaussian and controlled by a few hyperparameters while the response variables are non-Gaussian, and shows that very accurate approximations to the posterior marginals can be computed directly.
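A bare-bones Laplace approximation of a one-dimensional posterior, which is the basic ingredient that INLA nests and iterates; this toy (a Poisson likelihood with an assumed Gaussian prior on the log-rate) is not the full INLA machinery.

```python
# Laplace approximation: find the posterior mode, then approximate the
# posterior by a Gaussian whose precision is the curvature at the mode.
import numpy as np
from scipy.optimize import minimize_scalar

counts = np.array([3, 5, 2, 4, 6])     # made-up Poisson data

def neg_log_post(eta):
    # -log p(eta | y) up to a constant; rate = exp(eta), prior N(0, 10)
    loglik = np.sum(counts * eta - np.exp(eta))
    logprior = -eta**2 / (2 * 10.0)
    return -(loglik + logprior)

res = minimize_scalar(neg_log_post)    # posterior mode
eta_hat = res.x
h = 1e-5                               # numerical second derivative
curv = (neg_log_post(eta_hat + h) - 2 * neg_log_post(eta_hat)
        + neg_log_post(eta_hat - h)) / h**2
# Laplace: p(eta | y) is approximately N(eta_hat, 1/curv)
print("mode:", eta_hat, "sd:", (1 / curv) ** 0.5)
```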

Estimation and Inference for Very Large Linear Mixed Effects Models

A method of moments approach is proposed that takes account of the correlation structure and can be computed at O(N) cost. The method is very amenable to parallel computation and requires no parametric distributional assumptions, tuning parameters, or convergence diagnostics.
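A hedged one-factor illustration of moment-based variance-component estimation, using the classical balanced ANOVA estimators; the paper's method targets two crossed factors at O(N) cost, whereas this sketch only shows the moments idea.

```python
# Moment estimators for y[i, j] = mu + a[i] + e[i, j], balanced groups:
# E[MSW] = s2e and E[MSB] = s2e + n * s2a, so matching sample moments
# to these expectations gives the estimators, all at O(N) cost.
import numpy as np

def anova_moments(y):                       # y: (groups I, replicates n)
    I, n = y.shape
    gm = y.mean(axis=1)                     # group means
    msw = ((y - gm[:, None]) ** 2).sum() / (I * (n - 1))
    msb = n * ((gm - y.mean()) ** 2).sum() / (I - 1)
    s2e = msw
    s2a = max((msb - msw) / n, 0.0)         # truncate at zero
    return s2a, s2e

rng = np.random.default_rng(0)
y = rng.normal(0, 2.0, 200)[:, None] + rng.normal(0, 1.0, (200, 10))
print(anova_moments(y))   # roughly (4.0, 1.0)
```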

Finite Mixture and Markov Switching Models

This book should help newcomers to the field to understand how finite mixture and Markov switching models are formulated, what structures they imply on the data, what they could be used for, and how they are estimated.
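For orientation, a compact EM routine for a two-component Gaussian mixture, the canonical point-estimation method for the finite mixture models the book covers (the book itself emphasizes Bayesian and MCMC treatments).

```python
# EM for a two-component univariate Gaussian mixture: alternate between
# computing responsibilities (E-step) and weighted moment updates (M-step).
import numpy as np
from scipy.stats import norm

def em_gmm2(x, n_iter=200):
    w = 0.5                                   # weight of component 1
    mu = np.array([x.min(), x.max()])
    sd = np.array([x.std()] * 2)
    for _ in range(n_iter):
        # E-step: responsibility of component 1 for each point
        d0 = (1 - w) * norm.pdf(x, mu[0], sd[0])
        d1 = w * norm.pdf(x, mu[1], sd[1])
        r = d1 / (d0 + d1)
        # M-step: weighted means and standard deviations
        w = r.mean()
        mu = np.array([np.average(x, weights=1 - r),
                       np.average(x, weights=r)])
        sd = np.sqrt([np.average((x - mu[0])**2, weights=1 - r),
                      np.average((x - mu[1])**2, weights=r)])
    return w, mu, sd

rng = np.random.default_rng(2)
x = np.r_[rng.normal(-2, 1, 300), rng.normal(3, 0.5, 700)]
print(em_gmm2(x))   # roughly (0.7, [-2, 3], [1, 0.5])
```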

Divide-and-Conquer With Sequential Monte Carlo

A novel class of sequential Monte Carlo algorithms for inference in probabilistic graphical models is proposed, adopting a divide-and-conquer approach based upon an auxiliary tree-structured decomposition of the model of interest and turning the overall inferential task into a collection of recursively solved subproblems.
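A hedged sketch of the core merge step: two child particle populations targeting sub-posteriors are paired and reweighted by the factor that couples them. The toy densities and the interaction factor gamma below are assumptions; the real algorithm recurses over a tree decomposition of the model.

```python
# Merge two populations targeting pi1(x1) and pi2(x2) into one targeting
# pi(x1, x2) proportional to pi1(x1) * pi2(x2) * gamma(x1, x2).
import numpy as np

def dc_merge(parts1, parts2, log_gamma, rng):
    # pair the children's particles and reweight by the interaction term
    lw = log_gamma(parts1, parts2)
    w = np.exp(lw - lw.max())
    w /= w.sum()
    idx = rng.choice(len(parts1), size=len(parts1), p=w)  # resample
    return parts1[idx], parts2[idx]

rng = np.random.default_rng(0)
n = 5000
x1 = rng.normal(0, 1, n)          # particles targeting pi1 = N(0, 1)
x2 = rng.normal(0, 1, n)          # particles targeting pi2 = N(0, 1)
# interaction gamma(x1, x2) = exp(rho * x1 * x2) couples the two blocks;
# the merged target is then bivariate normal with correlation rho
rho = 0.5
m1, m2 = dc_merge(x1, x2, lambda a, b: rho * a * b, rng)
print(np.corrcoef(m1, m2)[0, 1])  # near 0.5, induced by gamma
```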

Dimension free convergence rates for Gibbs samplers for Bayesian linear mixed models

Z. Jin and J. Hobert. Stochastic Processes and their Applications, 2022.

Exact inference for a class of non-linear hidden Markov models

This article provides sufficient conditions for exact inference for a class of hidden Markov models on general state spaces, given a set of discretely collected indirect observations linked nonlinearly to the signal, together with a set of practical algorithms for inference.
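Exact filtering is classical when the hidden state is discrete; the following minimal forward filter is the baseline that the article generalizes to a class of non-linear models on general state spaces. The transition and emission matrices are made-up assumptions.

```python
# Forward filter for a discrete-state HMM: alternate a prediction step
# (alpha @ T) and a conditioning step (elementwise emission likelihood).
import numpy as np

def forward_filter(obs, T, E, init):
    # T[i, j] = P(x_t = j | x_{t-1} = i); E[j, y] = P(y | x = j)
    alpha = init * E[:, obs[0]]
    alpha /= alpha.sum()
    for y in obs[1:]:
        alpha = (alpha @ T) * E[:, y]   # predict, then condition on y
        alpha /= alpha.sum()            # normalize: P(x_t | y_{1:t})
    return alpha

T = np.array([[0.9, 0.1], [0.2, 0.8]])
E = np.array([[0.8, 0.2], [0.3, 0.7]])
print(forward_filter([0, 0, 1, 1, 1], T, E, np.array([0.5, 0.5])))
```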
...