Fast and Accurate Estimation of Non-Nested Binomial Hierarchical Models Using Variational Inference

@article{Goplerud2021FastAA,
  title={Fast and Accurate Estimation of Non-Nested Binomial Hierarchical Models Using Variational Inference},
  author={Max Goplerud},
  journal={Bayesian Analysis},
  year={2021}
}
  • Max Goplerud
  • Published 24 July 2020
  • Computer Science
  • Bayesian Analysis
Estimating non-linear hierarchical models can be computationally burdensome in the presence of large datasets and many non-nested random effects. Popular inferential techniques may take hours to fit even relatively straightforward models. This paper provides two contributions to scalable and accurate inference. First, I propose a new mean-field algorithm for estimating logistic hierarchical models with an arbitrary number of non-nested random effects. Second, I propose "marginally augmented… 
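
As a hedged illustration of the model class described in the abstract (the notation below is mine, not the paper's): a binomial hierarchical model with J non-nested random effects can be written, for observation i with group assignments g_j(i), as

\[
y_i \mid p_i \sim \mathrm{Binomial}(n_i,\, p_i), \qquad
\operatorname{logit}(p_i) = x_i^\top \beta \;+\; \sum_{j=1}^{J} \alpha_{j,\, g_j(i)}, \qquad
\alpha_{j,k} \sim \mathcal{N}(0,\, \Sigma_j),
\]

where the J grouping factors are crossed rather than nested. Mean-field variational inference approximates the joint posterior over \((\beta, \{\alpha_j\}, \{\Sigma_j\})\) with a factorized distribution, which is the source of the computational savings relative to MCMC.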

Efficient data augmentation techniques for some classes of state space models

TLDR
The proposed data augmentation scheme is utilized to design efficient Markov chain Monte Carlo algorithms for Bayesian inference of some non-Gaussian and nonlinear state space models, via a mixture of normals approximation coupled with a block-specific reparametrization strategy.
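
As a hedged sketch of the general mixture-of-normals device (the standard stochastic-volatility example, not necessarily the exact model class treated in that paper): a non-Gaussian measurement error can be approximated by a finite Gaussian mixture so that, conditional on the mixture indicators, the state space model is linear and Gaussian,

\[
\log y_t^2 = h_t + z_t, \quad z_t = \log \varepsilon_t^2,\ \varepsilon_t \sim \mathcal{N}(0,1), \qquad
p(z_t) \approx \sum_{k=1}^{K} \pi_k\, \mathcal{N}(z_t;\, m_k,\, v_k^2),
\]

which allows standard forward-filtering backward-sampling updates for the latent states given the indicators.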

Normal Approximation for Bayesian Mixed Effects Binomial Regression Models

Bayesian inference for generalized linear mixed models implemented with Markov chain Monte Carlo (MCMC) sampling methods has been widely used. In this paper, we propose to substitute a large …

Polling India via regression and post-stratification of non-probability online samples

TLDR
A modified MRP model of Indian vote preferences is presented that proposes innovations to each of its three core components (stratification frame, training data, and learner), along with a novel Data Integration approach that allows the simultaneous estimation of counts from multiple complementary sources, such as census tables and auxiliary surveys.

Scalable computation for Bayesian hierarchical models

TLDR
Algorithms for learning Bayesian hierarchical models, which are used ubiquitously in the applied sciences, are developed, focusing on crossed random effects and nested multilevel models, and the methodology is illustrated on two challenging real-data analyses: predicting electoral results and real estate prices, respectively.

References

SHOWING 1-10 OF 69 REFERENCES

Automatic Differentiation Variational Inference

TLDR
Automatic differentiation variational inference (ADVI) is developed, where the scientist only provides a probabilistic model and a dataset, nothing else, and the algorithm automatically derives an efficient variational inference algorithm, freeing the scientist to refine and explore many models.
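
As a hedged usage sketch: PyMC is one implementation that exposes ADVI through pm.fit(method="advi"); the toy model and data below are illustrative assumptions, not taken from the cited paper, and API details may vary across PyMC versions.

import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n, n_groups = 500, 20
group = rng.integers(0, n_groups, size=n)            # illustrative grouping factor
x = rng.normal(size=n)
true_alpha = rng.normal(0.0, 1.0, size=n_groups)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.5 * x + true_alpha[group]))))

with pm.Model():
    sigma = pm.HalfNormal("sigma", 1.0)                     # random-effect scale
    alpha = pm.Normal("alpha", 0.0, sigma, shape=n_groups)  # random intercepts
    beta = pm.Normal("beta", 0.0, 1.0)                      # fixed slope
    pm.Bernoulli("y", logit_p=alpha[group] + beta * x, observed=y)

    approx = pm.fit(n=20_000, method="advi")  # mean-field ADVI
    idata = approx.sample(1_000)              # draws from the fitted variational approximation

The point of ADVI is that nothing model-specific has to be derived by hand: the same call applies to any differentiable model written in the probabilistic programming language.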

Using Redundant Parameterizations to Fit Hierarchical Models

TLDR
The ultimate goal is to develop a fast and reliable method for fitting a hierarchical linear model as easily as one can now fit a nonhierarchical model, and to increase understanding of Gibbs samplers for hierarchical models in general.
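
A minimal sketch of the multiplicative redundant (parameter-expanded) idea for a one-way hierarchical linear model, in my own notation (assuming this is representative of the paper's setup):

\[
y_{ij} \sim \mathcal{N}(\mu + \alpha_j,\, \sigma_y^2), \qquad
\alpha_j = \xi\, \eta_j, \quad \eta_j \sim \mathcal{N}(0,\, \sigma_\eta^2),
\]

so the group-level standard deviation is \(\sigma_\alpha = |\xi|\,\sigma_\eta\). The parameters \((\xi, \eta, \sigma_\eta)\) are redundant (only their products are identified), but Gibbs updates on this expanded parameterization tend to mix much faster than updates on \((\alpha, \sigma_\alpha)\) directly, particularly when \(\sigma_\alpha\) is near zero.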

Deep Interactions with MRP: Election Turnout and Voting Patterns Among Small Electoral Subgroups

Using multilevel regression and poststratification (MRP), we estimate voter turnout and vote choice within deeply interacted subgroups: subsets of the population that are defined by multiple demographic and …
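
For reference, the poststratification step that the MRP-style entries in this list share can be sketched as (notation mine):

\[
\hat{\theta}_s = \frac{\sum_{j \in J_s} N_j\, \hat{\pi}_j}{\sum_{j \in J_s} N_j},
\]

where \(\hat{\pi}_j\) is the multilevel-model estimate for demographic-geographic cell \(j\), \(N_j\) is that cell's population count from the stratification frame, and \(J_s\) indexes the cells belonging to the subgroup or small area \(s\) of interest.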

Parameter Expanded Variational Bayesian Methods

TLDR
Parameter-eXpanded Variational Bayesian (PX-VB) methods are proposed to speed up VB, and the superior convergence rates of PX-VB are demonstrated in variational probit regression and automatic relevance determination.

Parameter Expansion for Data Augmentation

TLDR
A parameter-expanded data augmentation (PX-DA) algorithm is rigorously defined, and a new theory for iterative conditional sampling under the tra… is developed to understand the role of the expansion parameter.

Bayesian Inference for Logistic Models Using Pólya–Gamma Latent Variables

We propose a new data-augmentation strategy for fully Bayesian inference in models with binomial likelihoods. The approach appeals to a new class of Pólya–Gamma distributions, which are constructed …
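
The key identity behind the Pólya–Gamma construction can be sketched as follows (standard notation; see the cited paper for the precise statement):

\[
\frac{(e^{\psi})^{a}}{(1 + e^{\psi})^{b}}
= 2^{-b}\, e^{\kappa \psi} \int_0^{\infty} e^{-\omega \psi^{2}/2}\, p(\omega)\, d\omega,
\qquad \kappa = a - \tfrac{b}{2}, \quad \omega \sim \mathrm{PG}(b, 0),
\]

so that, conditional on \(\omega\), a binomial-logistic likelihood in the linear predictor \(\psi\) is Gaussian, and the latent variable has the conjugate update \(\omega \mid \psi \sim \mathrm{PG}(b, \psi)\). This is why the augmentation is a natural building block for conditionally conjugate mean-field updates in logistic hierarchical models.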

Stacked Regression and Poststratification

TLDR
In a Monte Carlo simulation, SRP significantly outperforms MRP when there are deep interactions in the data generating process, without requiring the researcher to specify a complex parametric model in advance.

Streamlined Variational Inference for Linear Mixed Models with Crossed Random Effects

TLDR
This article provides full algorithmic details of three variational inference strategies, presents detailed empirical results on their pros and cons, and guides users in their choice of variational inference approach depending on problem size and computing resources.

BARP: Improving Mister P Using Bayesian Additive Regression Trees

  • James Bisbee
  • Mathematics
    American Political Science Review
  • 2019
Multilevel regression and post-stratification (MRP) is the current gold standard for extrapolating opinion data from nationally representative surveys to smaller geographic units. However, …

A Contrastive Divergence for Combining Variational Inference and MCMC

TLDR
This work develops a method to combine Markov chain Monte Carlo (MCMC) and variational inference (VI), leveraging the advantages of both inference approaches, and introduces the variational contrastive divergence (VCD), a new divergence that replaces the standard Kullback-Leibler (KL) divergence used in VI.
...