Bayesian inference on high-dimensional multivariate binary responses
@inproceedings{Chakraborty2021BayesianIO,
  title  = {Bayesian inference on high-dimensional multivariate binary responses},
  author = {Antik Chakraborty and Rihui Ou and David B. Dunson},
  year   = {2021}
}
It has become increasingly common to collect high-dimensional binary response data, for example with the emergence of new sampling techniques in ecology. In smaller dimensions, multivariate probit (MVP) models are routinely used for inference. However, existing algorithms for fitting such models scale poorly to high dimensions because the likelihood is intractable: it involves an integral over a multivariate normal distribution that has no analytic form. Although a variety of algorithms…
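The intractability mentioned in the abstract can be made concrete with a minimal sketch (not the paper's method): under an MVP model, the probability of a single binary response vector is an orthant probability of a multivariate normal, which has no closed form and must be approximated numerically, e.g. with SciPy's quasi-Monte Carlo CDF routine. The function name and parameterization below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

def mvp_likelihood(y, mu, R):
    """P(Y = y) under a multivariate probit model.

    y  : binary response vector in {0,1}^d
    mu : latent mean vector (e.g. X @ beta for one observation)
    R  : latent correlation matrix

    The MVP model sets y_j = 1 iff z_j > 0 for z ~ N(mu, R), so
    P(Y = y) is a d-dimensional orthant probability with no analytic
    form; SciPy approximates it by numerical integration.
    """
    s = 2 * np.asarray(y) - 1              # map {0,1} -> {-1,+1}
    # Flip signs coordinate-wise so the event becomes w_j < 0 for all j,
    # where w = -s * z ~ N(-s * mu, diag(s) R diag(s)).
    mu_flipped = -s * np.asarray(mu)
    R_flipped = np.outer(s, s) * np.asarray(R)
    return multivariate_normal(mean=mu_flipped, cov=R_flipped).cdf(np.zeros(len(y)))
```

Summing this over all 2^d response patterns recovers 1 (up to integration error), but the cost of each evaluation grows quickly with d, which is exactly the scaling bottleneck the abstract describes.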