Corpus ID: 246431091

On the proof of posterior contraction for sparse generalized linear models with multivariate responses

@inproceedings{Wang2022OnTP,
  title={On the proof of posterior contraction for sparse generalized linear models with multivariate responses},
  author={Shao-Hsuan Wang and Ray Bai and Hsin-Hsiung Huang},
  year={2022}
}
In recent years, the literature on Bayesian high-dimensional variable selection has rapidly grown. It is increasingly important to understand whether these Bayesian methods can consistently estimate the model parameters. To this end, shrinkage priors are useful for identifying relevant signals in high-dimensional data. For multivariate linear regression models with Gaussian response variables, Bai and Ghosh (2018) [5] proposed a multivariate Bayesian model with shrinkage priors (MBSP) for… 
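For context, the MBSP framework referenced above pairs a multivariate linear model with a row-wise global-local shrinkage prior on the coefficient matrix. The display below is a minimal sketch under our own notation (B, Sigma, tau, xi_i); it is an illustrative outline, not a verbatim restatement of the prior in Bai and Ghosh (2018) [5].

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Minimal sketch of the MBSP setup (notation ours; see Bai and Ghosh, 2018 [5]).
% Multivariate linear model: n samples, p predictors, q correlated responses.
\[
  Y = XB + E, \qquad Y \in \mathbb{R}^{n \times q}, \quad
  X \in \mathbb{R}^{n \times p}, \quad
  E_{i\cdot} \stackrel{\text{i.i.d.}}{\sim} \mathcal{N}_q(0, \Sigma),
\]
% Row-wise global-local shrinkage prior on the coefficient matrix B:
\[
  B_{i\cdot} \mid \xi_i, \tau, \Sigma \sim \mathcal{N}_q(0, \tau \xi_i \Sigma),
  \qquad \xi_i \stackrel{\text{ind}}{\sim} \pi(\xi_i), \qquad i = 1, \dots, p,
\]
% where $\pi$ is a heavy-tailed density on the local scales $\xi_i$ and
% $\tau > 0$ is a global shrinkage parameter, so that most rows of $B$ are
% shrunk toward zero while relevant signals escape the shrinkage.
\end{document}

The paper summarized here concerns extending this style of posterior contraction analysis beyond Gaussian responses, to sparse generalized linear models with multivariate responses.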

References

Showing 1-10 of 28 references
High-dimensional multivariate posterior consistency under global-local shrinkage priors
Consistent High-Dimensional Bayesian Variable Selection via Penalized Credible Regions
This work proposes a conjugate prior only on the full model parameters and uses sparse solutions within posterior credible regions to perform selection, showing that these sparse solutions can be computed via existing algorithms.
Nearly optimal Bayesian Shrinkage for High Dimensional Regression
If the shrinkage prior has a heavy and flat tail, and allocates a sufficiently large probability mass in a very small neighborhood of zero, then its posterior properties are as good as those of the spike-and-slab prior.
Bayesian linear regression for multivariate responses under group sparsity
We study frequentist properties of a Bayesian high-dimensional multivariate linear regression model with correlated responses. The predictors are separated into many groups, and the group structure is…
Ultra high-dimensional multivariate posterior contraction rate under shrinkage priors
Posterior contraction in sparse generalized linear models
This work studies posterior contraction rates in sparse high-dimensional generalized linear models using priors incorporating sparsity, and shows that Bayesian methods achieve convergence properties analogous to lasso-type procedures.
Bayesian Covariance Selection in Generalized Linear Mixed Models
This article proposes a fully Bayesian approach to the problem of simultaneous selection of fixed and random effects in GLMMs, which relies on variable selection-type mixture priors for the components in a special Cholesky decomposition of the random effects covariance.
The EAS approach to variable selection for multivariate response data in high-dimensional settings
In this paper, we extend the epsilon admissible subsets (EAS) model selection approach, from its original construction in the high-dimensional linear regression setting, to an EAS framework for…
Bayesian variable selection for high dimensional generalized linear models: Convergence rates of the fitted densities
It is shown that Bayesian variable selection can reduce overfitting caused by the curse of dimensionality K ≫ n, provided most of the x_j's have very small effects on the response y, and that a suitable prior can be used to choose a few of the many x_j's to model y.
…