# Consistent High-Dimensional Bayesian Variable Selection via Penalized Credible Regions

@article{Bondell2012ConsistentHB,
  title   = {Consistent High-Dimensional Bayesian Variable Selection via Penalized Credible Regions},
  author  = {Howard D. Bondell and Brian J. Reich},
  journal = {Journal of the American Statistical Association},
  year    = {2012},
  volume  = {107},
  pages   = {1610--1624}
}

For high-dimensional data, particularly when the number of predictors greatly exceeds the sample size, selection of relevant predictors for regression is a challenging problem. Methods such as sure screening, forward selection, or penalized regressions are commonly used. Bayesian variable selection methods place prior distributions on the parameters along with a prior over model space, or equivalently, a mixture prior on the parameters having mass at zero. Since exhaustive enumeration is not…
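The paper's core idea — fit a single Bayesian model, then search a joint posterior credible region for its sparsest member — can be illustrated with a small numerical sketch. This is an illustrative simplification, not the authors' exact solver: it assumes a conjugate Gaussian (ridge-type) prior with known noise variance, and it mimics the credible-region search with adaptive-lasso-style weighted soft-thresholding of the posterior mean; all variable names and tuning values here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 10
beta_true = np.zeros(p)
beta_true[[0, 1, 4]] = [3.0, 1.5, 2.0]
X = rng.standard_normal((n, p))
y = X @ beta_true + rng.standard_normal(n)

# Step 1: model fitting. A conjugate Gaussian (ridge-type) prior
# beta ~ N(0, tau2 * I), with the noise variance fixed at 1, gives a
# closed-form Gaussian posterior.
tau2 = 10.0
Sigma = np.linalg.inv(X.T @ X + np.eye(p) / tau2)  # posterior covariance
beta_hat = Sigma @ X.T @ y                         # posterior mean

# Step 2: selection. Among coefficient vectors inside a joint posterior
# credible ellipsoid centered at beta_hat, prefer the sparsest one. Here
# that search is mimicked by soft-thresholding with weights 1/beta_hat_j^2,
# so coefficients with large posterior means are hard to zero out.
lam = 0.5
weights = 1.0 / beta_hat**2
beta_sel = np.sign(beta_hat) * np.maximum(np.abs(beta_hat) - lam * weights, 0.0)
selected = np.flatnonzero(beta_sel)
print(selected.tolist())
```

Decoupling fitting from selection this way means one posterior can be reused across many sparsity levels by varying the size of the credible region (here, the surrogate `lam`).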

## 84 Citations

High-dimensional variable selection via penalized credible regions with global-local shrinkage priors

- Mathematics
- 2016

The method of Bayesian variable selection via penalized credible regions separates model fitting and variable selection. The idea is to search for the sparsest solution within the joint posterior…

Variable Selection via Penalized Credible Regions with Dirichlet–Laplace Global-Local Shrinkage Priors

- Mathematics, Bayesian Analysis
- 2018

This paper incorporates global-local priors into the credible region selection framework of Bayesian variable selection, and introduces a new method to tune hyperparameters in prior distributions for linear regression.

Efficient Bayesian Regularization for Graphical Model Selection.

- Computer Science, Bayesian Analysis
- 2019

This work proposes a novel graphical model selection approach for large dimensional settings where the dimension increases with the sample size, by decoupling model fitting and covariance selection, and applies it to a cancer genomics data example.

Bayesian Variable Selection Utilizing Posterior Probability Credible Intervals

- Computer Science, medRxiv
- 2021

This work proposes a novel variable selection algorithm that uses the parameters' credible intervals to choose which variables to keep in the model, shows in a simulation study that the algorithm yields results comparable to or better than DIC, and implements the algorithm in a real-world example.

High-Dimensional Posterior Consistency for Hierarchical Non-Local Priors in Regression

- Mathematics, Computer Science, Bayesian Analysis
- 2020

This paper introduces a fully Bayesian approach with the pMOM nonlocal prior, in which an appropriate Inverse-Gamma prior is placed on the tuning parameter, yielding a more robust model that is comparatively immune to misspecification of the scale parameter.

Priors for Bayesian Shrinkage and High-Dimensional Model Selection

- Computer Science
- 2017

This dissertation investigates the asymptotic form of the marginal likelihood based on nonlocal priors, shows that it attains a unique penalty term that adapts to the signal strength of the corresponding variable in the model, and remarks that this term cannot be attained from local priors such as Gaussian prior densities.

Bayesian variable selection with shrinking and diffusing priors

- Mathematics
- 2014

We consider a Bayesian approach to variable selection in the presence of high dimensional covariates based on a hierarchical model that places prior distributions on the regression coefficients as…

Scalable Bayesian Variable Selection Using Nonlocal Prior Densities in Ultrahigh-dimensional Settings.

- Computer Science, Statistica Sinica
- 2018

It is found that Bayesian variable selection procedures based on nonlocal priors are competitive with all other procedures across a range of simulation scenarios, and this favorable performance is explained through a theoretical examination of their consistency properties.

Bayesian Sparse Global-Local Shrinkage Regression for Selection of Grouped Variables

- Computer Science
- 2017

This paper presents a Bayesian grouped model with continuous global-local shrinkage priors to handle complex group hierarchies, including overlapping and multilevel group structures, and provides an alternative degrees-of-freedom estimator for sparse Bayesian linear models that accounts for the effect of shrinkage on the model coefficients.

## References

Showing 1–10 of 58 references

Bayesian Variable Selection in Clustering High-Dimensional Data

- Computer Science, Mathematics
- 2005

This article formulates the clustering problem in terms of a multivariate normal mixture model with an unknown number of components, and uses the reversible-jump Markov chain Monte Carlo technique to define a sampler that moves between spaces of different dimension.

Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties

- Mathematics, Computer Science
- 2001

In this article, penalized likelihood approaches are proposed to handle variable selection problems, and it is shown that the newly proposed estimators achieve the oracle property in variable selection; namely, they perform as well as if the correct submodel were known.

APPROACHES FOR BAYESIAN VARIABLE SELECTION

- Mathematics
- 1997

This paper describes and compares various hierarchical mixture prior formulations of variable selection uncertainty in normal linear regression models. These include the nonconjugate SSVS formulation…

Calibration and empirical Bayes variable selection

- Mathematics
- 2000

For the problem of variable selection in the normal linear model, selection criteria such as AIC, C_p, BIC, and RIC have fixed dimensionality penalties. Such criteria are shown to correspond to…

Bayes model averaging with selection of regressors

- Economics
- 2002

Summary. When a number of distinct models contend for use in prediction, the choice of a single model can offer rather unstable predictions. In regression, stochastic search variable selection with…

Adaptive Lasso for sparse high-dimensional regression models

- Mathematics, Computer Science
- 2008

The adaptive Lasso has the oracle property even when the number of covariates is much larger than the sample size; under a partial orthogonality condition, in which the covariates with zero coefficients are only weakly correlated with the covariates with nonzero coefficients, marginal regression can be used to obtain the initial estimator.
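The two-stage recipe in this abstract — an initial estimator from marginal regression, then an adaptive Lasso whose penalty is re-weighted by it — can be sketched in a few lines. This is a minimal sketch, assuming an exactly orthonormal design (where the adaptive Lasso has a closed-form weighted soft-threshold solution, and marginal regression coincides with OLS); the names, seed, and tuning values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 8
beta_true = np.zeros(p)
beta_true[[0, 3]] = [2.0, 1.5]

# Orthonormal design columns (Q'Q = I), so the adaptive Lasso reduces to
# per-coordinate weighted soft-thresholding.
Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
y = Q @ beta_true + 0.2 * rng.standard_normal(n)

# Stage 1: initial estimator via marginal regression -- regress y on each
# column alone. With unit-norm orthonormal columns this is just Q'y.
beta_init = Q.T @ y

# Stage 2: adaptive Lasso with penalty weight 1/|beta_init_j|^gamma, so
# coordinates with small initial estimates face a much larger threshold.
lam, gamma = 0.5, 2
weights = 1.0 / np.abs(beta_init) ** gamma
beta_alasso = np.sign(beta_init) * np.maximum(np.abs(beta_init) - lam * weights, 0.0)
print(np.flatnonzero(beta_alasso).tolist())
```

The data-dependent weights are what distinguish this from the plain Lasso: strong signals are barely penalized while near-zero initial estimates are thresholded out, which is the mechanism behind the oracle property.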

Variable selection via Gibbs sampling

- Mathematics
- 1993

Abstract A crucial problem in building a multiple regression model is the selection of predictors to include. The main thrust of this article is to propose and develop a procedure that uses…

One-step Sparse Estimates in Nonconcave Penalized Likelihood Models.

- Computer Science, Annals of Statistics
- 2008

A new unified algorithm, based on the local linear approximation (LLA), is proposed for maximizing the penalized likelihood under a broad class of concave penalty functions, and it is shown that if the regularization parameter is appropriately chosen, the one-step LLA estimates enjoy the oracle properties given good initial estimators.

High-dimensional graphs and variable selection with the Lasso

- Computer Science
- 2006

It is shown that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs: estimating the conditional independence neighborhood of each node is equivalent to a variable selection problem for a Gaussian linear model.
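Neighborhood selection reduces graph estimation to one Lasso regression per node: regress each variable on all the others and connect it to the variables receiving nonzero coefficients. The sketch below uses a hand-rolled coordinate-descent Lasso, a small chain graph as the example, and the AND rule for symmetrizing edges; all names, the seed, and the penalty value are chosen for illustration, not taken from the paper.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for 0.5 * ||y - X b||^2 + lam * ||b||_1."""
    p = X.shape[1]
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]        # residual excluding column j
            z = X[:, j] @ r
            b[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_sq[j]
    return b

# Gaussian data from a chain graph: tridiagonal precision matrix, so the
# true conditional-independence edges are (0,1), (1,2), (2,3), (3,4).
rng = np.random.default_rng(2)
n, p = 1000, 5
Omega = np.eye(p) - 0.45 * (np.eye(p, k=1) + np.eye(p, k=-1))
X = rng.standard_normal((n, p)) @ np.linalg.cholesky(np.linalg.inv(Omega)).T

# Neighborhood selection: Lasso-regress each node on the rest; keep edge
# (i, j) only if both regressions select it (the AND rule).
lam = 0.2 * n
nbrs = []
for i in range(p):
    others = [j for j in range(p) if j != i]
    b = lasso_cd(X[:, others], X[:, i], lam)
    nbrs.append({others[k] for k in np.flatnonzero(b)})
edges = {(i, j) for i in range(p) for j in nbrs[i] if i < j and i in nbrs[j]}
print(sorted(edges))
```

Because each nodewise regression is independent of the others, the procedure parallelizes trivially across nodes, which is a large part of its appeal over joint covariance selection in high dimensions.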

Mixtures of g Priors for Bayesian Variable Selection

- Mathematics
- 2008

Zellner's g prior remains a popular conventional prior for use in Bayesian variable selection, despite several undesirable consistency issues. In this article we study mixtures of g priors as an…