Model uncertainty and variable selection in Bayesian lasso regression

@article{Hans2010ModelUA,
  title={Model uncertainty and variable selection in Bayesian lasso regression},
  author={Chris Hans},
  journal={Statistics and Computing},
  year={2010},
  volume={20},
  pages={221-229}
}
While Bayesian analogues of lasso regression have become popular, comparatively little has been said about formal treatments of model uncertainty in such settings. This paper describes methods that can be used to evaluate the posterior distribution over the space of all possible regression models for Bayesian lasso regression. Access to the model space posterior distribution is necessary if model-averaged inference—e.g., model-averaged prediction and calculation of posterior variable inclusion… 
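
For concreteness, the model-averaged quantities the abstract refers to are the standard Bayesian model averaging identities, written below with \gamma \in \{0,1\}^p indexing which covariates enter a model; evaluating these sums over the model space is what requires access to the model space posterior.

    p(\gamma \mid y) \propto p(y \mid \gamma)\, p(\gamma),
    \qquad
    p(y \mid \gamma) = \int p(y \mid \beta_\gamma, \gamma)\, p(\beta_\gamma \mid \gamma)\, d\beta_\gamma

    p(\tilde{y} \mid y) = \sum_{\gamma} p(\tilde{y} \mid \gamma, y)\, p(\gamma \mid y),
    \qquad
    \Pr(\gamma_j = 1 \mid y) = \sum_{\gamma :\, \gamma_j = 1} p(\gamma \mid y)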

Hierarchical Bayesian formulations for selecting variables in regression models

Two applications, in binary and survival regression, are presented in which the Bayesian approach is used to select markers prognostically relevant for the development of rheumatoid arthritis and for overall survival in acute myeloid leukemia patients.

Model Space Priors for Objective Sparse Bayesian Regression

This paper investigates the construction of model space priors from an alternative point of view to the usual indicators for inclusion of covariates in a given model. Assumptions about indicator…

PREDICTIVE MODEL SELECTION CRITERIA FOR BAYESIAN LASSO REGRESSION

We consider the Bayesian lasso for regression, which can be interpreted as an L1-norm regularization based on a Bayesian approach when the Laplace or double-exponential prior distribution is placed…
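
The interpretation in question is the familiar equivalence between the posterior mode under independent Laplace priors and the lasso solution: as a sketch, with p(\beta_j \mid \lambda) = (\lambda/2)\, e^{-\lambda |\beta_j|},

    \hat{\beta}_{\mathrm{MAP}}
    = \arg\max_{\beta} \Big[ \log p(y \mid \beta, \sigma^2) + \textstyle\sum_j \log p(\beta_j \mid \lambda) \Big]
    = \arg\min_{\beta} \; \frac{1}{2\sigma^2} \lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_1,

which is the lasso objective with penalty parameter \lambda \sigma^2 after rescaling.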

Parameter expansion in local-shrinkage models

This paper considers the problem of using MCMC to fit sparse Bayesian models based on normal scale-mixture priors. Examples of this framework include the Bayesian LASSO and the horseshoe prior. We…
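
In the scale-mixture framework referenced here, each coefficient is conditionally Gaussian with its own local variance, and the choice of mixing distribution determines the marginal prior; two standard cases are

    \beta_j \mid \tau_j^2 \sim \mathrm{N}(0, \tau_j^2),
    \qquad
    \tau_j^2 \sim \mathrm{Exp}(\lambda^2/2) \;\Rightarrow\; \beta_j \sim \mathrm{Laplace}(\lambda) \;\; \text{(Bayesian lasso)},

    \tau_j \sim \mathrm{C}^{+}(0, 1) \;\Rightarrow\; \text{horseshoe prior (usually with an additional global scale)}.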

Bayesian sparse graphical models and their mixtures

A novel type of selection prior is introduced that develops a sparse structure on the precision matrix by making most of the elements exactly zero, in addition to ensuring positive definiteness—thus conducting model selection and estimation simultaneously.
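
As a generic illustration of such a selection prior (not necessarily the exact form of the novel prior introduced in the paper), a spike-and-slab prior on the off-diagonal precision elements \omega_{ij} sets entries exactly to zero while restricting support to positive definite matrices:

    p(\omega_{ij} \mid \gamma_{ij}) = (1 - \gamma_{ij})\, \delta_0(\omega_{ij}) + \gamma_{ij}\, \mathrm{N}(\omega_{ij}; 0, v^2), \quad i < j,
    \qquad \text{subject to } \Omega \succ 0,

so \gamma_{ij} = 0 encodes conditional independence of variables i and j in the Gaussian graphical model.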

High-dimensional variable selection via penalized credible regions with global-local shrinkage priors

The method of Bayesian variable selection via penalized credible regions separates model fitting from variable selection. The idea is to search for the sparsest solution within the joint posterior credible region…
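
In its simplest form the search can be written as a constrained sparsity problem: after a single fit of the full model, select the sparsest coefficient vector inside a joint credible set,

    \min_{\beta} \lVert \beta \rVert_0
    \quad \text{subject to} \quad
    (\beta - \hat{\beta})^{\top} \Sigma^{-1} (\beta - \hat{\beta}) \le c_{\alpha},

where \hat{\beta} and \Sigma are the posterior mean and covariance and c_\alpha calibrates (1 - \alpha) posterior coverage; in practice the \ell_0 objective is replaced by a tractable penalized surrogate.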

Variable Selection via Penalized Credible Regions with Dirichlet–Laplace Global-Local Shrinkage Priors

This paper incorporates global-local priors into the credible region selection framework of Bayesian variable selection, and introduces a new method to tune hyperparameters in prior distributions for linear regression.

The reciprocal Bayesian LASSO

It is shown that the Bayesian formulation of the rLASSO problem outperforms its classical cousin in estimation, prediction, and variable selection across a wide range of scenarios while offering the advantage of posterior inference.
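
For context, the reciprocal lasso penalizes the reciprocals of the nonzero coefficient magnitudes, so that, unlike the lasso, small nonzero coefficients are penalized heavily and large signals only lightly; schematically,

    \hat{\beta}_{\mathrm{rLASSO}}
    = \arg\min_{\beta} \; \lVert y - X\beta \rVert_2^2 + \lambda \sum_{j :\, \beta_j \neq 0} \frac{1}{\lvert \beta_j \rvert},

with the Bayesian formulation placing the corresponding inverse-Laplace-type prior on the coefficients.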

A New Bayesian Lasso

This paper considers a fully Bayesian treatment that leads to a new Gibbs sampler with tractable full conditional posterior distributions, and shows that the new algorithm has good mixing properties and performs comparably to the existing Bayesian method in terms of both prediction accuracy and variable selection.
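
To make the flavor of such samplers concrete, below is a minimal sketch of a Gibbs sampler for the Bayesian lasso built from the standard Park and Casella (2008) full conditionals; it illustrates the general approach rather than the specific new sampler proposed in this paper.

    import numpy as np

    def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, seed=0):
        # Model: y | beta, sigma2 ~ N(X beta, sigma2 I)
        #        beta_j | tau2_j, sigma2 ~ N(0, sigma2 * tau2_j)
        #        tau2_j ~ Exp(lam^2 / 2)  (so beta_j is conditionally Laplace)
        rng = np.random.default_rng(seed)
        n, p = X.shape
        XtX, Xty = X.T @ X, X.T @ y
        beta = np.linalg.lstsq(X, y, rcond=None)[0]  # start at least squares
        sigma2, tau2 = 1.0, np.ones(p)
        draws = np.empty((n_iter, p))
        for t in range(n_iter):
            # beta | rest ~ N(A^{-1} X'y, sigma2 A^{-1}), A = X'X + diag(1/tau2)
            A_inv = np.linalg.inv(XtX + np.diag(1.0 / tau2))
            beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
            # 1/tau2_j | rest ~ Inverse-Gaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
            tau2 = 1.0 / rng.wald(np.sqrt(lam**2 * sigma2 / beta**2), lam**2)
            # sigma2 | rest ~ Inverse-Gamma((n-1)/2 + p/2, RSS/2 + beta' D^{-1} beta / 2)
            resid = y - X @ beta
            shape = (n - 1) / 2 + p / 2
            scale = resid @ resid / 2 + beta @ (beta / tau2) / 2
            sigma2 = scale / rng.gamma(shape)  # inverse-gamma draw via 1/gamma
            draws[t] = beta
        return draws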
...

References

Showing 1–10 of 32 references

Bayesian Model Averaging for Linear Regression Models

We consider the problem of accounting for model uncertainty in linear regression models. Conditioning on a single selected model ignores model uncertainty, and thus leads to the…

Bayesian lasso regression

New aspects of the broader Bayesian treatment of lasso regression are introduced, and it is shown that the standard lasso prediction method does not necessarily agree with model-based, Bayesian predictions.
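
The disagreement is easy to state: under squared-error loss the model-based Bayesian prediction uses the posterior mean, while standard lasso prediction plugs in the posterior mode, and under a double-exponential prior these generally differ (in particular, the posterior mean is never exactly sparse):

    \hat{y}^{\mathrm{Bayes}}(x) = x^{\top}\, \mathrm{E}(\beta \mid y)
    \;\neq\;
    \hat{y}^{\mathrm{lasso}}(x) = x^{\top} \hat{\beta}_{\mathrm{MAP}} .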

The Bayesian Lasso

The Lasso estimate for linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the regression parameters have independent Laplace (i.e., double-exponential) priors.

Alternative prior distributions for variable selection with very many more variables than observations

The problem of variable selection in regression and the generalised linear model is addressed. We adopt a Bayesian approach with priors for the regression coefficients that are scale mixtures of…

The Practical Implementation of Bayesian Model Selection

This article illustrates some of the fundamental practical issues that arise for two different model selection problems: the variable selection problem for the linear model and the CART model selection problem.

Inference with normal-gamma prior distributions in regression problems

This paper considers the effects of placing an absolutely continuous prior distribution on the regression coefficients of a linear model. We show that the posterior expectation is a matrix-shrunken…

Shotgun Stochastic Search for “Large p” Regression

A novel shotgun stochastic search (SSS) approach is proposed that explores “interesting” regions of the resulting high-dimensional model spaces and quickly identifies regions of high posterior probability over models.
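
A compressed sketch of the SSS idea is given below; the score function is a BIC-style stand-in (the actual algorithm evaluates exact marginal likelihoods p(y | gamma) under the chosen prior and retains the best models found):

    import numpy as np

    def log_score(gamma, X, y):
        # Stand-in model score (BIC-style); SSS proper uses log p(y | gamma).
        idx = np.flatnonzero(gamma)
        n = len(y)
        if idx.size == 0:
            rss = y @ y
        else:
            beta = np.linalg.lstsq(X[:, idx], y, rcond=None)[0]
            r = y - X[:, idx] @ beta
            rss = r @ r
        return -0.5 * n * np.log(rss / n) - 0.5 * idx.size * np.log(n)

    def sss(X, y, n_iter=200, seed=0):
        # Score the full add/delete/swap neighborhood of the current model
        # (parallelizable), then jump to a neighbor with probability
        # proportional to its exponentiated score.
        rng = np.random.default_rng(seed)
        p = X.shape[1]
        gamma = np.zeros(p, dtype=int)
        best = (log_score(gamma, X, y), gamma.copy())
        for _ in range(n_iter):
            neighbors = []
            for j in range(p):                       # add/delete moves
                g = gamma.copy(); g[j] ^= 1
                neighbors.append(g)
            for j in np.flatnonzero(gamma):          # swap moves
                for k in np.flatnonzero(1 - gamma):
                    g = gamma.copy(); g[j], g[k] = 0, 1
                    neighbors.append(g)
            scores = np.array([log_score(g, X, y) for g in neighbors])
            w = np.exp(scores - scores.max())
            gamma = neighbors[rng.choice(len(neighbors), p=w / w.sum())]
            s = log_score(gamma, X, y)
            if s > best[0]:
                best = (s, gamma.copy())
        return best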

Bayesian adaptive lassos with non-convex penalization

The Bayesian interpretation of the Lasso is adopted as the maximum a posteriori (MAP) estimate of the regression coefficients, which have been given independent double-exponential prior distributions, and the properties of this approach are explored.

Bayes and empirical-Bayes multiplicity adjustment in the variable-selection problem

This paper studies the multiplicity-correction effect of standard Bayesian variable-selection priors in linear regression. Our first goal is to clarify when, and how, multiplicity correction happens…
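
The standard construction analyzed in this line of work places a beta hyperprior on the common prior inclusion probability, which is what induces the automatic multiplicity penalty:

    \gamma_j \mid \pi \stackrel{iid}{\sim} \mathrm{Bernoulli}(\pi), \quad \pi \sim \mathrm{Beta}(a, b)
    \;\;\Rightarrow\;\;
    p(\gamma) = \frac{B(a + k_\gamma,\; b + p - k_\gamma)}{B(a, b)}, \qquad k_\gamma = \sum_j \gamma_j,

so that the implied penalty for adding a variable grows with the total number of candidate predictors p.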

Nonparametric regression using Bayesian variable selection