The Bayesian Lasso
@article{Park2008TheBL, title={The Bayesian Lasso}, author={Trevor H. Park and George Casella}, journal={Journal of the American Statistical Association}, year={2008}, volume={103}, pages={681--686} }
The Lasso estimate for linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the regression parameters have independent Laplace (i.e., double-exponential) priors. Gibbs sampling from this posterior is possible using an expanded hierarchy with conjugate normal priors for the regression parameters and independent exponential priors on their variances. A connection with the inverse-Gaussian distribution provides tractable full conditional distributions. The…
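The expanded hierarchy the abstract refers to can be written out explicitly. A sketch in conventional notation (λ denotes the lasso penalty parameter; the σ² scaling of the prior follows the published paper):

```latex
% Bayesian lasso hierarchy (Park & Casella, 2008)
y \mid X, \beta, \sigma^2 \sim N_n(X\beta, \sigma^2 I_n), \qquad
\beta_j \mid \sigma^2, \tau_j^2 \sim N(0, \sigma^2 \tau_j^2), \qquad
\tau_j^2 \overset{iid}{\sim} \mathrm{Exp}(\lambda^2/2).
% Integrating out tau_j^2 recovers the Laplace (double-exponential) prior:
\pi(\beta_j \mid \sigma^2) = \frac{\lambda}{2\sqrt{\sigma^2}} \, e^{-\lambda |\beta_j| / \sqrt{\sigma^2}}.
% The inverse-Gaussian connection: the full conditional of 1/tau_j^2 is
1/\tau_j^2 \mid \beta_j, \sigma^2 \sim \mathrm{InvGaussian}\!\left(\sqrt{\lambda^2 \sigma^2 / \beta_j^2}, \; \lambda^2\right).
```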
2,484 Citations
Bayesian lasso regression
- Computer Science
- 2009
New aspects of the broader Bayesian treatment of lasso regression are introduced, and it is shown that the standard lasso prediction method does not necessarily agree with model-based, Bayesian predictions.
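The disagreement this summary points to is easy to see numerically. Below is a minimal sketch (my own illustration, not code from either paper): a Gibbs sampler for the Park-Casella hierarchy with λ held fixed, whose posterior-mean prediction is contrasted with a plain lasso point prediction (a posterior mode). The data, λ, and sklearn's alpha are arbitrary illustrative choices, and alpha is not calibrated to λ.

```python
# Sketch: Bayesian lasso Gibbs sampler vs. lasso point prediction.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Illustrative toy data (assumed, not from either paper).
n, p = 50, 5
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0])
y = X @ beta_true + rng.standard_normal(n)

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=5000, burn=1000):
    """Gibbs sampler for the Park-Casella hierarchy, lambda fixed."""
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    sigma2, tau2 = 1.0, np.ones(p)
    draws = []
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}), A = X'X + D_tau^{-1}
        A_inv = np.linalg.inv(XtX + np.diag(1.0 / tau2))
        A_inv = (A_inv + A_inv.T) / 2.0  # symmetrize for numerical safety
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # sigma2 | rest ~ Inverse-Gamma((n-1+p)/2, rate)
        resid = y - X @ beta
        rate = 0.5 * (resid @ resid + beta @ (beta / tau2))
        sigma2 = 1.0 / rng.gamma((n - 1 + p) / 2.0, 1.0 / rate)
        # 1/tau_j^2 | rest ~ Inverse-Gaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        tau2 = 1.0 / rng.wald(mu, lam**2)
        if it >= burn:
            draws.append(beta)
    return np.asarray(draws)

post_mean = bayesian_lasso_gibbs(X, y).mean(axis=0)
lasso_fit = Lasso(alpha=0.1, fit_intercept=False).fit(X, y)

x_new = rng.standard_normal(p)
print("lasso (mode) prediction:  ", x_new @ lasso_fit.coef_)
print("posterior-mean prediction:", x_new @ post_mean)  # generally differs
```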
A New Bayesian Lasso.
- Computer Science · Statistics and Its Interface
- 2014
This paper considers a fully Bayesian treatment that leads to a new Gibbs sampler with tractable full conditional posterior distributions and shows that the new algorithm has good mixing property and performs comparably to the existing Bayesian method in terms of both prediction accuracy and variable selection.
Priors on the Variance in Sparse Bayesian Learning; the demi-Bayesian Lasso
- Computer Science
- 2008
This work outlines simple modifications of existing algorithms to solve this new variant, which essentially uses type-II maximum likelihood to fit the Bayesian Lasso model, and proposes an elastic-net heuristic to help with modeling correlated inputs.
Approximate Gibbs sampler for Bayesian Huberized lasso
- Computer Science
- 2022
A new posterior computation algorithm for Bayesian Huberized lasso regression is proposed, based on an approximation of the full conditional distribution, and it makes it possible to estimate the tuning parameter that controls the robustness of the pseudo-Huber loss function.
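For context, the pseudo-Huber loss on a residual r is a standard definition (not quoted from the cited paper's excerpt); δ is the robustness tuning parameter the summary mentions:

```latex
\rho_\delta(r) = \delta^2 \left( \sqrt{1 + (r/\delta)^2} - 1 \right)
% behaves like r^2/2 for |r| << delta and like delta*|r| for |r| >> delta,
% smoothly interpolating between squared-error and absolute-error loss.
```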
Sparsity via new Bayesian Lasso
- Computer Science
- 2020
This paper proposes a scale mixture of normals with a Rayleigh mixing density on the standard deviations (SMNR) to represent the double-exponential distribution, and presents a hierarchical model formulation with a Gibbs sampler under SMNR as an alternative Bayesian analysis of the classical lasso minimization problem.
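A quick check of why a Rayleigh mixing density reproduces the double exponential (this derivation is mine, not quoted from the paper): a Rayleigh prior on the standard deviation is the same as an exponential prior on the variance, which is exactly the mixture underlying the Laplace distribution:

```latex
% If the standard deviation s has a Rayleigh(b) density,
f(s) = \frac{s}{b^2} \, e^{-s^2 / (2 b^2)}, \quad s > 0,
% then the variance v = s^2 has density
f(v) = \frac{1}{2 b^2} \, e^{-v / (2 b^2)},
% i.e. v ~ Exp(rate 1/(2 b^2)): the exponential mixing distribution
% that turns a scale mixture of normals into a Laplace prior.
```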
Sparse modifying algorithm in Bayesian lasso
- Computer Science
- 2014
In the present paper, the authors propose an efficient algorithm that modifies the Bayesian lasso estimates so as to be sparse, and investigate the efficiency of the proposed algorithm.
High-Dimensional Bayesian Regularised Regression with the BayesReg Package
- Computer Science, Mathematics
- 2016
This paper introduces bayesreg, a new toolbox for fitting Bayesian penalized regression models with continuous shrinkage prior densities, and features Bayesian linear regression with Gaussian or heavy-tailed error models and Bayesian logistic regression with ridge, lasso, horseshoe, and horseshoe+ estimators.
Robust Bayesian Regularized Estimation Based on Regression Model
- Computer Science, Mathematics
- 2015
A new robust coefficient estimation and variable selection method based on Bayesian adaptive lasso regression, developed within the Bayesian hierarchical model framework, in which the prior is treated as a mixture of normal and gamma distributions and different penalization parameters are placed on different regression coefficients.
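One common Bayesian adaptive lasso construction matching this description is sketched below (an assumption about the general normal-gamma idea; the cited paper's exact hyperpriors may differ):

```latex
\beta_j \mid \sigma^2, \tau_j^2 \sim N(0, \sigma^2 \tau_j^2), \qquad
\tau_j^2 \sim \mathrm{Exp}(\lambda_j^2 / 2), \qquad
\lambda_j^2 \sim \mathrm{Gamma}(a, b),
% so each coefficient j receives its own penalization parameter lambda_j,
% in contrast to the single lambda of the ordinary Bayesian lasso.
```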
Sparse Bayesian linear regression using generalized normal priors
- Computer Science, Mathematics · Int. J. Wavelets Multiresolution Inf. Process.
- 2017
A sparse Bayesian linear regression model is proposed that generalizes the Bayesian Lasso to a class of Bayesian models with scale mixtures of normal distributions as priors for the regression…
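For reference, the generalized normal (exponential power) prior has the standard density form below (not quoted from the paper's excerpt); it nests the two familiar special cases:

```latex
\pi(\beta_j) \propto \exp\!\left( - \left| \beta_j / \alpha \right|^q \right)
% q = 1 recovers the Laplace prior of the Bayesian lasso;
% q = 2 recovers a Gaussian (ridge-type) prior.
```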
References
Bayesian Variable Selection in Linear Regression
- Mathematics
- 1988
This article is concerned with the selection of subsets of predictor variables in a linear regression model for the prediction of a dependent variable. It is based on a Bayesian approach,…
Outlier Models and Prior Distributions in Bayesian Linear Regression
- Mathematics
- 1984
Bayesian inference in regression models is considered using heavy-tailed error distributions to accommodate outliers. The particular class of distributions that can be constructed as…
Penalized regression, standard errors, and Bayesian lassos
- Computer Science
- 2010
The performance of the Bayesian lassos is compared to their frequentist counterparts using simulations, data sets that previous lasso papers have used, and a difficult modeling problem for predicting the collapse of governments around the world.
Regression Shrinkage and Selection via the Lasso
- Computer Science
- 1996
A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
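The lasso estimate referenced throughout this page has the standard formulation (Tibshirani, 1996), in both its constrained and penalized forms:

```latex
\hat{\beta}^{\mathrm{lasso}} = \arg\min_{\beta} \; \| y - X\beta \|_2^2
\quad \text{subject to} \quad \sum_{j=1}^{p} |\beta_j| \le t,
% or, equivalently, the penalized (Lagrangian) form
\hat{\beta}^{\mathrm{lasso}} = \arg\min_{\beta} \; \| y - X\beta \|_2^2 + \lambda \sum_{j=1}^{p} |\beta_j|.
```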
Efficient Empirical Bayes Variable Selection and Estimation in Linear Models
- Computer Science, Mathematics
- 2005
Simulations and real examples show that the proposed method is very competitive in terms of variable selection, estimation accuracy, and computation speed compared with other variable selection and estimation methods.
Flexible empirical Bayes estimation for wavelets
- Mathematics
- 2000
Wavelet shrinkage estimation is an increasingly popular method for signal denoising and compression. Although Bayes estimators can provide excellent mean‐squared error (MSE) properties, the selection…
Approaches for Bayesian Variable Selection
- Mathematics
- 1997
This paper describes and compares various hierarchical mixture prior formulations of variable selection uncertainty in normal linear regression models. These include the nonconjugate SSVS formulation…
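The SSVS formulation mentioned here uses the George-McCulloch two-component normal mixture prior, which has the standard form below (a sketch of the well-known construction, not text from this paper):

```latex
\beta_j \mid \gamma_j \sim (1 - \gamma_j) \, N(0, \tau_j^2) + \gamma_j \, N(0, c_j^2 \tau_j^2),
\qquad \gamma_j \sim \mathrm{Bernoulli}(p_j),
% with tau_j small and c_j large, so gamma_j = 1 flags variable j
% as worth including in the model.
```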
Variable selection via Gibbs sampling
- Mathematics
- 1993
Abstract A crucial problem in building a multiple regression model is the selection of predictors to include. The main thrust of this article is to propose and develop a procedure that uses…
Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Mathematics, Computer Science
- 2001
In this article, penalized likelihood approaches are proposed to handle variable selection problems, and it is shown that the newly proposed estimators perform as well as the oracle procedure in variable selection; namely, they work as well if the correct submodel were known.
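The best-known penalty from this article is SCAD, conventionally defined through its derivative (a standard statement of the Fan-Li penalty, included here for reference):

```latex
p'_{\lambda}(\theta) = \lambda \left\{ I(\theta \le \lambda)
  + \frac{(a\lambda - \theta)_+}{(a - 1)\lambda} \, I(\theta > \lambda) \right\},
\qquad \theta > 0, \; a > 2 \; (a = 3.7 \text{ suggested}),
% which penalizes like the lasso near zero but leaves large coefficients
% nearly unpenalized, yielding the oracle property.
```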
Adaptive Sparseness for Supervised Learning
- Computer Science · IEEE Trans. Pattern Anal. Mach. Intell.
- 2003
A Bayesian approach to supervised learning, which leads to sparse solutions; that is, in which irrelevant parameters are automatically set exactly to zero, and involves no tuning or adjustment of sparseness-controlling hyperparameters.
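The hyperparameter-free behavior described in this summary is, to my reading, typically obtained with a Jeffreys hyperprior on per-parameter variances (an assumption about the construction, not text from the paper):

```latex
\beta_j \mid \tau_j \sim N(0, \tau_j), \qquad \pi(\tau_j) \propto 1/\tau_j,
% a noninformative hyperprior with no sparseness-controlling constant to
% tune; integrating out tau_j yields a heavy-tailed, sparsity-inducing
% marginal prior on beta_j.
```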