Corpus ID: 236087632

Decoupling Shrinkage and Selection for the Bayesian Quantile Regression

David Kohns and T. Szendrei

This paper extends the idea of decoupling shrinkage and sparsity for continuous priors to Bayesian Quantile Regression (BQR). The procedure follows two steps: in the first step, we shrink the quantile regression posterior through state-of-the-art continuous priors, and in the second step, we sparsify the posterior through an efficient variant of the adaptive lasso, the signal adaptive variable selection (SAVS) algorithm. We propose a new variant of the SAVS which automates the choice of…
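The second step described above, SAVS, post-processes the posterior mean of a shrinkage regression by soft-thresholding each coefficient with an adaptive-lasso-style penalty. A minimal sketch of that thresholding rule is below; the function name `savs` and the penalty choice `mu_j = |beta_j|^-2` follow the usual SAVS formulation, but the exact variant proposed in the paper (which automates a tuning choice) is not reproduced here.

```python
import numpy as np

def savs(beta_hat, X):
    """Sketch of SAVS-style sparsification of a posterior mean.

    Assumes beta_hat contains nonzero posterior means; each coefficient j
    is soft-thresholded with the adaptive penalty mu_j = |beta_hat_j|^-2,
    so weak signals are pushed exactly to zero.
    """
    beta_hat = np.asarray(beta_hat, dtype=float)
    col_norm2 = np.sum(np.asarray(X, dtype=float) ** 2, axis=0)  # ||X_j||^2
    mu = np.abs(beta_hat) ** -2  # larger penalty for smaller coefficients
    shrunk = np.abs(beta_hat) * col_norm2 - mu  # signal minus penalty
    return np.sign(beta_hat) / col_norm2 * np.maximum(shrunk, 0.0)
```

For example, with a design whose columns have squared norm 4, a strong coefficient of 2.0 is only slightly shrunk while a weak coefficient of 0.1 is set exactly to zero.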


Bayesian adaptive Lasso quantile regression
Recently, variable selection by penalized likelihood has attracted much research interest. In this paper, we propose Bayesian adaptive Lasso quantile regression (BALQR). The method…
Conjugate priors and variable selection for Bayesian quantile regression
An extension of Zellner's prior which allows for a conditional conjugate prior and quantile-dependent prior on Bayesian quantile regression is proposed, and a novel prior based on percentage bend correlation for model selection is also used in Bayesian regression for the first time.
Signal Adaptive Variable Selector for the Horseshoe Prior
In this article, we propose a simple method to perform variable selection as a post model-fitting exercise using continuous shrinkage priors such as the popular horseshoe prior. The proposed Signal…
Posterior Inference in Bayesian Quantile Regression with Asymmetric Laplace Likelihood
The paper discusses the asymptotic validity of posterior inference of pseudo-Bayesian quantile regression methods with complete or censored data when an asymmetric Laplace likelihood is…
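The asymmetric Laplace likelihood referenced in this entry is what makes Bayesian quantile regression tractable: maximizing it is equivalent to minimizing the standard quantile check loss. As a brief reminder (standard notation, not taken from the entry itself):

```latex
f(y \mid \mu, \sigma, \tau) = \frac{\tau(1-\tau)}{\sigma}
  \exp\!\Big\{-\rho_\tau\!\Big(\frac{y-\mu}{\sigma}\Big)\Big\},
\qquad
\rho_\tau(u) = u\,\big(\tau - \mathbb{1}\{u < 0\}\big),
```

so the posterior mode under a flat prior coincides with the classical quantile regression estimate at quantile level $\tau$.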
Nowcasting Tail Risks to Economic Activity with Many Indicators
This paper focuses on tail risk nowcasts of economic activity, measured by GDP growth, with a potentially wide array of monthly and weekly information. We consider different models (Bayesian mixed…
Spike and slab variable selection: Frequentist and Bayesian strategies
Variable selection in the linear regression model takes many apparent faces from both frequentist and Bayesian standpoints. In this paper we introduce a variable selection method referred to as a…
The Bayesian Lasso
The Lasso estimate for linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the regression parameters have independent Laplace (i.e., double-exponential) priors.
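The posterior-mode equivalence stated in this entry can be written out directly. Under a Gaussian likelihood with independent Laplace priors on the coefficients (standard notation, not drawn from the snippet itself):

```latex
\hat{\beta}^{\text{mode}}
  = \arg\max_{\beta}\;
    \exp\!\Big\{-\tfrac{1}{2\sigma^2}\lVert y - X\beta \rVert_2^2\Big\}
    \prod_{j=1}^{p} \tfrac{\lambda}{2}\, e^{-\lambda |\beta_j|}
  = \arg\min_{\beta}\;
    \lVert y - X\beta \rVert_2^2 + 2\sigma^2 \lambda \lVert \beta \rVert_1,
```

which is the lasso objective with penalty parameter $2\sigma^2\lambda$.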
Inducing Sparsity and Shrinkage in Time-Varying Parameter Models
Time-varying parameter (TVP) models have the potential to be over-parameterized, particularly when the number of variables in the model is large. Global-local priors are increasingly used to…
Shrink Globally, Act Locally: Sparse Bayesian Regularization and Prediction
We study the classic problem of choosing a prior distribution for a location parameter β = (β1, …, βp) as p grows large. First, we study the standard "global-local shrinkage" approach, based on…
Regression Shrinkage and Selection via the Lasso
We propose a new method for estimation in linear models. The 'lasso' minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a…
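The constrained formulation described in this last entry can be stated compactly (standard notation; the bound $t$ is the tuning constant left truncated in the snippet):

```latex
\hat{\beta}^{\text{lasso}}
  = \arg\min_{\beta}\;
    \sum_{i=1}^{n}\Big(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Big)^2
\quad \text{subject to} \quad
\sum_{j=1}^{p} |\beta_j| \le t.
```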