GRADIENT-INDUCED MODEL-FREE VARIABLE SELECTION WITH COMPOSITE QUANTILE REGRESSION

@article{Lv2018GRADIENTINDUCEDMV,
  title={GRADIENT-INDUCED MODEL-FREE VARIABLE SELECTION WITH COMPOSITE QUANTILE REGRESSION},
  author={Shaogao Lv and Xin He and Junhui Wang},
  journal={Statistica Sinica},
  year={2018},
  volume={28},
  pages={1521-1538}
}
Variable selection is central to sparse modeling, and many methods have been proposed under various model assumptions. Most existing methods are based on an explicit functional relationship, whereas we are concerned with a model-free variable selection method that identifies the informative variables related to the response by simultaneously examining the sparsity in multiple conditional quantile functions. It does not require specification of the underlying model for the response…
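For orientation, a schematic sketch of the idea in our own notation (not necessarily the paper's exact formulation; the norm on the gradients is left generic): with quantile levels \tau_k = k/(K+1) and check loss \rho_\tau(u) = u(\tau - I(u < 0)), one fits the conditional quantile functions jointly and places a group penalty, for each covariate, on its gradients across all quantile levels,

  \min_{f_{\tau_1}, \ldots, f_{\tau_K}} \sum_{k=1}^{K} \frac{1}{n} \sum_{i=1}^{n} \rho_{\tau_k}\big( y_i - f_{\tau_k}(x_i) \big) + \lambda \sum_{j=1}^{p} \Big( \sum_{k=1}^{K} \big\| \partial f_{\tau_k} / \partial x_j \big\|^2 \Big)^{1/2},

so that X_j is flagged as informative exactly when some fitted gradient \partial f_{\tau_k} / \partial x_j is substantially non-zero.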

Citations

A gradient-based variable selection for binary classification in reproducing kernel Hilbert space
TLDR
A gradient-based representation of the large-margin classifier is proposed, and the gradient functions are then regularized by the group-lasso penalty to obtain sparse gradients that naturally lead to variable selection.
Learning sparse conditional distribution: An efficient kernel-based approach
TLDR
A novel method is proposed to recover the sparse structure of the conditional distribution, which plays a crucial role in subsequent statistical analyses such as prediction, forecasting, and conditional distribution estimation; the method can be efficiently implemented by optimizing its dual form.
Detection of similar successive groups in a model with diverging number of variable groups
In this article, a linear model with grouped explanatory variables is considered. The idea is to automatically detect distinct successive groups of the unknown coefficients…
Sparse Learning in reproducing kernel Hilbert space
TLDR
A unified and universal method for learning the sparsity of M-estimators within a rich family of loss functions in a reproducing kernel Hilbert space (RKHS) that works for general loss functions, admits general dependence structures, allows for efficient computation, and comes with theoretical guarantees.
Bayesian reciprocal LASSO quantile regression
The reciprocal LASSO estimate for linear regression corresponds to a posterior mode when independent inverse Laplace priors are assigned on the regression coefficients. This paper studies reciprocal…
Gradient-induced Model-free Variable Selection Based on Composite Quantile Regression in Reproducing Kernel Hilbert Space
Variable selection plays an important role in identifying truly informative variables in high-dimensional data analysis. In this paper, we propose a variable selection method with composite quantile regression…

References

Showing 1-10 of 40 references
Model-free Variable Selection in Reproducing Kernel Hilbert Space
TLDR
A model-free variable selection method is introduced that learns the gradient functions, exploiting the equivalence between a variable being informative and its corresponding gradient function being substantially non-zero.
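In symbols, the equivalence driving this approach (a paraphrase, with f^* our notation for the target regression function) reads

  X_j is uninformative  \iff  \partial f^*(x) / \partial x_j = 0 for almost all x,

so estimating the gradient vector and measuring the size of each component \| \partial f^* / \partial x_j \| turns variable selection into recovering the sparsity pattern of the gradients.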
Consistent selection of tuning parameters via variable selection stability
TLDR
A general tuning parameter selection criterion based on variable selection stability is introduced to select the tuning parameters so that the resultant penalized regression model is stable in variable selection.
Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
TLDR
In this article, penalized likelihood approaches are proposed to handle variable selection problems, and it is shown that the newly proposed estimators perform as well as the oracle procedure in variable selection; namely, they work as well as if the correct submodel were known.
CONSISTENT VARIABLE SELECTION IN ADDITIVE MODELS
We propose a penalized polynomial spline method for simultaneous model estimation and variable selection in additive models. It approximates nonparametric functions by polynomial splines, and…
Model selection and estimation in regression with grouped variables
We consider the problem of selecting grouped variables (factors) for accurate prediction in regression. Such a problem arises naturally in many practical situations with the multifactor…
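In its now-standard form (stated here for reference, in our notation), the group-lasso criterion of this paper solves

  \min_{\beta} \frac{1}{2} \big\| y - \sum_{g=1}^{G} X_g \beta_g \big\|_2^2 + \lambda \sum_{g=1}^{G} \sqrt{p_g} \, \| \beta_g \|_2,

where \beta_g collects the p_g coefficients of group g; the unsquared \ell_2 norm sets entire groups to zero at once, selecting factors rather than individual coefficients.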
L1-Norm Quantile Regression
Classical regression methods have focused mainly on estimating conditional mean functions. In recent years, however, quantile regression has emerged as a comprehensive approach to the statistical…
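Concretely, for a single level \tau \in (0,1), the L1-norm quantile regression estimator penalizes the check-loss fit with a lasso term (a sketch in our notation, with \rho_\tau the check loss defined above),

  \min_{\beta_0, \beta} \sum_{i=1}^{n} \rho_\tau\big( y_i - \beta_0 - x_i^\top \beta \big) + \lambda \| \beta \|_1,

which keeps the whole problem a linear program and shrinks irrelevant coefficients exactly to zero.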
Variable Selection and Function Estimation in Additive Nonparametric Regression Using a Data-Based Prior
A hierarchical Bayesian approach is proposed for variable selection and function estimation in additive nonparametric Gaussian regression models and additive nonparametric binary regression…
Composite quantile regression and the oracle Model Selection Theory
Coefficient estimation and variable selection in multiple linear regression is routinely done in the (penalized) least squares (LS) framework. The concept of model selection oracle introduced by Fan…
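Composite quantile regression (CQR), sketched here in our notation, shares one slope vector across K equally spaced levels \tau_k = k/(K+1),

  \min_{b_1, \ldots, b_K, \, \beta} \sum_{k=1}^{K} \sum_{i=1}^{n} \rho_{\tau_k}\big( y_i - b_k - x_i^\top \beta \big),

gaining estimation efficiency over any single quantile level while remaining robust to heavy-tailed errors.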
VARIABLE SELECTION IN QUANTILE REGRESSION
TLDR
The SCAD penalty function is decomposed as the difference of two convex functions, and the corresponding optimization problem is solved with the Difference Convex Algorithm (DCA).
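For reference, the SCAD penalty p_\lambda is defined through its derivative (for \theta > 0 and some a > 2),

  p_\lambda'(\theta) = \lambda \Big\{ I(\theta \le \lambda) + \frac{(a\lambda - \theta)_+}{(a - 1)\lambda} \, I(\theta > \lambda) \Big\},

and since p_\lambda is concave on [0, \infty), writing p_\lambda(\theta) = \lambda \theta - q_\lambda(\theta) makes q_\lambda(\theta) := \lambda \theta - p_\lambda(\theta) convex; each DCA iteration linearizes q_\lambda at the current iterate and solves the resulting convex, weighted-L1 quantile regression problem.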
A unified penalized method for sparse additive quantile models: an RKHS approach
TLDR
A new sparsity-smoothness penalty over a reproducing kernel Hilbert space (RKHS) is proposed, which includes linear functions and spline-based nonlinear functions as special cases, and a majorize-minimization forward splitting iterative algorithm is developed for efficient computation.