Least angle regression
@article{Efron2004LeastAR,
  title   = {Least angle regression},
  author  = {Bradley Efron and Trevor J. Hastie and Iain M. Johnstone and Robert Tibshirani},
  journal = {Annals of Statistics},
  year    = {2004},
  volume  = {32},
  pages   = {407--499}
}
The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a response variable. Least Angle Regression (LARS), a new model selection algorithm, is a useful and less greedy version of traditional…
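As a concrete illustration (not code from the paper itself), scikit-learn's implementation can trace the full LARS path on the diabetes data that the paper uses as its running example:

from sklearn.datasets import load_diabetes   # the paper's running-example dataset
from sklearn.linear_model import lars_path

X, y = load_diabetes(return_X_y=True)

# method="lar" follows the pure Least Angle Regression steps;
# method="lasso" adds the modification that yields the full Lasso path.
alphas, active, coefs = lars_path(X, y, method="lar")

# Variables enter one at a time, each chosen for its correlation with the
# current residual -- the "less greedy" analogue of forward selection.
print("Order in which variables enter:", active)
print("Coefficient path shape (n_features, n_steps):", coefs.shape)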
8,226 Citations
Linear model selection based on extended robust least angle regression
- Computer Science
- 2012
Extended Robust LARS is proposed via generalized definitions of correlation that include correlations between nominal and quantitative variables.
Efficient least angle regression for identification of linear-in-the-parameters models
- Mathematics
- Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
- 2017
A detailed computational complexity analysis indicates that the proposed algorithm is significantly more efficient than the original approach, in which the well-known Cholesky decomposition is used to solve least angle regression.
Variable Inclusion and Shrinkage Algorithms
- Computer Science
- 2008
It is shown through extensive simulations that VISA significantly outperforms the Lasso and also provides improvements over more recent procedures, such as the Dantzig selector, relaxed Lasso, and adaptive Lasso.
A Modified Least Angle Regression Algorithm for Hierarchical Interaction
- Computer Science
- 2008
A modified LARS algorithm allows the heredity structure between main and interaction effects to be taken into account; existing LASSO algorithms with heredity constraints cannot be executed when the number of main effects is high, owing to the computational burden when the number of covariates is large.
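For context, the heredity constraint in question (stated here in its standard strong form, not quoted from this paper) allows an interaction into the model only when both parent main effects are present:

\beta_{jk} \neq 0 \;\Longrightarrow\; \beta_j \neq 0 \ \text{ and } \ \beta_k \neq 0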
Variable Selection in Linear Regression With Many Predictors
- Mathematics
- 2009
With advanced capability in data collection, applications of linear regression analysis now often involve a large number of predictors. Variable selection thus has become an increasingly important…
A Survey of Methods in Variable Selection and Penalized Regression
- Computer Science
- 2020
This paper reviews variable selection methods in linear regression, grouped into two categories: sequential methods, such as forward selection, backward elimination, and stepwise regression; and penalized methods, also called shrinkage or regularization methods, including the LASSO, elastic net, and so on.
Parameter Selection Algorithm For Continuous Variables
- Mathematics
- 2017
In this article, we propose a new algorithm for supervised learning methods, by which one can both capture non-linearity in the data and find the best subset model. To produce an enhanced subset…
Variable Selection Using a Smooth Information Criterion for Distributional Regression Models
- Mathematics
- 2021
Modern variable selection procedures make use of penalization methods to execute simultaneous model selection and estimation. A popular method is the LASSO (least absolute shrinkage and selection…
Improved variable selection with Forward-Lasso adaptive shrinkage
- Computer Science
- 2011
This work proposes a new approach, "Forward-Lasso Adaptive SHrinkage" (FLASH), which includes the Lasso and Forward Selection as special cases, and can be used in both the linear regression and the Generalized Linear Model domains.
Robust Model Selection with LARS Based on S-estimators
- Mathematics
- COMPSTAT
- 2010
This work introduces outlier-robust versions of the LARS algorithm built on the S-estimators for regression of Rousseeuw and Yohai (1984), and shows that the algorithm is computationally efficient and suitable even when the number of variables exceeds the sample size.
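For background, the S-estimator of Rousseeuw and Yohai (standard definition, paraphrased rather than quoted from this entry) replaces the residual sum of squares with a robust M-scale of the residuals:

\hat{\beta}_S = \operatorname*{arg\,min}_{\beta}\ s(\beta), \qquad \text{where } s(\beta) \text{ solves } \ \frac{1}{n}\sum_{i=1}^{n} \rho\!\left(\frac{y_i - x_i^{\top}\beta}{s}\right) = b,

with \rho a bounded loss function and b a constant chosen to fix the breakdown point.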
References
Showing 1–10 of 40 references
The Little Bootstrap and other Methods for Dimensionality Selection in Regression: X-Fixed Prediction Error
- Mathematics
- 1992
When a regression problem contains many predictor variables, it is rarely wise to try to fit the data by means of a least squares regression on all of the predictor variables. Usually, a…
Gaussian model selection
- Computer Science
- 2001
Our purpose in this paper is to provide a general approach to model selection via penalization for Gaussian regression and to develop our point of view about this subject. The advantage and…
Regression Shrinkage and Selection via the Lasso
- Computer Science
- 1996
A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
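Written out, the estimator this snippet describes is

\hat{\beta}^{\text{lasso}} = \operatorname*{arg\,min}_{\beta}\ \lVert y - X\beta \rVert_2^2 \quad \text{subject to} \quad \sum_{j=1}^{p} \lvert \beta_j \rvert \le t,

where the bound t controls the amount of shrinkage and sets some coefficients exactly to zero.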
A new approach to variable selection in least squares problems
- Mathematics, Computer Science
- 2000
A compact descent method for solving the constrained problem for a particular value of κ is formulated, and a homotopy method, in which the constraint bound κ becomes the homotopy parameter, is developed to completely describe the possible selection regimes.
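The structural fact the homotopy method exploits (a standard property of this problem, stated here for context) is that the solution path is piecewise linear in the bound:

\hat{\beta}(\kappa) = \hat{\beta}(\kappa_k) + (\kappa - \kappa_k)\, d_k, \qquad \kappa_k \le \kappa \le \kappa_{k+1},

where the breakpoints \kappa_k are the values at which the active set changes and the direction d_k is constant on each segment, so tracking the breakpoints describes every selection regime.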
Calibration and empirical Bayes variable selection
- Mathematics
- 2000
For the problem of variable selection for the normal linear model, selection criteria such as AIC, C_p, BIC and RIC have fixed dimensionality penalties. Such criteria are shown to correspond to…
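These criteria all minimize a penalized residual sum of squares of the same form; the standard fixed penalties (stated from general knowledge, not quoted from this entry) are

\mathrm{RSS}(M)/\hat{\sigma}^2 + \lambda\,\lvert M \rvert, \qquad \lambda = 2\ (\text{AIC},\ C_p), \quad \lambda = \log n\ (\text{BIC}), \quad \lambda = 2\log p\ (\text{RIC}).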
On the LASSO and its Dual
- Computer Science, Mathematics
- 2000
Consideration of the primal and dual problems together leads to important new insights into the characteristics of the LASSO estimator and to an improved method for estimating its covariance matrix.
Linear Model Selection by Cross-validation
- Mathematics
- 1993
We consider the problem of selecting a model having the best predictive ability among a class of linear models. The popular leave-one-out cross-validation method, which is asymptotically…
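As a side note (a standard least-squares identity, not taken from this reference), leave-one-out residuals for a linear model have a closed form through the hat matrix, so LOOCV model comparison needs no refitting; a minimal sketch:

import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + rng.standard_normal(n)

H = X @ np.linalg.solve(X.T @ X, X.T)   # hat matrix of the least-squares fit
resid = y - H @ y                       # ordinary in-sample residuals
loo = resid / (1.0 - np.diag(H))        # exact leave-one-out residuals
print("LOOCV estimate of prediction error:", np.mean(loo**2))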
Greedy function approximation: A gradient boosting machine.
- Computer Science
- 2001
A general gradient descent boosting paradigm is developed for additive expansions based on any fitting criterion, and specific algorithms are presented for least-squares, least absolute deviation, and Huber-M loss functions for regression, and multiclass logistic likelihood for classification.
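A minimal sketch of the least-squares case (illustrative only; the function and parameter names are mine, not Friedman's): each round fits a small tree to the negative gradient of the loss, which for squared error is simply the current residual vector.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def boost(X, y, n_rounds=100, lr=0.1, depth=2):
    """Least-squares gradient boosting with shallow regression trees."""
    f = np.full(len(y), y.mean())            # start from the constant fit
    trees = []
    for _ in range(n_rounds):
        residual = y - f                     # negative gradient of squared-error loss
        tree = DecisionTreeRegressor(max_depth=depth).fit(X, residual)
        f += lr * tree.predict(X)            # shrunken additive update
        trees.append(tree)
    return trees, f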
The risk inflation criterion for multiple regression
- Economics
- 1994
A new criterion is proposed for the evaluation of variable selection procedures in multiple regression. This criterion, which we call the risk inflation, is based on an adjustment to the risk…
On Measuring and Correcting the Effects of Data Mining and Model Selection
- Computer Science
- 1998
The concept of GDF offers a unified framework under which complex and highly irregular modeling procedures can be analyzed in the same way as classical linear models, and many difficult problems can be solved easily.
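As commonly stated (paraphrasing the standard definition rather than quoting this entry), the generalized degrees of freedom of a fitting procedure sum the sensitivities of the fitted values to their own observations,

\mathrm{GDF} = \sum_{i=1}^{n} \frac{\partial\, \mathrm{E}[\hat{\mu}_i]}{\partial \mu_i},

which reduces to the trace of the hat matrix, and hence the number of parameters, for a fixed least-squares fit.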