A new approach to variable selection in least squares problems

@article{Osborne2000ANA,
  title={A new approach to variable selection in least squares problems},
  author={Michael R. Osborne and Brett Presnell and Berwin A. Turlach},
  journal={IMA Journal of Numerical Analysis},
  year={2000},
  volume={20},
  pages={389--403}
}
The title Lasso has been suggested by Tibshirani (1996) as a colourful name for a technique of variable selection which requires the minimization of a sum of squares subject to an ℓ1 bound κ on the solution. This forces zero components in the minimizing solution for small values of κ. Thus this bound can function as a selection parameter. This paper makes two contributions to computational problems associated with implementing the Lasso: (1) a compact descent method for solving the…
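
To make the role of the bound concrete, here is a minimal sketch of the bound-form problem, min ½‖y − Xβ‖² subject to ‖β‖1 ≤ κ, solved by projected gradient with an exact ℓ1-ball projection (Duchi et al., 2008). This is not the paper's compact descent method; the toy data and function names are invented for the illustration.

```python
import numpy as np

def project_l1_ball(v, kappa):
    """Euclidean projection of v onto the l1 ball of radius kappa."""
    if np.abs(v).sum() <= kappa:
        return v
    u = np.sort(np.abs(v))[::-1]                   # magnitudes, descending
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, v.size + 1) > css - kappa)[0][-1]
    theta = (css[rho] - kappa) / (rho + 1.0)       # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def lasso_bound_form(X, y, kappa, n_iter=5000):
    """min 0.5*||y - X b||^2 subject to ||b||_1 <= kappa, by projected gradient."""
    beta = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, 2) ** 2         # 1/L, L = Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        beta = project_l1_ball(beta - step * grad, kappa)
    return beta

# Invented toy data: a small bound kappa forces exact zeros in the solution.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8))
y = X[:, :3] @ np.array([3.0, -2.0, 1.5]) + 0.1 * rng.standard_normal(50)
print(lasso_bound_form(X, y, kappa=2.0))           # most components come out exactly 0
```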

Simultaneous Variable Selection

A new method is proposed for selecting a common subset of explanatory variables with which to model several response variables; it minimizes the (joint) residual sum of squares while constraining the parameter estimates to lie within a suitable polyhedral region.

Active Set Algorithms for the LASSO

This thesis studies the computation of the Least Absolute Shrinkage and Selection Operator (LASSO) and derived problems in regression analysis, and examines how three algorithms (active set, homotopy, and coordinate descent) handle certain limit cases and can be applied to extended problems.
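
As a reference point for the last of those three algorithms, a textbook cyclic coordinate descent for the penalized form min ½‖y − Xβ‖² + λ‖β‖1 can be sketched in a few lines. This is the generic scheme, not the thesis's implementation, and all names are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Cyclic coordinate descent for min 0.5*||y - X b||^2 + lam*||b||_1."""
    beta = np.zeros(X.shape[1])
    col_sq = (X ** 2).sum(axis=0)       # ||X_j||^2 for every column
    r = y.astype(float).copy()          # residual y - X beta (beta starts at 0)
    for _ in range(n_sweeps):
        for j in range(X.shape[1]):
            r += X[:, j] * beta[j]      # remove coordinate j's contribution
            beta[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
            r -= X[:, j] * beta[j]      # restore with the updated value
    return beta
```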

On the LASSO and its Dual

Consideration of the primal and dual problems together leads to important new insights into the characteristics of the LASSO estimator and to an improved method for estimating its covariance matrix.
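
One immediately usable consequence of the dual view is an optimality test: at a solution of the penalized lasso, every correlation |X_jᵀ(y − Xβ̂)| is at most λ, with equality wherever β̂_j ≠ 0. A small numpy sketch of that check (the tolerance handling is our own choice):

```python
import numpy as np

def check_lasso_kkt(X, y, beta, lam, tol=1e-6):
    """Stationarity test for min 0.5*||y - X b||^2 + lam*||b||_1:
    |X_j'(y - X b)| == lam on active coordinates, <= lam elsewhere."""
    corr = X.T @ (y - X @ beta)
    active = beta != 0
    ok_active = np.allclose(corr[active], lam * np.sign(beta[active]), atol=tol)
    ok_inactive = np.all(np.abs(corr[~active]) <= lam + tol)
    return ok_active and ok_inactive
```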

The Iso-regularization Descent Algorithm for the LASSO

An adaptation of this algorithm that solves the regularized problem, has a simpler formulation, and outperforms state-of-the-art algorithms in terms of speed is given.

The Iso-lambda Descent Algorithm for the LASSO

An adaptation of this algorithm that solves the regularized problem, has a simpler formulation, and outperforms state-of-the-art algorithms in terms of speed is given.

Addendum: Regularization and variable selection via the elastic net

The piecewise linearity of the lasso solution path was first proved by Osborne et al. (2000), who also described an efficient algorithm for calculating the complete lasso solutions path.
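
scikit-learn ships a homotopy-style implementation that exposes exactly this structure: lars_path with method="lasso" returns the knots of the path and the coefficients at each knot, with the path linear in between. The toy data below are invented for the demonstration.

```python
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 10))
y = X[:, :4] @ np.array([4.0, -3.0, 2.0, -1.0]) + 0.1 * rng.standard_normal(60)

# alphas holds the knots of the path; coefs[:, k] is the solution at alphas[k],
# and the coefficients move linearly between consecutive knots.
alphas, active, coefs = lars_path(X, y, method="lasso")
print(alphas.shape, coefs.shape)
```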

A gradient-based optimization algorithm for LASSO

  • Kim
  • 2008
The gradient LASSO algorithm is computationally more stable than QP based algorithms because it does not require matrix inversions, and thus it can be more easily applied to high dimensional data.

Gradient LASSO for feature selection

This paper proposes a gradient descent algorithm for LASSO, and provides the convergence rate of the algorithm, and illustrates it with simulated models as well as real data sets.
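
A closely related scheme that shares this inversion-free property is the proximal gradient (ISTA) iteration: a plain gradient step on the squared error followed by componentwise soft-thresholding. The sketch below is that generic iteration, not the authors' gradient LASSO algorithm itself.

```python
import numpy as np

def lasso_ista(X, y, lam, n_iter=3000):
    """Proximal gradient (ISTA) for min 0.5*||y - X b||^2 + lam*||b||_1.
    Only matrix-vector products are used; no matrix is ever inverted."""
    beta = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, 2) ** 2             # 1/L for the smooth part
    for _ in range(n_iter):
        z = beta - step * (X.T @ (X @ beta - y))       # gradient step
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold
    return beta
```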

A Homotopy Algorithm for the Quantile Regression Lasso and Related Piecewise Linear Problems

We show that the homotopy algorithm of Osborne, Presnell, and Turlach (2000), which has proved such an effective optimal path following method for implementing Tibshirani’s “lasso” for variable selection…

Safe optimization algorithms for variable selection and hyperparameter tuning

This work proposes a unified framework for identifying important structures in these convex optimization problems and builds on the “Gap Safe Screening Rules”, a recently introduced technique that ignores some variables during the optimization process by exploiting the expected sparsity of the solutions.
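
For the lasso, the published Gap Safe sphere test discards feature j whenever |X_jᵀθ| + r‖X_j‖ < 1, where θ is a dual-feasible point built from the current residual and r is a radius derived from the duality gap. The sketch below follows that rule, but the function and its interface are our own simplification.

```python
import numpy as np

def gap_safe_screen(X, y, beta, lam):
    """Gap Safe sphere test for min 0.5*||y - X b||^2 + lam*||b||_1.
    Returns a mask: True means b_j = 0 at the optimum, so j can be dropped."""
    r = y - X @ beta                                   # residual at the current iterate
    theta = r / max(lam, np.abs(X.T @ r).max())        # dual-feasible rescaling
    primal = 0.5 * r @ r + lam * np.abs(beta).sum()
    dual = 0.5 * y @ y - 0.5 * lam ** 2 * ((theta - y / lam) ** 2).sum()
    radius = np.sqrt(2.0 * max(primal - dual, 0.0)) / lam
    return np.abs(X.T @ theta) + radius * np.linalg.norm(X, axis=0) < 1.0
```
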
...

Regression Shrinkage and Selection via the Lasso

A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.

Resolving degeneracy in quadratic programming

A technique for the resolution of degeneracy in an Active Set Method for Quadratic Programming is described, which generalises Fletcher's method [2] which applies to the LP case and gives stronger guarantees than are available with other popular methods.

An effective method for computing regression quantiles

Regression quantiles were introduced in Koenker & Bassett [7] as quantities of interest in developing robust estimation procedures. They can be computed by linear programming combined with post…
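
The linear program in question is short enough to write out: splitting the residual into nonnegative parts u⁺ and u⁻ turns the check-loss minimization into a standard-form LP. A sketch using scipy's linprog (the helper name is ours; the formulation is the classical one):

```python
import numpy as np
from scipy.optimize import linprog

def regression_quantile(X, y, tau):
    """tau-th regression quantile via the LP
    min tau*1'u+ + (1-tau)*1'u-  s.t.  X b + u+ - u- = y,  u+, u- >= 0."""
    n, p = X.shape
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]
```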

The Levenberg-Marquardt algorithm: Implementation and theory


On Linear Restricted and Interval Least-Squares Problems

Two classes of algorithms for solving the constrained linear least-squares problem are studied.

Applied Regression Analysis

  • R. Gunst
  • Technometrics
  • 1999