On the LASSO and its dual

@article{Osborne2000OnTL,
  title={On the LASSO and its dual},
  author={Michael R. Osborne and Brett Presnell and Berwin A. Turlach},
  journal={Journal of Computational and Graphical Statistics},
  year={2000},
  volume={9},
  pages={319-337}
}
Abstract Proposed by Tibshirani, the least absolute shrinkage and selection operator (LASSO) estimates a vector of regression coefficients by minimizing the residual sum of squares subject to a constraint on the l1-norm of the coefficient vector. The LASSO estimator typically has one or more zero elements and thus shares characteristics of both shrinkage estimation and variable selection. In this article we treat the LASSO as a convex programming problem and derive its dual. Consideration of…
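For orientation, here are the primal problem in its two standard forms, together with the dual most often quoted for the penalized form in the later literature (e.g., the generalized-lasso paper cited below); this is a sketch of the textbook result, not necessarily the exact parametrization derived in the article.

```latex
% Constrained ("bound") form of the LASSO (Tibshirani, 1996):
\min_{\beta \in \mathbb{R}^p} \; \|y - X\beta\|_2^2
  \quad \text{subject to} \quad \|\beta\|_1 \le t .

% Equivalent penalized form, for a suitable \lambda = \lambda(t) \ge 0:
\min_{\beta \in \mathbb{R}^p} \; \tfrac{1}{2}\|y - X\beta\|_2^2 + \lambda\|\beta\|_1 .

% A standard dual of the penalized form; at the optimum the dual
% variable u equals the residual y - X\hat{\beta}:
\max_{u \in \mathbb{R}^n} \; \tfrac{1}{2}\|y\|_2^2 - \tfrac{1}{2}\|y - u\|_2^2
  \quad \text{subject to} \quad \|X^\top u\|_\infty \le \lambda .
```

The box constraint on X^T u gives the dual its useful geometry: any coordinate j with |x_j^T u| < lambda at the optimum necessarily has beta_j = 0, which is one way to read the sparsity of the LASSO.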

Citations

Active Set Algorithms for the LASSO
TLDR
This thesis addresses the computation of the Least Absolute Shrinkage and Selection Operator (LASSO) and derived problems in regression analysis, and examines how three algorithms (active set, homotopy, and coordinate descent) handle some limiting cases and can be applied to extended problems.
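As a concrete instance of the third family, here is a minimal cyclic coordinate descent sketch for the penalized problem min_b 0.5*||y - X b||^2 + lam*||b||_1; the names lasso_cd and soft_threshold are illustrative, not taken from the thesis.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding: the exact scalar minimizer behind each update."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for 0.5*||y - X b||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)   # ||x_j||^2 for each column
    r = y.copy()                    # residual for beta = 0
    for _ in range(n_iter):
        for j in range(p):
            if col_sq[j] == 0.0:
                continue
            r += X[:, j] * beta[j]  # residual with coordinate j removed
            beta[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
            r -= X[:, j] * beta[j]  # restore the full residual
    return beta
```

Each coordinate update solves its one-dimensional subproblem exactly, which is where the soft-thresholding operator comes from; cycling until the updates stall yields the solution for that fixed lam.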
The dual and degrees of freedom of linearly constrained generalized lasso
The Iso-regularization Descent Algorithm for the LASSO
TLDR
An adaptation of this algorithm is given that solves the regularized problem, has a simpler formulation, and outperforms state-of-the-art algorithms in terms of speed.
Applications of l1 regularisation
TLDR
The lasso algorithm for variable selection in linear models works by imposing an l1-norm bound constraint on the variables in a least squares model and then tuning the model estimation calculation using this bound, leading to a study of the selection problem for different objective and constraint choices.
The Iso-lambda Descent Algorithm for the LASSO
TLDR
An adaptation of this algorithm is given that solves the regularized problem, has a simpler formulation, and outperforms state-of-the-art algorithms in terms of speed.
Applications of l1 regularisation
TLDR
The piecewise linear estimation problem can be solved for each value of the l1 bound by a relatively efficient simplicial descent algorithm, which can be used to explore trajectory information in a manner at least competitive with the homotopy algorithm in this context.
Estimation Consistency of the Group Lasso and its Applications
TLDR
The main theorem shows that the group Lasso achieves estimation consistency under a mild condition, and that an asymptotic upper bound on the number of selected variables can be obtained.
A complex version of the LASSO algorithm and its application to beamforming
TLDR
This paper proposes an alternative complex version of the LASSO algorithm, applied to beamforming, that aims to decrease the overall computational complexity by zeroing some weights, and presents simulation results for various values of the coefficient vector's l1-norm, chosen so that different numbers of zero entries appear in the coefficient vector.
The sparsity and bias of the Lasso selection in high-dimensional linear regression
Meinshausen and Bühlmann [Ann. Statist. 34 (2006) 1436-1462] showed that, for neighborhood selection in Gaussian graphical models, under a neighborhood stability condition, the LASSO is consistent…
The solution path of the generalized lasso
TLDR
This work derives an unbiased estimate of the degrees of freedom of the generalized lasso fit for an arbitrary penalty matrix D, which turns out to be quite intuitive in many applications.

References

SHOWING 1-10 OF 34 REFERENCES
Regression Shrinkage and Selection via the Lasso
TLDR
A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
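In practice this estimator is usually computed in its penalized form; a minimal scikit-learn example (note that sklearn's Lasso minimizes (1/(2n))*||y - X w||^2 + alpha*||w||_1, so alpha is a per-sample rescaling of the penalty parameter):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
beta_true = np.r_[3.0, -2.0, 1.5, np.zeros(7)]   # sparse ground truth
y = X @ beta_true + 0.1 * rng.standard_normal(100)

model = Lasso(alpha=0.1).fit(X, y)
print(model.coef_)   # typically contains exact zeros: shrinkage and selection at once
```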
A new approach to variable selection in least squares problems
TLDR
A compact descent method for solving the constrained problem for a particular value of κ is formulated, and a homotopy method, in which the constraint bound κ becomes the homotopy parameter, is developed to completely describe the possible selection regimes.
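A homotopy-type path algorithm in the spirit of this reference is available in scikit-learn as lars_path with method="lasso"; the sketch below is a library illustration of the idea, not the authors' own implementation.

```python
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(100)

# method="lasso" adds the lasso modification to LARS, so variables may also
# leave the active set as the bound (the homotopy parameter) grows.
alphas, active, coefs = lars_path(X, y, method="lasso")
print(coefs.shape)   # (n_features, n_kinks): one column per breakpoint of the path
```

Because the coefficient path is piecewise linear in the bound, the handful of returned breakpoints completely describes every selection regime, which is exactly the point of the homotopy formulation.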
Penalized Regressions: The Bridge versus the Lasso
TLDR
It is shown that bridge regression performs well compared to the lasso and ridge regression, as demonstrated through an analysis of prostate cancer data.
Block Coordinate Relaxation Methods for Nonparametric Wavelet Denoising
TLDR
This article investigates an alternative optimization approach based on block coordinate relaxation (BCR) for sets of basis functions that are the finite union of sets of orthonormal basis functions (e.g., wavelet packets), and shows that the BCR algorithm is globally convergent and, empirically, faster than the interior point (IP) algorithm for a variety of signal denoising problems.
Subset Selection in Regression
Objectives: Prediction, Explanation, Elimination or What? How Many Variables in the Prediction Formula? Alternatives to Using Subsets. 'Black Box' Use of Best-Subsets Techniques. Least-Squares…
Block coordinate relaxation methods for nonparametric signal denoising with wavelet dictionaries
TLDR
It is shown that the BCR algorithm is globally convergent and, empirically, faster than the IP algorithm for a variety of signal denoising problems.
On the asymptotic performance of median smoothers in image analysis and nonparametric regression
For d-dimensional images and regression functions the true object is estimated by median smoothing. The mean square error of the median smoother is calculated using the framework of M-estimation…
Statistical modelling and latent variables
Criminometrics, Latent Variables, and Panel Data (J. Aasness, E. Eide, T. Skjerpen). Scale Construction by Maximizing Reliability (D.J. Bartholomew, M. Knott). Finite Sample Properties of Limited…
Atomic Decomposition by Basis Pursuit
TLDR
Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest l1 norm of coefficients among all such decompositions.
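Basis pursuit can be recast as a linear program by splitting the coefficients into positive and negative parts; a minimal sketch with scipy (the helper name basis_pursuit is illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, s):
    """min ||c||_1 subject to A c = s, as an LP with c = u - v and u, v >= 0."""
    n = A.shape[1]
    cost = np.ones(2 * n)                   # sum(u) + sum(v) equals ||c||_1
    A_eq = np.hstack([A, -A])               # A (u - v) = s
    res = linprog(cost, A_eq=A_eq, b_eq=s)  # linprog's default bounds are (0, None)
    if not res.success:
        raise RuntimeError(res.message)
    u, v = res.x[:n], res.x[n:]
    return u - v
```

For an overcomplete dictionary A with more columns than rows, the LP selects, among all exact representations of s, the one with the smallest l1 norm.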