# LASSO reloaded: a variational analysis perspective with applications to compressed sensing

    @article{Berk2022LASSORA,
      title={LASSO reloaded: a variational analysis perspective with applications to compressed sensing},
      author={Aaron Berk and Simone Brugiapaglia and Tim Hoheisel},
      journal={ArXiv},
      year={2022},
      volume={abs/2205.06872}
    }

This paper provides a variational analysis of the unconstrained formulation of the LASSO problem, ubiquitous in statistical learning, signal processing, and inverse problems. In particular, we establish smoothness results for the optimal value, as well as Lipschitz properties of the optimal solution, as functions of the right-hand side (or measurement vector) and the regularization parameter. Moreover, we show how to apply the proposed variational analysis to study the sensitivity of the…
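The unconstrained LASSO studied in the paper is the problem min_x ½‖Ax − b‖₂² + λ‖x‖₁, whose solution depends on both the measurement vector b and the parameter λ. A minimal numpy sketch of solving it with the classical ISTA iteration (all names here are illustrative, not from the paper):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(A, b, lam, n_iter=500):
    # ISTA for min_x 0.5*||A x - b||_2^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the least-squares term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Tiny demo: recover a 2-sparse vector from Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10)) / np.sqrt(20)
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]
b = A @ x_true
x_hat = lasso_ista(A, b, lam=0.01)
```

Perturbing `b` or `lam` and re-solving gives an empirical feel for the Lipschitz-type solution bounds the paper establishes.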

## References

Showing 1-10 of 53 references

Sensitivity of ℓ1 minimization to parameter choice

- Computer Science
- 2020

This work investigates the stability of each LASSO program with respect to its governing parameter by analyzing the case where the measurement matrix is the identity (the so-called proximal denoising setup) with ℓ1 regularization.

The LASSO Risk for Gaussian Matrices

- Computer Science, Mathematics
- IEEE Transactions on Information Theory
- 2012

This result is the first rigorous derivation of an explicit formula for the asymptotic mean square error of the LASSO for random instances and is based on the analysis of AMP, a recently developed efficient algorithm that is inspired from graphical model ideas.

On the sparsity of LASSO minimizers in sparse data recovery

- Computer Science
- 2020

It is proved that if the data is k-sparse, then the support size of the LASSO minimizer maintains a comparable sparsity, s ≤ Cδk, and new ℓ2/ℓ1 error bounds are derived which highlight the precise dependence on k and on the LASSO parameter λ, before the error is driven below the scale of negligible measurement and compressibility errors.

On the Best Choice of Lasso Program Given Data Parameters

- Computer Science
- IEEE Transactions on Information Theory
- 2022

It is proved that a gauge-constrained Lasso program admits asymptotic cusp-like behaviour of its risk in the limiting low-noise regime, and that a residual-constrained Lasso program has asymptotically suboptimal risk for very sparse vectors.

Sub‐Gaussian Matrices on Sets: Optimal Tail Dependence and Applications

- Computer Science
- Communications on Pure and Applied Mathematics
- 2021

Random linear mappings are widely used in modern signal processing, compressed sensing, and machine learning. These mappings may be used to embed the data into a significantly lower dimension while…

Correcting for unknown errors in sparse high-dimensional function approximation

- Computer Science
- Numerische Mathematik
- 2019

The lesser-known square-root LASSO is better suited for high-dimensional approximation than the other procedures in the case of bounded noise, since it avoids (both theoretically and numerically) the need for parameter tuning.
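For reference, the square-root LASSO mentioned above replaces the squared residual of the standard LASSO with its (unsquared) norm:

$$\min_{x} \;\|Ax - b\|_2 + \lambda \|x\|_1,$$

which is the structural change that lets a single choice of $\lambda$ work without knowledge of the noise level, hence the claim that it avoids parameter tuning.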

Simultaneous Analysis of Lasso and Dantzig Selector

- Computer Science
- 2009

We show that, under a sparsity scenario, the Lasso estimator and the Dantzig selector exhibit similar behavior. For both methods, we derive, in parallel, oracle inequalities for the prediction risk…

The Lasso Problem and Uniqueness

- Computer Science, Mathematics
- 2012

The LARS algorithm is extended to cover the non-unique case, so that this path algorithm works for any predictor matrix and a simple method is derived for computing the component-wise uncertainty in lasso solutions of any given problem instance, based on linear programming.

An Introduction to Compressed Sensing

- Computer Science
- 2019

This book aims to provide an in-depth introduction to the field of compressed sensing; specific topics include graph theory and the design of binary measurement matrices, matrix recovery and completion, and optimization algorithms.