Square-Root Lasso: Pivotal Recovery of Sparse Signals via Conic Programming

@inproceedings{Belloni2010SquareRootLP,
  title={Square-Root Lasso: Pivotal Recovery of Sparse Signals via Conic Programming},
  author={Alexandre Belloni and Victor Chernozhukov and Lie Wang},
  year={2010}
}
We propose a pivotal method for estimating high-dimensional sparse linear regression models, where the overall number of regressors p is large, possibly much larger than the sample size n, but only s regressors are significant. The method is a modification of the lasso, called the square-root lasso. It neither relies on knowledge of the standard deviation σ of the regression errors nor needs to pre-estimate σ. Despite not knowing σ, the square-root lasso achieves near-oracle performance, attaining the convergence rate σ{(s/n) log p}^{1/2} in the prediction norm and thus matching the performance of the lasso with known σ.
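The square-root lasso replaces the lasso's squared loss with the square root of the average squared residuals, which is what makes the penalty level pivotal (free of σ); the resulting problem is a second-order cone program. Below is a minimal sketch of that objective, assuming the cvxpy and scipy libraries are available; the penalty constants (c = 1.1, α = 0.05) follow the rule the paper recommends, but the data, dimensions, and solver defaults are illustrative only.

```python
# Minimal square-root lasso sketch (assumes numpy, scipy, cvxpy are installed).
import numpy as np
import cvxpy as cp
from scipy.stats import norm

rng = np.random.default_rng(0)
n, p, s = 100, 500, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 1.0
y = X @ beta_true + rng.standard_normal(n)   # sigma = 1, never used below

# Pivotal penalty level: lam = c * sqrt(n) * Phi^{-1}(1 - alpha / (2p));
# note that it involves no estimate of sigma.
lam = 1.1 * np.sqrt(n) * norm.ppf(1 - 0.05 / (2 * p))

beta = cp.Variable(p)
# Objective: sqrt(RSS / n) + (lam / n) * ||beta||_1 -- an SOCP, not a QP.
objective = cp.norm2(y - X @ beta) / np.sqrt(n) + (lam / n) * cp.norm1(beta)
cp.Problem(cp.Minimize(objective)).solve()
print(np.flatnonzero(np.abs(beta.value) > 1e-6))  # support of the estimate
```

On simulated data like this, the recovered support should typically match the first s coordinates even though σ was never supplied or estimated.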


Insights and algorithms for the multivariate square-root lasso.
We study the multivariate square-root lasso, a method for fitting the multivariate response linear regression model with dependent errors. This estimator minimizes the nuclear norm of the residual …
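Read as an objective, the description above suggests replacing the scalar square-root loss with the nuclear norm of the residual matrix. The sketch below is a hedged reading of that idea, assuming cvxpy; the row-wise group penalty and the value of lam are illustrative guesses, not the cited paper's tuned choices.

```python
# Hedged sketch of a multivariate (multi-task) square-root lasso objective.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n, p, q = 80, 30, 5                      # samples, predictors, responses
X = rng.standard_normal((n, p))
Y = X[:, :3] @ rng.standard_normal((3, q)) + rng.standard_normal((n, q))

B = cp.Variable((p, q))
lam = 0.5                                # illustrative value only
# The nuclear norm of the residual matrix stands in for the square-root loss;
# the sum of row-wise l2 norms encourages whole predictors to drop out.
loss = cp.normNuc(Y - X @ B) / np.sqrt(n)
penalty = cp.sum(cp.norm(B, 2, axis=1))
cp.Problem(cp.Minimize(loss + lam * penalty)).solve()
```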
Pivotal estimation via square-root Lasso in nonparametric regression
We propose a self-tuning $\sqrt{\mathrm{Lasso}}$ method that simultaneously resolves three important practical problems in high-dimensional regression analysis, namely it handles the unknown scale, …
Sparse Recovery With Unknown Variance: A LASSO-Type Approach
Two least absolute shrinkage and selection operator (LASSO)-type methods that jointly estimate β and the variance are studied, showing that the first estimator enjoys nearly the same performance in practice as the standard LASSO over a wide range of signal-to-noise ratios.
Concomitant Lasso with Repetitions (CLaR): beyond averaging multiple realizations of heteroscedastic noise
This work proposes an estimator that can cope with complex heteroscedastic noise structures by using non-averaged measurements and a concomitant formulation that is amenable to state-of-the-art proximal coordinate descent techniques, which can leverage the expected sparsity of the solutions.
On Regularized Square-root Regression Problems: Distributionally Robust Interpretation and Fast Computations
  • Hong T.M. Chu, K. Toh, Yangjing Zhang
  • Mathematics
  • 2021
Square-root (loss) regularized models have recently become popular in linear regression due to their nice statistical properties. Moreover, some of these models can be interpreted as the …
The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms
The group square-root lasso (GSRL) method for estimation in high-dimensional sparse regression models with group structure is introduced, and it is shown that the GSRL estimator adapts to the unknown sparsity of the regression vector and has the same optimal estimation and prediction accuracy as the group lasso (GL) estimator, under the same minimal conditions on the model.
Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
Least squares fitting is in general not useful for high-dimensional linear models, in which the number of predictors is of the same or even larger order of magnitude than the number of samples. …
Learning Heteroscedastic Models by Convex Programming under Group Sparsity
This paper proposes a new approach to the joint estimation of the conditional mean and the conditional variance in a high-dimensional (auto-)regression setting by solving a second-order cone program (SOCP).
Regularized High-Dimensional Sparse Regression
In this project, we discuss high-dimensional regression, where the dimension of the multivariate distribution is larger than the sample size, i.e. d ≫ n. With the assumption of sparse structure of the …

References

Showing 1–10 of 42 references
LASSO-TYPE RECOVERY OF SPARSE REPRESENTATIONS FOR HIGH-DIMENSIONAL DATA
The Lasso [28] is an attractive technique for regularization and variable selection for high-dimensional data, where the number of predictor variables p is potentially much larger than the number of observations …
Taking Advantage of Sparsity in Multi-Task Learning
The Group Lasso is considered as a candidate estimation method, and it is shown that this estimator enjoys nice sparsity oracle inequalities and variable selection properties, and that the results extend to more general noise distributions, requiring only a finite variance.
L1-Penalized Quantile Regression in High Dimensional Sparse Models
We consider median regression and, more generally, quantile regression in high-dimensional sparse models. In these models the overall number of regressors p is very large, possibly larger than the sample size n. …
The sparsity and bias of the Lasso selection in high-dimensional linear regression
Meinshausen and Bühlmann [Ann. Statist. 34 (2006) 1436-1462] showed that, for neighborhood selection in Gaussian graphical models, under a neighborhood stability condition, the LASSO is consistent, …
On Model Selection Consistency of Lasso
  • P. Zhao, Bin Yu
  • Mathematics, Computer Science
  • J. Mach. Learn. Res.
  • 2006
It is proved that a single condition, called the Irrepresentable Condition, is almost necessary and sufficient for Lasso to select the true model, both in the classical fixed-p setting and in the large-p setting as the sample size n gets large.
Templates for convex cone problems with applications to sparse signal recovery
A general framework is presented for solving a variety of convex cone problems that frequently arise in signal processing, machine learning, statistics, and other fields, along with results showing that the smoothed and unsmoothed problems are sometimes formally equivalent.
On sparse reconstruction from Fourier and Gaussian measurements
This paper improves upon the best-known guarantees for exact reconstruction of a sparse signal f from a small universal sample of Fourier measurements. The method for reconstruction that has recently …
Regression Shrinkage and Selection via the Lasso
SUMMARY We propose a new method for estimation in linear models. The 'lasso' minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant. …
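The summary above is the lasso in its original constrained form: least squares subject to an l1 budget. A minimal cvxpy sketch of exactly that formulation follows; the budget t and the data are illustrative values only.

```python
# Constrained-form lasso: minimize RSS subject to ||beta||_1 <= t.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(3)
n, p = 50, 20
X = rng.standard_normal((n, p))
y = X[:, :3] @ np.array([2.0, -1.0, 1.5]) + 0.1 * rng.standard_normal(n)

beta = cp.Variable(p)
t = 4.0  # l1 budget; shrinking t drives coefficients exactly to zero
problem = cp.Problem(cp.Minimize(cp.sum_squares(y - X @ beta)),
                     [cp.norm1(beta) <= t])
problem.solve()
```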
Adaptive estimation of a quadratic functional by model selection
We consider the problem of estimating ‖s‖² when s belongs to some separable Hilbert space H and one observes the Gaussian process Y(t) = ⟨s, t⟩ + σL(t) for all t ∈ H, where L is some Gaussian …
SIMULTANEOUS ANALYSIS OF LASSO AND DANTZIG SELECTOR
We show that, under a sparsity scenario, the Lasso estimator and the Dantzig selector exhibit similar behavior. For both methods, we derive, in parallel, oracle inequalities for the prediction risk …