Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using ℓ1-Constrained Quadratic Programming (Lasso)

Abstract

The problem of consistently estimating the sparsity pattern of a vector β* ∈ ℝ^p based on observations contaminated by noise arises in various contexts, including signal denoising, sparse approximation, compressed sensing, and model selection. We analyze the behavior of ℓ1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern. Our main result establishes precise conditions on the problem dimension p, the number k of nonzero elements in β*, and the number of observations n that are necessary and sufficient for sparsity pattern recovery using the Lasso. We first analyze the case of observations made using deterministic design matrices and sub-Gaussian additive noise, and provide sufficient conditions for support recovery and ℓ∞-error bounds, as well as results showing the necessity of incoherence and bounds on the minimum value. We then turn to the case of random designs, in which each row of the design is drawn from an N(0, Σ) ensemble. For a broad class of Gaussian ensembles satisfying mutual incoherence conditions, we compute explicit values of thresholds 0 < θℓ(Σ) ≤ θu(Σ) < +∞ with the following properties: for any δ > 0, if n > 2(θu + δ) k log(p − k), then the Lasso succeeds in recovering the sparsity pattern with probability converging to one for large problems, whereas if n < 2(θℓ − δ) k log(p − k), the probability of successful recovery converges to zero. For the special case of the uniform Gaussian ensemble (Σ = I_{p×p}), we show that θℓ = θu = 1, so that the precise threshold n = 2k log(p − k) is exactly determined.
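
As a rough illustration of the main result, the following Python sketch simulates sparsity pattern recovery with the Lasso under the uniform Gaussian ensemble (Σ = I_{p×p}), with the sample size set slightly above the predicted threshold n = 2k log(p − k). This is a minimal sketch, not the paper's experimental setup: the noise level, signal magnitude, and the constant in the regularization scaling λ ∝ σ√(log p / n) are illustrative assumptions, and scikit-learn's Lasso serves as a stand-in solver for the ℓ1-penalized QP.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Minimal simulation of Lasso support recovery near the threshold
# n = 2 k log(p - k) for the uniform Gaussian ensemble (Sigma = I).
# All constants below are illustrative choices, not values from the paper.
rng = np.random.default_rng(0)
p, k = 512, 8
n = int(2.5 * k * np.log(p - k))  # slightly above the predicted threshold

# Ground-truth beta* with k nonzero entries of magnitude 0.5.
beta_star = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
beta_star[support] = 0.5 * rng.choice([-1.0, 1.0], size=k)

# Observations y = X beta* + w, rows of X drawn i.i.d. from N(0, I).
sigma = 0.1
X = rng.standard_normal((n, p))
y = X @ beta_star + sigma * rng.standard_normal(n)

# Regularization follows the lambda ~ sigma * sqrt(log p / n) scaling
# suggested by the theory; the factor of 2 is an arbitrary choice.
lam = 2.0 * sigma * np.sqrt(2.0 * np.log(p) / n)
beta_hat = Lasso(alpha=lam, fit_intercept=False, max_iter=10_000).fit(X, y).coef_

recovered = set(np.flatnonzero(np.abs(beta_hat) > 1e-3))
print("support recovered exactly:", recovered == set(support))
```

Rerunning the sketch with n below the threshold (e.g., n = int(1.5 * k * np.log(p - k))) should cause exact recovery to fail with increasing probability as p grows, consistent with the sharp-threshold behavior described above.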
