L1-Regularized Least Squares for Support Recovery of High Dimensional Single Index Models with Gaussian Designs
The problem of consistently estimating the sparsity pattern of a vector $\beta^* \in \mathbb{R}^p$ based on observations contaminated by noise arises in various contexts, including signal denoising, sparse approximation, compressed sensing, and model selection. We analyze the behavior of $\ell_1$-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern. Our main result establishes precise conditions on the problem dimension $p$, the number $k$ of nonzero elements in $\beta^*$, and the number of observations $n$ that are necessary and sufficient for sparsity pattern recovery using the Lasso. We first analyze the case of observations made using deterministic design matrices and sub-Gaussian additive noise, and provide sufficient conditions for support recovery and $\ell_\infty$-error bounds, as well as results showing the necessity of incoherence and bounds on the minimum value. We then turn to the case of random designs, in which each row of the design is drawn from a $N(0, \Sigma)$ ensemble. For a broad class of Gaussian ensembles satisfying mutual incoherence conditions, we compute explicit values of thresholds $0 < \theta_\ell(\Sigma) \le \theta_u(\Sigma) < +\infty$ with the following properties: for any $\delta > 0$, if $n > 2(\theta_u + \delta)\, k \log(p-k)$, then the Lasso succeeds in recovering the sparsity pattern with probability converging to one for large problems, whereas for $n < 2(\theta_\ell - \delta)\, k \log(p-k)$, the probability of successful recovery converges to zero. For the special case of the uniform Gaussian ensemble ($\Sigma = I_{p \times p}$), we show that $\theta_\ell = \theta_u = 1$, so that the precise threshold $n = 2k \log(p-k)$ is exactly determined.
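As a rough illustration of the threshold phenomenon described above, the following minimal simulation sketch (not part of the paper) checks empirical support recovery for the uniform Gaussian ensemble $\Sigma = I_{p \times p}$ at several multiples of the predicted sample size $n = 2k \log(p-k)$. The use of scikit-learn's `Lasso` (whose `alpha` plays the role of the regularization weight $\lambda_n$), the $\lambda_n \propto \sqrt{\log p / n}$ scaling, the noise level, and the support-detection tolerance are all illustrative assumptions, not choices taken from the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
p, k = 512, 8

# Sparse target vector; nonzero entries bounded away from zero,
# consistent with a minimum-value condition (illustrative choice).
beta_star = np.zeros(p)
beta_star[:k] = 0.5
support = set(range(k))

for scale in (0.5, 1.0, 1.5):  # multiples of the predicted threshold
    n = int(scale * 2 * k * np.log(p - k))
    X = rng.standard_normal((n, p))                     # rows i.i.d. N(0, I)
    y = X @ beta_star + 0.25 * rng.standard_normal(n)   # additive Gaussian noise
    lam = 0.5 * np.sqrt(np.log(p) / n)                  # assumed lambda scaling
    beta_hat = Lasso(alpha=lam, fit_intercept=False,
                     max_iter=10_000).fit(X, y).coef_
    recovered = set(np.flatnonzero(np.abs(beta_hat) > 1e-3))
    print(f"n/(2k log(p-k)) = {scale:.1f}: n = {n}, "
          f"exact support recovery = {recovered == support}")
```

Since $\theta_\ell = \theta_u = 1$ for the identity-covariance design, one would expect recovery to fail at the 0.5 multiple and succeed at the 1.5 multiple as $p$ and $k$ grow, with the transition sharpening around the exact threshold.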