On the Performance of Sparse Recovery Via ℓp-Minimization (0 ≤ p ≤ 1)
@article{Wang2011OnTP,
  title   = {On the Performance of Sparse Recovery Via lp-Minimization (0 <= p <= 1)},
  author  = {Meng Wang and Weiyu Xu and Ao Tang},
  journal = {IEEE Trans. Inf. Theory},
  year    = {2011},
  volume  = {57},
  pages   = {7255-7278}
}
It is known that a high-dimensional sparse vector x* ∈ R^n can be recovered from low-dimensional measurements y = Ax*, where A ∈ R^{m×n} (m < n) is the measurement matrix. In this paper, with A being a random Gaussian matrix, we investigate the recovering ability of ℓp-minimization (0 ≤ p ≤ 1) as p varies, where ℓp-minimization returns a vector with the least ℓp quasi-norm among all the vectors x satisfying Ax = y. Besides analyzing the performance of strong recovery, where ℓp-minimization is required…
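Concretely, the decoder studied here returns x̂ = argmin { ‖x‖_p^p = Σ_i |x_i|^p : Ax = y }, with p = 1 the convex basis-pursuit case and p < 1 a nonconvex program. A minimal numerical sketch, assuming the standard ε-smoothed IRLS-p heuristic (in the spirit of the IRLS papers cited below) rather than any algorithm from this paper:

```python
import numpy as np

def irls_lp(A, y, p=0.5, eps=1.0, n_iter=50):
    """IRLS heuristic for min ||x||_p^p subject to Ax = y, with 0 < p <= 1."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]       # least-squares initialization
    for _ in range(n_iter):
        # Inverse weights d_i = (x_i^2 + eps)^(1 - p/2); the step below solves
        # the weighted least squares min sum_i x_i^2 / d_i subject to Ax = y.
        d = (x ** 2 + eps) ** (1.0 - p / 2.0)
        x = d * (A.T @ np.linalg.solve((A * d) @ A.T, y))
        eps = max(eps * 0.1, 1e-12)                # anneal the smoothing term
    return x

# Toy usage: recover a 10-sparse vector from 80 Gaussian measurements.
rng = np.random.default_rng(0)
m, n, k = 80, 200, 10
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x_hat = irls_lp(A, A @ x_true, p=0.5)
print(np.linalg.norm(x_hat - x_true))
```

The ε-annealing keeps the weighted least-squares step well conditioned while the iterates approach a sparse point; it is a heuristic, so success on any given instance is not guaranteed.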
55 Citations
A perturbation inequality for concave functions of singular values and its applications in low-rank matrix recovery
- Mathematics
- 2016
A simplified approach to recovery conditions for low rank matrices
- Computer Science · 2011 IEEE International Symposium on Information Theory Proceedings
- 2011
This paper shows how several classes of recovery conditions can be extended from vectors to matrices in a simple and transparent way, leading to the best known restricted isometry and nullspace conditions for matrix recovery.
Guarantees of total variation minimization for signal recovery
- Computer Science · 2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton)
- 2013
The proof for the performance guarantee of total variation (TV) minimization in recovering a one-dimensional signal with sparse gradient support is established, and it is shown that the recoverable gradient sparsity can grow linearly with the signal dimension when TV minimization is used.
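A minimal sketch of that TV decoder for a piecewise-constant 1-D signal, assuming the cvxpy modeling layer; the sizes and sparsity level are illustrative, not the thresholds established in the paper:

```python
import cvxpy as cp
import numpy as np

# Illustrative compressed-sensing setup: y = A @ x_true with Gaussian A.
rng = np.random.default_rng(1)
m, n = 80, 200
A = rng.standard_normal((m, n))
jumps = rng.standard_normal(n) * (rng.random(n) < 0.03)  # few nonzero gradients
x_true = np.cumsum(jumps)                                # piecewise-constant signal
y = A @ x_true

# TV of a 1-D signal is the l1 norm of its discrete gradient, sum_i |x_{i+1} - x_i|.
x = cp.Variable(n)
problem = cp.Problem(cp.Minimize(cp.tv(x)), [A @ x == y])
problem.solve()
print(np.linalg.norm(x.value - x_true))
```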
Exact Recovery Conditions for Sparse Representations With Partial Support Information
- Computer Science · IEEE Transactions on Information Theory
- 2013
A finer analysis is carried out based on the null space property (NSP) and the exact recovery condition (ERC), and connections are established regarding the characterization of ℓp-relaxation procedures and OMP in the informed setup.
Convergence and Stability of Iteratively Re-weighted Least Squares Algorithms for Sparse Signal Recovery in the Presence of Noise
- Computer Science
- 2012
The simplicity of IRLS, along with the theoretical guarantees provided in this contribution, make a compelling case for its adoption as a standard tool for sparse signal recovery.
Convergence and stability of iteratively reweighted least squares for low-rank matrix recovery
- Computer Science
- 2017
The simplicity of IRLS-M, along with the theoretical guarantees provided in this paper, make a compelling case for its adoption as a standard tool for low-rank matrix recovery.
Exact Low-Rank Matrix Recovery via Nonconvex Schatten p-Minimization
- Computer Science · Asia Pac. J. Oper. Res.
- 2013
The main aim is to establish an RIP theoretical result for exact LMR via the nonconvex Schatten p-minimization relaxation of LMR in magnetic resonance imaging.
Gaussian Mixtures Based IRLS for Sparse Recovery With Quadratic Convergence
- Computer Science, Mathematics · IEEE Transactions on Signal Processing
- 2015
It is shown through numerical experiments that the proposed methods outperform classical IRLS for ℓτ-minimization with τ ∈ (0,1] in terms of speed and sparsity-undersampling tradeoff, and are robust even in the presence of noise.
Convergence and Stability of Iteratively Re-weighted Least Squares Algorithms
- Computer Science · IEEE Transactions on Signal Processing
- 2014
A one-to-one correspondence between the IRLS algorithms and a class of Expectation-Maximization algorithms for constrained maximum likelihood estimation under a Gaussian scale mixture (GSM) distribution is demonstrated.
Convergence and stability analysis of iteratively reweighted least squares for noisy block sparse recovery
- Computer Science · Linear Algebra and its Applications
- 2021
References
Showing 1-10 of 40 references
Restricted Isometry Constants Where ℓp Sparse Recovery Can Fail for 0 < p ≤ 1
- Mathematics
- 2008
This paper investigates conditions under which the solution of an underdetermined linear system with minimal ℓp norm, 0 < p ≤ 1, is guaranteed to be also the sparsest one. The results highlight the…
The Restricted Isometry Property and ℓq-Regularization: Phase Transitions for Sparse Approximation
- Mathematics
- 2009
Consider a measurement matrix A of size n×N, with n < N, y a signal in R^N, and b = Ay the observed measurement of the vector y. From knowledge of (b, A), compressed sensing seeks to recover the…
Dense error correction via ℓ1-minimization
- Computer Science · ICASSP
- 2009
It is proved that for highly correlated dictionaries A, any non-negative, sufficiently sparse signal x can be recovered by solving an ℓ1-minimization problem: min ‖x‖1 + ‖e‖1 subject to y = Ax + e, which suggests that accurate and efficient recovery of sparse signals is possible even with nearly 100% of the observations corrupted.
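A sketch of that program, assuming cvxpy and a generic Gaussian A; the actual result concerns highly correlated ("cross-and-bouquet") dictionaries, which this toy setup does not reproduce:

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(2)
m, n = 300, 50
A = rng.standard_normal((m, n))              # toy stand-in for the dictionary
x_true = np.zeros(n)
x_true[:5] = np.abs(rng.standard_normal(5))  # sparse, non-negative signal
e_true = np.zeros(m)
bad = rng.choice(m, m // 2, replace=False)   # corrupt half of the observations
e_true[bad] = 10 * rng.standard_normal(m // 2)
y = A @ x_true + e_true

# min ||x||_1 + ||e||_1  subject to  y = A x + e
x, e = cp.Variable(n), cp.Variable(m)
problem = cp.Problem(cp.Minimize(cp.norm1(x) + cp.norm1(e)), [A @ x + e == y])
problem.solve()
print(np.linalg.norm(x.value - x_true))
```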
Compressive Sensing over the Grassmann Manifold: a Unified Geometric Framework
- Computer Science · arXiv
- 2010
This paper gives a unified null-space Grassmann-angle-based geometric framework for analyzing the performance of ℓ1 minimization and investigates the "balancedness" property of linear subspaces, which characterizes sharp quantitative tradeoffs between the considered sparsity and the recovery accuracy of ℓ1 optimization.
Restricted Isometry Constants Where $\ell^p$ Sparse Recovery Can Fail for $0 < p \leq 1$
- Computer Science · IEEE Transactions on Information Theory
- 2009
Investigating conditions under which the solution of an underdetermined linear system with minimal ℓp norm, 0 < p ≤ 1, is guaranteed to be also the sparsest one shows that there is limited room for improving over the best known positive results of Foucart and Lai.
A simple performance analysis of ℓ1 optimization in compressed sensing
- Computer Science · 2009 IEEE International Conference on Acoustics, Speech and Signal Processing
- 2009
A novel, very simple technique is introduced for proving that if the number of measurements is proportional to the length of the signal, then there is a sparsity level, proportional to the signal length, for which the success of ℓ1 optimization is guaranteed.
Sparse Recovery Using Sparse Random Matrices
- Computer Science · LATIN
- 2010
An overview of the results in the area is given, and a new algorithm, called "SSMP" (Sequential Sparse Matching Pursuit), is described, which works well on real data, with recovery quality often outperforming that of more complex algorithms such as ℓ1 minimization.
Compressed sensing and best k-term approximation
- Mathematics, Computer Science
- 2008
The typical paradigm for obtaining a compressed version of a discrete signal represented by a vector x ∈ R^N is to choose an appropriate basis, compute the coefficients of x in this basis, and then…
Compressed sensing - probabilistic analysis of a null-space characterization
- Computer Science · 2008 IEEE International Conference on Acoustics, Speech and Signal Processing
- 2008
What matters is not so much the distribution from which the entries of the measurement matrix A are drawn, but rather the statistics of the null-space of A, and an alternative proof of the main result of Candes and Tao [2005] is provided by analyzing matrices whose null-space is isotropic.
Sparse nonnegative solution of underdetermined linear equations by linear programming.
- Mathematics · Proceedings of the National Academy of Sciences of the United States of America
- 2005
It is shown that outward k-neighborliness is equivalent to the statement that, whenever y = Ax has a nonnegative solution with at most k nonzeros, it is the nonnegative solution to y = Ax having minimal sum.
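The program behind that result is a plain linear program; a minimal sketch, assuming scipy's linprog and illustrative instance sizes:

```python
import numpy as np
from scipy.optimize import linprog

# Toy instance of: minimize sum(x) subject to A x = y, x >= 0.
rng = np.random.default_rng(3)
m, n, k = 40, 100, 5
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.random(k) + 0.5  # non-negative, k-sparse
y = A @ x_true

res = linprog(c=np.ones(n), A_eq=A, b_eq=y, bounds=(0, None))
print(res.status, np.linalg.norm(res.x - x_true))
```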