A remark on weaken restricted isometry property in compressed sensing

@article{Zhang2015ARO,
  title={A remark on weaken restricted isometry property in compressed sensing},
  author={Hui Zhang},
  journal={ArXiv},
  year={2015},
  volume={abs/1504.00086}
}
  • Hui Zhang
  • Published 31 March 2015
  • Computer Science, Mathematics
  • ArXiv
The restricted isometry property (RIP) has become well known in the compressed sensing community. Recently, a weakened version of the RIP was proposed for exact sparse recovery under weak moment assumptions. In this note, we prove that the weakened RIP is also sufficient for stable and robust sparse recovery by linking it with a recently introduced robust width property in compressed sensing. Moreover, we show that it applies widely to other compressed sensing instances as well.
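For context, the standard RIP referenced in the abstract (its textbook form, not a definition taken from this note) says that a matrix A ∈ R^{m×n} satisfies the RIP of order s with constant δ_s ∈ (0, 1) if

```latex
(1 - \delta_s)\,\|x\|_2^2 \;\le\; \|Ax\|_2^2 \;\le\; (1 + \delta_s)\,\|x\|_2^2
\qquad \text{for all } s\text{-sparse } x \in \mathbb{R}^n .
```

The weakened variant studied here relaxes the moment assumptions on the random measurement vectors needed for such an inequality to hold.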

References

Sparse recovery under weak moment assumptions
We prove that iid random vectors that satisfy a rather weak moment assumption can be used as measurement vectors in Compressed Sensing, and the number of measurements required for exact …
Robust width: A characterization of uniformly stable and robust compressed sensing
The characterized sensing operators satisfy a new property the authors call the robust width property, which simultaneously captures notions of widths from approximation theory and of restricted eigenvalues from statistical regression.
On robust width property for Lasso and Dantzig selector
  • Hui Zhang
  • Mathematics, Computer Science
  • ArXiv
  • 2015
The robust width property can be perfectly applied to the Lasso and Dantzig selector models, both of which are popular alternatives in the statistics community, and solves an open problem left by Cahill and Mixon.
Compressed sensing and best k-term approximation
The typical paradigm for obtaining a compressed version of a discrete signal represented by a vector x ∈ R is to choose an appropriate basis, compute the coefficients of x in this basis, and then …
The lower tail of random quadratic forms with applications to ordinary least squares
This paper proves that the “lower tail” of such a matrix is sub-Gaussian under a simple fourth moment assumption on the one-dimensional marginals of the random vectors, and obtains a nearly optimal finite-sample result for the ordinary least squares estimator under random design.
Fundamental Performance Limits for Ideal Decoders in High-Dimensional Linear Inverse Problems
This paper characterizes the fundamental performance limits that can be expected from an ideal decoder given a general model, i.e., a general subset of simple vectors of interest, and defines instance optimality relative to a model, going much beyond the traditional framework of sparse recovery.
A Mathematical Introduction to Compressive Sensing
A Mathematical Introduction to Compressive Sensing gives a detailed account of the core theory upon which the field is built and serves as a reliable resource for practitioners and researchers in these disciplines who want to acquire a careful understanding of the subject.
The Convex Geometry of Linear Inverse Problems
This paper provides a general framework to convert notions of simplicity into convex penalty functions, resulting in convex optimization solutions to linear, underdetermined inverse problems.
Decoding by linear programming
  • E. Candès, T. Tao
  • Computer Science, Mathematics
  • IEEE Transactions on Information Theory
  • 2005
f can be recovered exactly by solving a simple convex optimization problem (which one can recast as a linear program), and numerical experiments suggest that this recovery procedure works unreasonably well; f is recovered exactly even in situations where a significant fraction of the output is corrupted.
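The linear-program recast mentioned above can be sketched as basis pursuit, min ‖x‖₁ subject to Ax = b, written as an LP via the standard splitting x = u − v with u, v ≥ 0. The problem sizes, the random Gaussian matrix, and the use of SciPy's `linprog` are illustrative choices of mine, not details from the cited paper.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 40, 100, 5  # measurements, ambient dimension, sparsity

# Random Gaussian sensing matrix and a k-sparse ground truth
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
b = A @ x_true

# Basis pursuit as an LP: minimize sum(u + v) s.t. A(u - v) = b, u, v >= 0,
# where x = u - v, so sum(u + v) equals ||x||_1 at the optimum.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]

print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With these dimensions, exact recovery is expected with high probability, illustrating the "works unreasonably well" phenomenon the snippet describes.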