Stable signal recovery from incomplete and inaccurate measurements

@article{Candes2006StableSR,
  title={Stable signal recovery from incomplete and inaccurate measurements},
  author={Emmanuel J. Cand{\`e}s and Justin K. Romberg and Terence Tao},
  journal={Communications on Pure and Applied Mathematics},
  year={2006},
  volume={59},
  number={8},
  pages={1207--1223}
}
Suppose we wish to recover a vector x₀ ∈ ℝᵐ (e.g., a digital signal or image) from incomplete and contaminated observations y = A x₀ + e; A is an n × m matrix with far fewer rows than columns (n ≪ m) and e is an error term. Is it possible to recover x₀ accurately based on the data y?
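The standard route to stable recovery relaxes sparsity to an ℓ1 penalty. As a concrete illustration (not the paper's own algorithm or settings), here is a minimal NumPy sketch of ISTA, the iterative soft-thresholding algorithm, applied to the LASSO form min_x ½‖Ax − y‖₂² + λ‖x‖₁; the dimensions, noise level, λ, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 60, 128, 5                  # n measurements, m ambient dimension, k nonzeros (n << m)
A = rng.standard_normal((n, m)) / np.sqrt(n)
x0 = np.zeros(m)
support = rng.choice(m, size=k, replace=False)
x0[support] = rng.choice([-1.0, 1.0], size=k)
y = A @ x0 + 0.01 * rng.standard_normal(n)   # incomplete, noisy observations

lam = 0.02                                    # regularization weight (assumed)
L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the smooth gradient
x = np.zeros(m)
for _ in range(3000):
    grad = A.T @ (A @ x - y)                  # gradient of the quadratic term
    x = x - grad / L                          # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft-threshold (prox of lam*||.||_1)

rel_err = np.linalg.norm(x - x0) / np.linalg.norm(x0)
```

With these toy dimensions the iterate recovers the sparse vector up to a small relative error set by the noise and the LASSO shrinkage bias, consistent with the stability the abstract describes.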


ℓ1 minimisation with noisy data
Compressed sensing aims at recovering a sparse signal x ∈ ℝᴺ from few nonadaptive, linear measurements Φ(x) given by a measurement matrix Φ. One of the fundamental recovery algorithms is an ℓ1
Signal recovery from incomplete measurements in the presence of outliers
We study the restoration of a sparse signal or an image with a sparse gradient from a relatively small number of linear measurements which are additionally corrupted by a small amount of white
General Perturbations in Compressed Sensing
We analyze the Basis Pursuit recovery of signals when observing K-sparse data with general perturbations (i.e., additive, as well as multiplicative noise). This completely perturbed model extends the
Stable signal recovery from incomplete observations
This paper describes how stable recovery via convex optimization can be implemented in an efficient manner, and presents some numerical results illustrating the practicality of the procedure.
SPARSE SIGNAL AND IMAGE RECOVERY FROM COMPRESSIVE SAMPLES
In this paper we present an introduction to compressive sampling (CS), an emerging model-based framework for data acquisition and signal recovery based on the premise that a signal having a sparse
Robust Signal Recovery from Incomplete Observations
This paper shows how the recovery via convex optimization can be implemented in an efficient manner, and presents some numerical results illustrating the practicality of the procedure.
General Perturbations of Sparse Signals in Compressed Sensing
The results show that the stability of the recovered signal is limited by the noise level in the observation, and this accuracy is within a constant multiple of the best-case reconstruction using the technique of least squares.
Sparse Signal Recovery via Correlated Degradation Model
Approximate Message Passing (AMP) frameworks use denoising algorithms as regularizers (priors) for model-based inversion via the alternating direction method of multipliers (ADMM); the idea of ADMM is to convert an unconstrained optimization problem x̂ = argmin_x f(x) + λg(x) into an equivalent constrained form, which is then decoupled into two separate optimizations.
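The ADMM splitting described above can be made concrete for the LASSO choice f(x) = ½‖Ax − y‖₂², g(z) = ‖z‖₁ with the constraint x = z: the x-update is a linear solve, the z-update a soft-threshold, and the dual variable u tracks the constraint violation. A minimal NumPy sketch, with problem sizes, λ, and ρ as illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 60, 128, 5
A = rng.standard_normal((n, m)) / np.sqrt(n)
x_true = np.zeros(m)
idx = rng.choice(m, size=k, replace=False)
x_true[idx] = rng.choice([-1.0, 1.0], size=k)
y = A @ x_true + 0.01 * rng.standard_normal(n)

lam, rho = 0.02, 1.0                   # penalty weight and ADMM step (assumed)
AtA, Aty = A.T @ A, A.T @ y
M = AtA + rho * np.eye(m)              # system matrix solved in every x-update
x = np.zeros(m)
z = np.zeros(m)
u = np.zeros(m)
for _ in range(500):
    x = np.linalg.solve(M, Aty + rho * (z - u))                       # quadratic subproblem
    z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)   # soft-threshold (prox of g)
    u = u + x - z                                                     # dual update

rel_err = np.linalg.norm(z - x_true) / np.linalg.norm(x_true)
```

Each of the two decoupled optimizations is cheap on its own, which is exactly the appeal of the splitting: the quadratic solve ignores the ℓ1 term, and the thresholding step ignores A.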
Exact signal recovery from sparsely corrupted measurements through the Pursuit of Justice
It is demonstrated that a simple algorithm, which is dubbed Justice Pursuit (JP), can achieve exact recovery from measurements corrupted with sparse noise.
Estimation of sparse signal by non-convex optimization
This paper addresses the problem of recovering signals from undersampled data when such signals are not sparse in an orthonormal basis but in an overcomplete dictionary, and shows that if the combined matrix obeys a certain restricted isometry property and the signal is sufficiently sparse, the reconstruction based on ℓp minimization with 0 < p < 1 is exact.
...
...

References

SHOWING 1-10 OF 22 REFERENCES
Compressed sensing
  • D. Donoho
  • Mathematics
    IEEE Transactions on Information Theory
  • 2006
It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients; a good approximation to those N important coefficients is extracted from the n measurements by solving a linear program, Basis Pursuit in signal processing.
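The Basis Pursuit linear program mentioned here, min ‖x‖₁ subject to Ax = y, becomes a standard LP once x is split into nonnegative parts x = x⁺ − x⁻. A small SciPy sketch under assumed toy dimensions (noiseless measurements, Gaussian A):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
n, m, k = 30, 60, 4                    # assumed toy sizes: n measurements, m unknowns, k nonzeros
A = rng.standard_normal((n, m))
x0 = np.zeros(m)
idx = rng.choice(m, size=k, replace=False)
x0[idx] = rng.standard_normal(k)
y = A @ x0                             # noiseless measurements

# min 1^T (xp + xn)  s.t.  A (xp - xn) = y,  xp, xn >= 0
c = np.ones(2 * m)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
x_hat = res.x[:m] - res.x[m:]
rel_err = np.linalg.norm(x_hat - x0) / np.linalg.norm(x0)
```

In this noiseless, sufficiently sparse regime the LP recovers x0 essentially exactly, which is the phenomenon the abstract summarizes.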
Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
If the objects of interest are sparse in a fixed basis or compressible, then it is possible to reconstruct f to within very high accuracy from a small number of random measurements by solving a simple linear program.
Stable recovery of sparse overcomplete representations in the presence of noise
This paper establishes the possibility of stable recovery under a combination of sufficient sparsity and favorable structure of the overcomplete system and shows that similar stability is also available using the basis and the matching pursuit algorithms.
Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
  • D. Donoho, Michael Elad
  • Computer Science
    Proceedings of the National Academy of Sciences of the United States of America
  • 2003
This article obtains parallel results in a more general setting, where the dictionary D can arise from two or several bases, frames, or even less structured systems, and sketches three applications: separating linear features from planar ones in 3D data, noncooperative multiuser encoding, and identification of over-complete independent component models.
Just relax: convex programming methods for identifying sparse signals in noise
  • J. Tropp
  • Computer Science
    IEEE Transactions on Information Theory
  • 2006
A method called convex relaxation attempts to recover the ideal sparse signal by solving a convex program, which can be completed in polynomial time with standard scientific software.
Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
It is shown how one can reconstruct a piecewise constant object from incomplete frequency samples - provided that the number of jumps (discontinuities) obeys the condition above - by minimizing other convex functionals such as the total variation of f.
For most large underdetermined systems of linear equations the minimal ℓ1-norm solution is also the sparsest solution
The techniques include the use of random proportional embeddings and almost‐spherical sections in Banach space theory, and deviation bounds for the eigenvalues of random Wishart matrices.
Decoding by linear programming
f can be recovered exactly by solving a simple convex optimization problem (which one can recast as a linear program); numerical experiments suggest that this recovery procedure works unreasonably well, as f is recovered exactly even in situations where a significant fraction of the output is corrupted.
Uncertainty principles and ideal atomic decomposition
It is proved that if S is representable as a highly sparse superposition of atoms from this time-frequency dictionary, then there is only one such highly sparse representation of S, and it can be obtained by solving the convex optimization problem of minimizing the ℓ1 norm of the coefficients among all decompositions.
...
...