Stable recovery of sparse overcomplete representations in the presence of noise

@article{Donoho2006StableRO,
  title={Stable recovery of sparse overcomplete representations in the presence of noise},
  author={David L. Donoho and Michael Elad and Vladimir N. Temlyakov},
  journal={IEEE Transactions on Information Theory},
  year={2006},
  volume={52},
  pages={6-18}
}
Overcomplete representations are attracting interest in signal processing theory, particularly due to their potential to generate sparse representations of signals. However, in general, the problem of finding sparse representations must be unstable in the presence of noise. This paper establishes the possibility of stable recovery under a combination of sufficient sparsity and favorable structure of the overcomplete system. Considering an ideal underlying signal that has a sufficiently sparse… 
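The flavor of this stability result can be illustrated numerically. The following sketch (the dimensions, noise level, and hand-rolled orthogonal matching pursuit routine are illustrative choices, not the paper's experiments) recovers a sparse coefficient vector from noisy measurements over a random overcomplete dictionary; when the representation is sparse enough, the recovery error stays on the order of the noise level rather than blowing up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random overcomplete dictionary with unit-norm columns (illustrative setup,
# not the construction studied in the paper).
n, m, k = 64, 128, 4
Phi = rng.standard_normal((n, m))
Phi /= np.linalg.norm(Phi, axis=0)

# Ideal signal with a k-sparse representation x0, observed in additive noise.
x0 = np.zeros(m)
support = rng.choice(m, size=k, replace=False)
x0[support] = rng.standard_normal(k) + 2.0 * np.sign(rng.standard_normal(k))
eps = 0.05
y = Phi @ x0 + eps * rng.standard_normal(n)

def omp(Phi, y, k):
    """Greedy orthogonal matching pursuit: pick k atoms, refit by least squares."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)
        residual = y - Phi[:, idx] @ coef
    x = np.zeros(Phi.shape[1])
    x[idx] = coef
    return x

x_hat = omp(Phi, y, k)
print("support recovered:", set(np.flatnonzero(x_hat)) == set(support))
print("recovery error   :", np.linalg.norm(x_hat - x0))  # on the order of the noise
```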

Citations

Recovery of exact sparse representations in the presence of bounded noise
  • J. Fuchs
  • Computer Science
    IEEE Transactions on Information Theory
  • 2005
TLDR
The purpose of this contribution is to extend some recent results on sparse representations of signals in redundant bases developed in the noise-free case to the case of noisy observations, finding a bound on the number of nonzero entries in x₀.
On the Stable Recovery of the Sparsest Overcomplete Representations in Presence of Noise
TLDR
All unique sparse decompositions are stably recoverable, and the stability bound from ∥s∥₀ < (1/2) spark(A) is extended to the whole uniqueness range, which guarantees that the sparse decomposition can be found by minimizing the ℓ1 norm.
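For reference, the uniqueness range mentioned here is the standard spark condition (stated in generic notation, not necessarily that of the cited paper):

\[
\|s\|_0 < \tfrac{1}{2}\,\operatorname{spark}(A)
\;\Longrightarrow\;
s \text{ is the unique sparsest representation of } As,
\]

where spark(A) denotes the smallest number of columns of A that are linearly dependent.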
Further Results on Stable Recovery of Sparse Overcomplete Representations in the Presence of Noise
  • P. Tseng
  • Computer Science
    IEEE Transactions on Information Theory
  • 2009
TLDR
This work sharpens the approximation bounds for the sparsest representation over the overcomplete dictionary under more relaxed conditions and derives analogous results for a stepwise projection algorithm.
MMSE Approximation For Sparse Coding Algorithms Using Stochastic Resonance
TLDR
This paper adds controlled noise to the input and estimates a sparse representation from the perturbed signal, and shows that both methods provide a computationally efficient approximation to the MMSE estimator, which is typically intractable to compute.
On Sparsity, Redundancy and Quality of Frame Representations
TLDR
A lower bound is established on the trade-off between the sparsity of the representation, the underlying distortion, and the redundancy of any given frame, including the Vandermonde frame.
On the Minimal Overcompleteness Allowing Universal Sparse Representation
TLDR
This paper studies how redundant a dictionary must be so as to allow any vector to admit a sparse approximation with a prescribed sparsity and a prescribed level of accuracy, and finds that the required overcompleteness grows exponentially with the sparsity level and polynomially with the allowed representation error.
Sparsity pattern recovery using FRI methods
TLDR
An extension is presented that takes advantage of the even symmetry of the cosine functions to build an algorithm that can operate on the observed vector and in a dual domain, and that outperforms state-of-the-art algorithms in a number of scenarios.
Estimation of sparse distributions
TLDR
An efficient sparse signal recovery algorithm is developed; in most settings the extended algorithm outperforms other conventional algorithms by a large margin, and the applicability of the proposed algorithms to some practical problems is explored.
Universal Sparse Representation
TLDR
This paper studies how redundant a dictionary must be so as to allow any vector to admit a sparse approximation with a prescribed sparsity and a prescribed level of accuracy, and finds that the required overcompleteness grows exponentially with the sparsity level and polynomially with the allowed representation error.
A Weighted Average of Sparse Representations is Better than the Sparsest One Alone
TLDR
It is shown that while the maximum a posteriori probability (MAP) estimator aims to find and use the sparsest representation, the minimum mean-squared-error (MMSE) estimator leads to a fusion of representations to form its result, which is a far more accurate estimate, especially at medium and low SNR.
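In generic Bayesian notation (mine, not necessarily the cited paper's), the fusion described here is a posterior-weighted average over candidate supports S, whereas the MAP estimate keeps only the single most probable support:

\[
\hat{x}_{\mathrm{MMSE}} = \mathbb{E}[x \mid y] = \sum_{S} \Pr(S \mid y)\,\mathbb{E}[x \mid y, S].
\]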

References

SHOWING 1-10 OF 63 REFERENCES
Recovery of exact sparse representations in the presence of bounded noise
  • J. Fuchs
  • Computer Science
    IEEE Transactions on Information Theory
  • 2005
TLDR
The purpose of this contribution is to extend some recent results on sparse representations of signals in redundant bases developed in the noise-free case to the case of noisy observations, finding a bound on the number of nonzero entries in x₀.
Greed is good: algorithmic results for sparse approximation
  • J. Tropp
  • Computer Science
    IEEE Transactions on Information Theory
  • 2004
TLDR
This article presents new results on using a greedy algorithm, orthogonal matching pursuit (OMP), to solve the sparse approximation problem over redundant dictionaries and develops a sufficient condition under which OMP can identify atoms from an optimal approximation of a nonsparse signal.
Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
  • D. Donoho, Michael Elad
  • Computer Science
    Proceedings of the National Academy of Sciences of the United States of America
  • 2003
TLDR
This article obtains parallel results in a more general setting, where the dictionary D can arise from two or several bases, frames, or even less structured systems, and sketches three applications: separating linear features from planar ones in 3D data, noncooperative multiuser encoding, and identification of over-complete independent component models.
Sparse image representation via combined transforms
TLDR
This work considers sparse image decomposition, in the hope that a sparser decomposition of an image may lead to a more efficient method of image coding or compression, and takes advantage of recent advances in convex optimization and iterative methods.
On sparse representations in arbitrary redundant bases
  • J. Fuchs
  • Mathematics
    IEEE Transactions on Information Theory
  • 2004
TLDR
The purpose of this contribution is to generalize some recent results on sparse representations of signals in redundant bases and to give a sufficient condition for the unique sparsest solution to be the unique solution to both a linear program and a parametrized quadratic program.
Just relax: convex programming methods for subset selection and sparse approximation
TLDR
It is demonstrated that the solution of the convex program frequently coincides with the solution of the original approximation problem, and comparable new results for a greedy algorithm, Orthogonal Matching Pursuit, are stated.
Efficient backward elimination algorithm for sparse signal representation using overcomplete dictionaries
TLDR
The backward elimination sparse representation algorithm presented by Reeves is extended to allow for an overcomplete dictionary, and recursions for its implementation are developed.
Noise sensitivity of sparse signal representations: reconstruction error bounds for the inverse problem
  • B. Wohlberg
  • Computer Science
    IEEE Trans. Signal Process.
  • 2003
TLDR
Certain sparse signal reconstruction problems have been shown to have unique solutions when the signal is known to have an exact sparse representation, and uniqueness is found to be extremely unstable for a number of common dictionaries.
Optimal sub-Nyquist nonuniform sampling and reconstruction for multiband signals
TLDR
It is found that optimizing the reconstruction sections of the system, choosing the optimal base sampling rate, and designing the nonuniform sampling pattern can improve system performance significantly, and that uniform sampling is optimal for signals with spectral support ℱ that tiles under translation.
Sparse signal reconstruction from limited data using FOCUSS: a re-weighted minimum norm algorithm
TLDR
A view of the algorithm as a novel optimization method that combines desirable characteristics of both classical optimization and learning-based algorithms is provided, and mathematical results on conditions for uniqueness of sparse solutions are also given.
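A minimal sketch of a re-weighted minimum-norm iteration in the spirit of FOCUSS (initialization, weighting, and stopping rule are illustrative choices, not necessarily the exact algorithm of the cited paper):

```python
import numpy as np

def reweighted_min_norm(A, y, iters=30, tol=1e-8):
    """Re-weighted minimum-norm iteration (FOCUSS-like sketch): each pass
    solves a weighted least-norm problem with weights taken from the previous
    iterate, which drives small entries toward zero."""
    x = np.ones(A.shape[1])                # illustrative initialization
    for _ in range(iters):
        W = np.diag(np.abs(x))             # weights from the current iterate
        q = np.linalg.pinv(A @ W) @ y      # minimum-norm solution of (A W) q = y
        x_new = W @ q
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```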