Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization

@article{Donoho2003OptimallySR,
  title={Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization},
  author={David L. Donoho and Michael Elad},
  journal={Proceedings of the National Academy of Sciences of the United States of America},
  year={2003},
  volume={100},
  pages={2197--2202}
}
  • David L. Donoho, Michael Elad
  • Published 21 February 2003
  • Computer Science, Medicine
  • Proceedings of the National Academy of Sciences of the United States of America
Given a dictionary D = {d_k} of vectors d_k, we seek to represent a signal S as a linear combination S = Σ_k γ_k d_k, with scalar coefficients γ_k. [...] Key result: we sketch three applications: separating linear features from planar ones in 3D data, noncooperative multiuser encoding, and identification of overcomplete independent component models.
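The ℓ1-minimization route described in the abstract can be sketched as a small linear program (a minimal illustration with a random Gaussian dictionary and a synthetic 2-sparse signal; the sizes and data are made up, not taken from the paper): minimize ||γ||₁ subject to Dγ = S, via the standard split γ = u − v with u, v ≥ 0.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, K = 8, 16                        # signal dimension, number of atoms (overcomplete)
D = rng.standard_normal((n, K))
D /= np.linalg.norm(D, axis=0)      # unit-norm atoms d_k

gamma_true = np.zeros(K)
gamma_true[[2, 11]] = [1.5, -0.7]   # a synthetic 2-sparse coefficient vector
S = D @ gamma_true

# min ||gamma||_1  s.t.  D @ gamma = S, as an LP over (u, v) with gamma = u - v
c = np.ones(2 * K)
A_eq = np.hstack([D, -D])
res = linprog(c, A_eq=A_eq, b_eq=S, bounds=[(0, None)] * (2 * K))
gamma = res.x[:K] - res.x[K:]       # candidate sparse representation
```

Since `gamma_true` is itself feasible for the LP, the optimum has ℓ1 norm no larger than it; under the coherence conditions studied in the paper, the minimizer coincides with the sparsest representation.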
On the uniqueness of overcomplete dictionaries, and a practical way to retrieve them
Abstract: A full-rank under-determined linear system of equations Ax = b has in general infinitely many possible solutions. In recent years there is a growing interest in the sparsest solution of
K-SVD : An Algorithm for Designing of Overcomplete Dictionaries for Sparse Representation
In recent years there has been a growing interest in the study of sparse representation of signals. Using an overcomplete dictionary that contains prototype signal-atoms, signals are described by
K-SVD: An Algorithm for Designing Overcomplete Dictionaries for Sparse Representation
A novel algorithm for adapting dictionaries in order to achieve sparse signal representations, the K-SVD algorithm, an iterative method that alternates between sparse coding of the examples based on the current dictionary and a process of updating the dictionary atoms to better fit the data.
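The alternation described above can be sketched in a few lines of NumPy (a toy sketch, not the authors' implementation: a greedy OMP coding step followed by the rank-1 SVD atom update, with all sizes chosen arbitrarily for illustration):

```python
import numpy as np

def omp(D, y, k):
    """Greedy sparse coding: select k atoms by orthogonal matching pursuit."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))   # best-correlated atom
        support.append(j)
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs        # re-project onto chosen atoms
    x = np.zeros(D.shape[1])
    x[support] = coeffs
    return x

def ksvd_step(D, Y, k):
    """One K-SVD iteration: sparse-code all signals, then update atoms one by one."""
    X = np.column_stack([omp(D, y, k) for y in Y.T])
    for j in range(D.shape[1]):
        users = np.nonzero(X[j, :])[0]               # signals that use atom j
        if users.size == 0:
            continue
        X[j, users] = 0
        E = Y[:, users] - D @ X[:, users]            # residual with atom j removed
        U, s, Vt = np.linalg.svd(E, full_matrices=False)
        D[:, j] = U[:, 0]                            # rank-1 update of the atom
        X[j, users] = s[0] * Vt[0]                   # and of its coefficients
    return D, X

rng = np.random.default_rng(1)
Y = rng.standard_normal((10, 40))                    # 40 training signals
D = rng.standard_normal((10, 20))
D /= np.linalg.norm(D, axis=0)                       # unit-norm initial dictionary
D, X = ksvd_step(D, Y, k=3)
```

In practice the two steps are repeated until the representation error stabilizes; the SVD update keeps every touched atom at unit norm by construction.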
Dictionary Identification - Sparse Matrix-Factorisation via ℓ1-Minimisation
This article treats the problem of learning a dictionary providing sparse representations for a given signal class, via ℓ1-minimisation. The problem can also be seen as factorising a d × N matrix Y =
Chapter 1 OPTIMIZATION ALGORITHMS FOR SPARSE REPRESENTATIONS AND APPLICATIONS
We consider the following sparse representation problem, which is called Sparse Component Analysis: identify the matrices S ∈ ℝ^(n×N) and A ∈ ℝ^(m×n) (m ≤ n < N) uniquely (up to permutation and scaling),
Robust and Fast Learning of Sparse Codes With Stochastic Gradient Descent
It is shown that simple stochastic gradient descent leads to superior dictionaries compared to the Method of Optimal Directions (MOD) and the K-SVD algorithm and how the Bag of Pursuits and a generalized version of the Neural Gas algorithm can be used to derive an even more powerful method for sparse coding.
Computing Sparse Representations of Multidimensional Signals Using Kronecker Bases
This letter generalizes the theory of sparse representations of vectors to multiway arrays (tensors)—signals with a multidimensional structure—by using the Tucker model to derive a very fast and memory-efficient algorithm called N-BOMP (N-way block OMP), and theoretically demonstrates that under the block-sparsity assumption, this algorithm not only has a considerably lower complexity but is also more precise than the classic OMP algorithm.
Robust and Fast Learning of Sparse Codes With Stochastic Gradient Descent
The so-called Bag of Pursuits method is introduced as an extension of Orthogonal Matching Pursuit and it is shown that it provides an improved approximation of the optimal sparse coefficients and significantly improves the performance of the here proposed gradient descent as well as of the MOD and K-SVD approaches.
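The stochastic-gradient idea can be illustrated with a single-signal dictionary update (a hypothetical sketch under the usual 0.5·||y − Dx||² objective; the learning rate, sizes, and the assumed sparse code x are invented for illustration and are not from the cited work):

```python
import numpy as np

def sgd_dictionary_step(D, y, x, lr=0.1):
    """One stochastic gradient step on 0.5 * ||y - D @ x||^2 with respect to D."""
    residual = y - D @ x                    # gradient wrt D is -outer(residual, x)
    D = D + lr * np.outer(residual, x)
    return D / np.linalg.norm(D, axis=0)    # keep atoms at unit norm

rng = np.random.default_rng(3)
D = rng.standard_normal((8, 12))
D /= np.linalg.norm(D, axis=0)
y = rng.standard_normal(8)
x = np.zeros(12)
x[[1, 5]] = [0.8, -0.4]                     # assumed sparse code for y
D = sgd_dictionary_step(D, y, x, lr=0.1)
```

Looping this update over many (signal, sparse code) pairs, re-coding between passes, gives the online flavor of dictionary learning that this line of work compares against MOD and K-SVD.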
Some recovery conditions for basis learning by L1-minimization
  • R. Gribonval, K. Schnass
  • Mathematics
    2008 3rd International Symposium on Communications, Control and Signal Processing
  • 2008
Many recent works have shown that if a given signal admits a sufficiently sparse representation in a given dictionary, then this representation is recovered by several standard optimization
On Polar Polytopes and the Recovery of Sparse Representations
  • Mark D. Plumbley
  • Computer Science, Mathematics
    IEEE Transactions on Information Theory
  • 2007
This paper explores the sparse representation problem using the geometry of convex polytopes, and finds that the so-called polar polytope P* of the centrally symmetric polytope P whose vertices are the atom pairs ±aᵢ is particularly helpful in providing geometrical insight into optimality conditions given by Fuchs and Tropp for non-unit-norm atom sets.

References

Showing 1-10 of 53 references
Sparse image representation via combined transforms
This work considers sparse image decomposition, in the hope that a sparser decomposition of an image may lead to a more efficient method of image coding or compression and takes advantage of the recent advances in convex optimization and iterative methods.
Sparse representations in unions of bases
It is proved that the result of Donoho and Huo, concerning the replacement of the ℓ0 optimization problem with a linear programming problem when searching for sparse representations, has an analog for dictionaries that may be highly redundant.
A generalized uncertainty principle and sparse representation in pairs of bases
The main contribution in this paper is the improvement of an important result due to Donoho and Huo (2001) concerning the replacement of the ℓ0 optimization problem by a linear programming minimization when searching for the unique sparse representation.
Atomic Decomposition by Basis Pursuit
Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest ℓ1 norm of coefficients among all such decompositions.
Uncertainty principles and ideal atomic decomposition
It is proved that if S is representable as a highly sparse superposition of atoms from this time-frequency dictionary, then there is only one such highly sparse representation of S, and it can be obtained by solving the convex optimization problem of minimizing the ℓ1 norm of the coefficients among all decompositions.
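Uniqueness results of this kind are usually quantified through the mutual coherence of the dictionary. Stated in the standard coherence-based form (paraphrased as a well-known condition, not quoted from this entry):

```latex
% Mutual coherence of a dictionary D with unit-norm atoms d_k:
\mu(D) = \max_{j \neq k} \left| \langle d_j, d_k \rangle \right|
% Coherence-based sparsity condition:
\|\gamma\|_0 < \frac{1}{2}\left(1 + \frac{1}{\mu(D)}\right)
\;\Longrightarrow\;
\gamma \text{ is the unique sparsest representation of } S = D\gamma ,
\text{ and } \ell_1 \text{ minimization recovers it.}
```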
Matching pursuits with time-frequency dictionaries
The authors introduce an algorithm, called matching pursuit, that decomposes any signal into a linear expansion of waveforms that are selected from a redundant dictionary of functions. These
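The greedy selection loop can be sketched as follows (a minimal illustration over a random unit-norm dictionary, not the authors' time-frequency dictionary): each step picks the atom best correlated with the residual and subtracts its projection.

```python
import numpy as np

def matching_pursuit(D, y, n_iter):
    """Greedy decomposition of y over a dictionary D with unit-norm atoms."""
    residual = y.copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_iter):
        corr = D.T @ residual
        j = int(np.argmax(np.abs(corr)))   # best-matching atom
        coeffs[j] += corr[j]               # accumulate its coefficient
        residual = residual - corr[j] * D[:, j]
    return coeffs, residual

rng = np.random.default_rng(2)
D = rng.standard_normal((16, 32))
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
y = rng.standard_normal(16)
coeffs, r = matching_pursuit(D, y, n_iter=10)
```

Because each atom has unit norm, every step removes the projection of the residual onto the chosen atom, so the residual norm never increases and y always equals D @ coeffs plus the final residual.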
Tensor Methods in Statistics
This book provides a systematic development of tensor methods in statistics, beginning with the study of multivariate moments and cumulants. The effect on moment arrays and on cumulant arrays of
Strictly positive definite functions on spheres in Euclidean spaces
This paper studies strictly positive definite functions on the unit sphere of the m-dimensional Euclidean space and invokes the realization of harmonic polynomials as the polynomial kernel of the Laplacian, thereby exploiting some basic relations between homogeneous ideals and their polynomial kernels.
Matrix analysis
This new edition of the acclaimed text presents results of both classic and recent matrix analyses using canonical forms as a unifying theme, and demonstrates their importance in a variety of applications.
The curvelet transform for image denoising
We describe approximate digital implementations of two new mathematical transforms, namely, the ridgelet transform and the curvelet transform. Our implementations offer exact reconstruction,