K-SVD and its non-negative variant for dictionary design

@inproceedings{Aharon2005KSVDAI,
  title={K-SVD and its non-negative variant for dictionary design},
  author={Michal Aharon and Michael Elad and Alfred Marcel Bruckstein},
  booktitle={SPIE Optics + Photonics},
  year={2005}
}
In recent years there has been growing interest in the study of sparse representations of signals. Using an overcomplete dictionary that contains prototype signal-atoms, signals are described as sparse linear combinations of these atoms. Recent activity in this field has concentrated mainly on the study of pursuit algorithms that decompose signals with respect to a given dictionary. Designing dictionaries to better fit the above model can be done by either selecting pre-specified transforms, or by…
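The model described in the abstract, a signal expressed as a sparse linear combination of atoms from an overcomplete dictionary, can be illustrated with a minimal numpy sketch (the dictionary and coefficients here are invented for illustration):

```python
import numpy as np

# An invented overcomplete dictionary for R^2: four unit-norm atoms as columns.
D = np.array([[1.0, 0.0, 0.6, -0.8],
              [0.0, 1.0, 0.8,  0.6]])

# A sparse coefficient vector: only one of the four atoms is active.
x = np.array([0.0, 0.0, 2.0, 0.0])

# The signal is a sparse linear combination of dictionary atoms.
y = D @ x
```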
A strategy of classification via sparse dictionary learned by non-negative K-SVD
This model first applies the non-negative K-SVD algorithm to learn discriminative dictionaries from very few training samples, and then represents a test image as a linear combination of atoms from these learned dictionaries based on the non-negative variant of Basis Pursuit.
Nonnegative Low-rank Sparse Component Analysis
Two algorithms to train K-NMF are proposed: one based on alternating optimization and exact sparse coding, the other based on a nonnegative variant of K-subspace, equivalent to k-sparse nonnegative matrix factorization.
Dictionary Learning Based on Nonnegative Matrix Factorization Using Parallel Coordinate Descent
Sparse representation of signals via an overcomplete dictionary has recently received much attention as it has produced promising results in various applications. Since the nonnegativities of the…
Learning overcomplete dictionaries with ℓ0-sparse Non-negative Matrix Factorisation
Dictionary recovery experiments using overcomplete dictionaries show that the use of an ℓ0 norm penalty for NMF outperforms both NMF and a state-of-the-art S-NMF method, in particular when the dictionary to be learnt is dense.
Denoising predictive sparse decomposition
  • Long Qian, Xingjian Shi
  • Computer Science
  • 2014 International Conference on Big Data and Smart Computing (BIGCOMP)
  • 2014
A novel model, called denoising PSD (DPSD), is proposed for robust sparse coding; experiments on real visual object recognition tasks show that DPSD can dramatically outperform PSD in real applications.
Provable Dictionary Learning via Column Signatures
In dictionary learning, also known as sparse coding, we are given samples of the form y = Ax, where x ∈ R^m is an unknown random sparse vector and A is an unknown dictionary matrix in R^{n×m} (usually m > n)…
Nonnegative Sparse Representation of Signals With a Determinant-Type Sparsity Measure Based on the DC Programming
This paper designs an effective, tailored algorithm for the sparse representation of nonnegative signals using the so-called determinant sparsity measure, formed from the determinant of the Gram matrix of the sparse coefficients, and formulates the nonnegative dictionary learning problem as the optimization of a non-convex function.
Dictionary learning based on dip patch selection training for random noise attenuation
This work develops a dip-oriented dictionary learning method, which incorporates an estimate of the dip field into the selection procedure for training patches, and applies a curvelet-transform noise reduction method to remove fine-scale components that presumably contain mostly random noise.
Family of iterative LS-based dictionary learning algorithms, ILS-DLA, for sparse signal representation
This work presents a family of iterative least-squares-based dictionary learning algorithms (ILS-DLA), including algorithms for the design of signal-dependent block-based dictionaries and overlapping dictionaries, as generalizations of transforms and filter banks, respectively.
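The least-squares dictionary update at the heart of this family of algorithms (shared with the earlier MOD method) can be sketched as follows. This is a simplified illustration with dense codes, not the authors' implementation; all names and sizes are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
D_true = rng.standard_normal((8, 5))   # hypothetical ground-truth dictionary
X = rng.standard_normal((5, 30))       # coefficient matrix (dense here for simplicity)
Y = D_true @ X                         # training signals as columns

# Least-squares dictionary update with the codes X held fixed:
# D = Y X^T (X X^T)^{-1}, i.e. D = Y X^+ via the pseudoinverse.
D = Y @ np.linalg.pinv(X)
```

Because X has full row rank here, the update recovers the generating dictionary exactly.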
Learning Big (Image) Data via Coresets for Dictionaries
This paper suggests a new computational approach to dictionary learning, known in computational geometry as coresets: a small, smart, non-uniform sample of the input signals such that the quality of any given dictionary with respect to the input can be approximated via the coreset.

References

Showing 1-10 of 27 references
K-SVD : An Algorithm for Designing of Overcomplete Dictionaries for Sparse Representation
In recent years there has been a growing interest in the study of sparse representation of signals. Using an overcomplete dictionary that contains prototype signal-atoms, signals are described by…
K-SVD: An Algorithm for Designing Overcomplete Dictionaries for Sparse Representation
A novel algorithm for adapting dictionaries in order to achieve sparse signal representations: the K-SVD algorithm is an iterative method that alternates between sparse coding of the examples based on the current dictionary and a process of updating the dictionary atoms to better fit the data.
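The alternation described above can be sketched in a few lines of numpy. This is a simplified illustration, not the authors' implementation: the sparse coding step is restricted to 1-sparse codes, while the atom update uses the rank-1 SVD of the restricted residual in the spirit of K-SVD:

```python
import numpy as np

rng = np.random.default_rng(1)
Y = rng.standard_normal((6, 40))           # training signals (columns)
D = rng.standard_normal((6, 4))
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms

errs = []
for _ in range(5):
    # Sparse coding step (1-sparse for simplicity): best single atom per signal.
    idx = np.argmax(np.abs(D.T @ Y), axis=0)
    X = np.zeros((4, 40))
    X[idx, np.arange(40)] = (D.T @ Y)[idx, np.arange(40)]

    # Dictionary update step: refit each atom and its coefficients via the
    # rank-1 SVD of the residual restricted to the signals that use the atom.
    for k in range(4):
        users = np.where(X[k] != 0)[0]
        if users.size == 0:
            continue
        E = Y[:, users] - D @ X[:, users] + np.outer(D[:, k], X[k, users])
        U, s, Vt = np.linalg.svd(E, full_matrices=False)
        D[:, k] = U[:, 0]
        X[k, users] = s[0] * Vt[0]
    errs.append(np.linalg.norm(Y - D @ X))
```

Both steps are locally optimal given the other variable, so the representation error is nonincreasing across iterations.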
Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
  • D. Donoho, Michael Elad
  • Computer Science, Medicine
  • Proceedings of the National Academy of Sciences of the United States of America
  • 2003
This article obtains parallel results in a more general setting, where the dictionary D can arise from two or several bases, frames, or even less structured systems, and sketches three applications: separating linear features from planar ones in 3D data, noncooperative multiuser encoding, and identification of overcomplete independent component models.
Dictionary Learning Algorithms for Sparse Representation
Algorithms for data-driven learning of domain-specific overcomplete dictionaries are developed to obtain maximum-likelihood and maximum a posteriori dictionary estimates, based on Bayesian models with concave/Schur-concave negative log priors, showing improved performance over other independent component analysis methods.
Atomic Decomposition by Basis Pursuit
Basis Pursuit (BP) is a principle for decomposing a signal into an "optimal" superposition of dictionary elements, where optimal means having the smallest ℓ1 norm of coefficients among all such decompositions.
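Exact Basis Pursuit is an equality-constrained linear program; a close relative, the ℓ1-penalized least-squares problem, can be minimized with a short proximal-gradient (ISTA) loop using only numpy. The sketch below, on an invented toy dictionary, shows the ℓ1 principle preferring a single "wide" atom over a two-atom decomposition:

```python
import numpy as np

# Invented toy dictionary: atoms e1, e2, and the "wide" atom (e1 + e2)/sqrt(2).
D = np.array([[1.0, 0.0, 1.0 / np.sqrt(2)],
              [0.0, 1.0, 1.0 / np.sqrt(2)]])
y = np.array([1.0, 1.0])

# ISTA for min 0.5*||D x - y||^2 + lam*||x||_1; a small lam approximates BP.
lam, t = 1e-3, 0.5                         # step size t = 1 / ||D||_2^2
x = np.zeros(3)
for _ in range(10000):
    g = x - t * (D.T @ (D @ x - y))        # gradient step on the data term
    x = np.sign(g) * np.maximum(np.abs(g) - t * lam, 0.0)  # soft-threshold
```

The two-atom decomposition x = (1, 1, 0) has ℓ1 norm 2; the recovered solution instead uses only the wide atom, with ℓ1 norm close to √2 ≈ 1.41.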
Learning Overcomplete Representations
It is shown that overcomplete bases can yield a better approximation of the underlying statistical distribution of the data and can thus lead to greater coding efficiency, and provide a method for Bayesian reconstruction of signals in the presence of noise and for blind source separation when there are more sources than mixtures.
Sparse image coding using learned overcomplete dictionaries
  • J. Murray, K. Kreutz-Delgado
  • Computer Science
  • Proceedings of the 2004 14th IEEE Signal Processing Society Workshop Machine Learning for Signal Processing, 2004.
  • 2004
The authors' overcomplete dictionary learning algorithm (FOCUSS-CNDL) is compared with overcomplete independent component analysis (ICA), and a modified version of the FOCUSS algorithm is presented that can find a non-negative sparse coding in some cases.
Greed is good: algorithmic results for sparse approximation
  • J. Tropp
  • Mathematics, Computer Science
  • IEEE Transactions on Information Theory
  • 2004
This article presents new results on using a greedy algorithm, orthogonal matching pursuit (OMP), to solve the sparse approximation problem over redundant dictionaries and develops a sufficient condition under which OMP can identify atoms from an optimal approximation of a nonsparse signal.
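OMP itself is compact enough to sketch directly (a minimal illustration, not the article's analysis; the function name is ours):

```python
import numpy as np

def omp(D, y, k):
    """Greedy sparse approximation: pick k atoms, refitting by least squares."""
    residual, support = y.astype(float).copy(), []
    for _ in range(k):
        # Select the atom most correlated with the current residual.
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        # Refit all selected coefficients jointly, then update the residual.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x
```

On an orthonormal dictionary, two OMP steps recover a 2-sparse signal exactly.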
Learning the parts of objects by non-negative matrix factorization
An algorithm for non-negative matrix factorization is demonstrated that is able to learn parts of faces and semantic features of text, in contrast to other methods that learn holistic, not parts-based, representations.
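The multiplicative update rules from this line of work (Lee and Seung's algorithm for the Frobenius objective) can be sketched as follows; the matrix sizes, seed, and small epsilon guard are arbitrary choices of ours:

```python
import numpy as np

rng = np.random.default_rng(2)
V = rng.random((6, 8))                 # nonnegative data matrix
W = rng.random((6, 3)) + 0.1           # nonnegative factor initializations
H = rng.random((3, 8)) + 0.1

eps = 1e-9                             # guard against division by zero
err0 = np.linalg.norm(V - W @ H)
for _ in range(200):                   # Lee-Seung multiplicative updates
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
err = np.linalg.norm(V - W @ H)
```

Because the updates are multiplicative with nonnegative factors, W and H stay nonnegative throughout, and the reconstruction error is nonincreasing.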
Non-negative Matrix Factorization with Sparseness Constraints
  • P. Hoyer
  • Computer Science, Mathematics
  • J. Mach. Learn. Res.
  • 2004
This paper shows how explicitly incorporating the notion of 'sparseness' improves the found decompositions, and provides complete MATLAB code both for standard NMF and for an extension of this technique.
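Hoyer's sparseness measure, the notion this paper builds its constraints on, interpolates between 0 for a constant vector and 1 for a 1-sparse one (a direct transcription of the published formula; the function name is ours):

```python
import numpy as np

def hoyer_sparseness(x):
    """Hoyer's measure: (sqrt(n) - ||x||_1 / ||x||_2) / (sqrt(n) - 1), in [0, 1]."""
    n = x.size
    return (np.sqrt(n) - np.abs(x).sum() / np.linalg.norm(x)) / (np.sqrt(n) - 1)
```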