A Generalized Divergence Measure for Nonnegative Matrix Factorization

@article{Kompass2007AGD,
  title={A Generalized Divergence Measure for Nonnegative Matrix Factorization},
  author={Raul Kompass},
  journal={Neural Computation},
  year={2007},
  volume={19},
  pages={780--791}
}
Published 1 March 2007 in Neural Computation.
This letter presents a general parametric divergence measure that includes the quadratic error and the Kullback-Leibler divergence as special cases. A parametric generalization of the two multiplicative update rules for nonnegative matrix factorization of Lee and Seung (2001) is shown to yield locally optimal solutions of the nonnegative matrix factorization problem under this new cost function. Numeric simulations demonstrate that the new update rule may improve the quadratic…
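The family of divergences described in the abstract is closely related to what the citing papers below call the β-divergence. As an illustrative sketch (this follows the common β-divergence convention of the later literature, not necessarily Kompass's exact parameterization), the quadratic error and Kullback-Leibler divergence fall out as the special cases β = 2 and β = 1:

```python
import math

def beta_divergence(x, y, beta):
    """Element-wise beta-divergence d_beta(x || y) for scalars x, y > 0.

    beta = 2 gives half the squared (quadratic) error,
    beta = 1 gives the generalized Kullback-Leibler divergence,
    beta = 0 gives the Itakura-Saito divergence.
    """
    if beta == 1:  # KL limit of the general formula
        return x * math.log(x / y) - x + y
    if beta == 0:  # Itakura-Saito limit
        return x / y - math.log(x / y) - 1
    return (x**beta + (beta - 1) * y**beta - beta * x * y**(beta - 1)) / (beta * (beta - 1))

# beta = 2 reduces to the quadratic error (x - y)^2 / 2:
print(beta_divergence(3.0, 1.0, 2.0))   # 2.0
# the general formula approaches the KL case smoothly as beta -> 1:
print(beta_divergence(3.0, 1.0, 1.001))
```

The branches for β = 0 and β = 1 are the limits of the general expression, which is undefined at those points; this continuity is what makes the single-parameter family interpolate between the classical cost functions.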
Citations

Convergence-guaranteed multiplicative algorithms for nonnegative matrix factorization with β-divergence
This paper presents a new multiplicative algorithm for nonnegative matrix factorization with β-divergence whose convergence is theoretically proven for any real-valued β using the auxiliary function method.
A unified global convergence analysis of multiplicative update rules for nonnegative matrix factorization
This paper provides a sufficient condition for a general multiplicative update rule to have the global convergence property: any sequence of solutions has at least one convergent subsequence, and the limit of any convergent subsequence is a stationary point of the optimization problem.
Algorithms for Nonnegative Matrix Factorization with the β-Divergence
This letter describes algorithms for nonnegative matrix factorization (NMF) with the β-divergence (β-NMF). The β-divergence is a family of cost functions parameterized by a single shape parameter β.
A family of modified projective Nonnegative Matrix Factorization algorithms
  • Zhijian Yuan, E. Oja
  • Computer Science
    2007 9th International Symposium on Signal Processing and Its Applications
  • 2007
Experimental results show that versions of P-NMF derive bases that are better suited to a localized and sparse representation than those of NMF, as well as being more orthogonal.
Unified Development of Multiplicative Algorithms for Linear and Quadratic Nonnegative Matrix Factorization
A general approach is proposed for deriving the auxiliary function for a wide variety of NMF problems, as long as the approximation objective can be expressed as a finite sum of monomials with real exponents.
Stability analysis of multiplicative update algorithms for non-negative matrix factorization
It is shown that Lyapunov's stability theory provides an enlightening viewpoint on the problem of NMF multiplicative updates; the stability of supervised NMF is proved, and the more difficult case of unsupervised NMF is studied.
Quadratic nonnegative matrix factorization
Stability Analysis of Multiplicative Update Algorithms and Application to Nonnegative Matrix Factorization
It is shown that Lyapunov's stability theory provides an enlightening viewpoint on the problem of NMF multiplicative updates, and the exponential or asymptotic stability of the solutions to general optimization problems with nonnegativity constraints is proved.
Nonnegative matrix factorizations as probabilistic inference in composite models
This paper describes multiplicative, expectation-maximization, Markov chain Monte Carlo, and variational Bayes algorithms for the NMF problem, aiming to provide statistical insight into NMF.
References

Algorithms for Non-negative Matrix Factorization
Two different multiplicative algorithms for non-negative matrix factorization are analyzed: one can be shown to minimize the conventional least-squares error, while the other minimizes the generalized Kullback-Leibler divergence.
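The least-squares variant of the two algorithms summarized above can be sketched as follows; this is an illustrative NumPy implementation of the standard Lee-Seung multiplicative updates (the matrix names V, W, H and the small epsilon safeguard are conventional choices, not taken from the paper):

```python
import numpy as np

# Illustrative least-squares NMF via Lee-Seung multiplicative updates.
# V is approximated by W @ H with all factors kept nonnegative.
rng = np.random.default_rng(0)
V = rng.random((6, 8)) + 0.1   # nonnegative data matrix
W = rng.random((6, 3)) + 0.1   # nonnegative basis
H = rng.random((3, 8)) + 0.1   # nonnegative coefficients

before = np.linalg.norm(V - W @ H)
for _ in range(200):
    # Element-wise multiplicative updates; the epsilon guards against
    # division by zero while preserving nonnegativity.
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
after = np.linalg.norm(V - W @ H)
print(before, after)  # the least-squares error is non-increasing
```

Because the updates are multiplicative, nonnegative initial factors stay nonnegative without any explicit projection step.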
Learning the parts of objects by non-negative matrix factorization
An algorithm for non-negative matrix factorization is demonstrated that learns parts of faces and semantic features of text, in contrast to other methods that learn holistic, not parts-based, representations.
Neural Networks, Principal Components, and Subspaces
  • E. Oja
  • Computer Science
    Int. J. Neural Syst.
  • 1989
A single neuron with Hebbian-type learning for the connection weights and with nonlinear internal feedback has been shown to extract the statistical principal components of its stationary input pattern sequence, which generalizes to a multi-dimensional principal component subspace.
Backpropagation Applied to Handwritten Zip Code Recognition
This paper demonstrates how constraints from the task domain can be integrated into a backpropagation network through the architecture of the network, successfully applied to the recognition of handwritten zip code digits provided by the U.S. Postal Service.
Preintegration Lateral Inhibition Enhances Unsupervised Learning
It is argued that preintegration lateral inhibition has computational advantages over conventional neural network architectures while remaining equally biologically plausible.
Maximum Likelihood from Incomplete Data via the EM Algorithm (with discussion)
CBCL face database #1
  • Available online at http://cbcl.mit.edu/cbcl/software-datasets/facedata2.html.
  • 2000