Projected Gradient Methods for Nonnegative Matrix Factorization

@article{Lin2007ProjectedGM,
  title={Projected Gradient Methods for Nonnegative Matrix Factorization},
  author={Chih-Jen Lin},
  journal={Neural Computation},
  year={2007},
  volume={19},
  pages={2756--2779}
}
  • Chih-Jen Lin
  • Published 1 October 2007
  • Computer Science
  • Neural Computation
Nonnegative matrix factorization (NMF) can be formulated as a minimization problem with bound constraints. Although bound-constrained optimization has been studied extensively in both theory and practice, so far no study has formally applied its techniques to NMF. In this letter, we propose two projected gradient methods for NMF, both of which exhibit strong optimization properties. We discuss efficient implementations and demonstrate that one of the proposed methods converges faster than the… 
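The abstract formulates NMF as bound-constrained minimization of ||V − WH||²_F with W, H ≥ 0, solved by projected gradient. A minimal NumPy sketch of alternating projected-gradient steps (using a fixed 1/L step size rather than the Armijo line search developed in the paper; the function name and step-size choice are illustrative assumptions):

```python
import numpy as np

def pg_nmf(V, r, n_iter=200, seed=0):
    """Alternating projected-gradient NMF sketch: minimize ||V - W H||_F^2
    subject to W >= 0, H >= 0, taking one projected gradient step per
    factor per outer iteration and projecting back onto the nonnegative
    orthant with np.maximum."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        # Step on W: gradient is (W H - V) H^T; step size 1/L with
        # L = ||H H^T||_2, the Lipschitz constant of this subproblem.
        HHt = H @ H.T
        grad_W = W @ HHt - V @ H.T
        W = np.maximum(W - grad_W / (np.linalg.norm(HHt, 2) + 1e-12), 0.0)
        # Symmetric step on H with L = ||W^T W||_2.
        WtW = W.T @ W
        grad_H = WtW @ H - W.T @ V
        H = np.maximum(H - grad_H / (np.linalg.norm(WtW, 2) + 1e-12), 0.0)
    return W, H
```

Each subproblem (W with H fixed, and vice versa) is convex, which is what makes the projected gradient step with a 1/L step size a valid descent step.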
Sparse nonnegative matrix factorization with ℓ0-constraints
Stochastic Variance Reduced Multiplicative Update for Nonnegative Matrix Factorization
  • Hiroyuki Kasai
  • Computer Science
    2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2018
TLDR
Numerical comparisons suggest that the proposed algorithms robustly outperform state-of-the-art algorithms across different synthetic and real-world datasets.
Efficient Nonnegative Matrix Factorization via projected Newton method
Nonnegative matrix factorization using ADMM: Algorithm and convergence analysis
TLDR
This work settles the convergence issue of a popular algorithm based on the alternating direction method of multipliers proposed in Boyd et al. (2011), and shows that the algorithm converges globally to the set of KKT solutions whenever the penalty parameter ρ satisfies ρ > 1.
A Nonnegative Projection Based Algorithm for Low-Rank Nonnegative Matrix Approximation
TLDR
This paper proposes a nonnegative projection based NMF algorithm, which differs from the conventional multiplicative update NMF algorithms, and decreases the objective function by alternately performing a Procrustes rotation and a nonnegative projection.
Accelerating Nonnegative Matrix Factorization Algorithms Using Extrapolation
We propose a general framework to significantly accelerate algorithms for nonnegative matrix factorization (NMF). This framework is inspired by the extrapolation scheme used to accelerate…
Fast Nonnegative Matrix Factorization Algorithms Using Projected Gradient Approaches for Large-Scale Problems
TLDR
This paper investigates and tests some recent PG methods in the context of their applicability to NMF, and focuses on the following modified methods: projected Landweber, Barzilai-Borwein gradient projection, projected sequential subspace optimization, interior-point Newton (IPN), and sequential coordinate-wise.
Sequential Sparse NMF
TLDR
This work considers a sparsity measure linear in the ratio of the L₁ and L₂ norms, and proposes an efficient algorithm to handle the norm constraints which arise when optimizing this measure.
Nonnegative matrix factorization using projected gradient algorithms with sparseness constraints
  • N. Mohammadiha, A. Leijon
  • Computer Science
    2009 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT)
  • 2009
TLDR
The efficiency and execution time of five different PG algorithms and the basic multiplicative algorithm for NMF are compared, and the resulting factorizations are used in a handwritten-digit classifier.

References

SHOWING 1-10 OF 46 REFERENCES
Non-negative Matrix Factorization with Sparseness Constraints
  • P. Hoyer
  • Computer Science
    J. Mach. Learn. Res.
  • 2004
TLDR
This paper shows how explicitly incorporating the notion of 'sparseness' improves the found decompositions, and provides complete MATLAB code both for standard NMF and for an extension of this technique.
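The sparseness notion used in this reference is Hoyer's measure, which maps a vector to 0 when all entries are equal and to 1 when exactly one entry is nonzero; it is defined via the L₁/L₂ norm ratio. A minimal implementation in Python rather than the paper's MATLAB (the function name is an assumption):

```python
import numpy as np

def hoyer_sparseness(x):
    """Hoyer (2004) sparseness of a nonzero vector x of length n:
    (sqrt(n) - ||x||_1 / ||x||_2) / (sqrt(n) - 1).
    Returns 0.0 for a uniform vector, 1.0 for a single-spike vector."""
    x = np.asarray(x, dtype=float)
    n = x.size
    l1 = np.abs(x).sum()
    l2 = np.sqrt((x ** 2).sum())
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)
```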
Fast Nonnegative Matrix Factorization Algorithms Using Projected Gradient Approaches for Large-Scale Problems
TLDR
This paper investigates and tests some recent PG methods in the context of their applicability to NMF, and focuses on the following modified methods: projected Landweber, Barzilai-Borwein gradient projection, projected sequential subspace optimization, interior-point Newton (IPN), and sequential coordinate-wise.
On the Convergence of Multiplicative Update Algorithms for Nonnegative Matrix Factorization
  • Chih-Jen Lin
  • Computer Science, Mathematics
    IEEE Transactions on Neural Networks
  • 2007
TLDR
This paper proposes slight modifications of existing updates and proves their convergence, and techniques invented in this paper may be applied to prove the convergence for other bound-constrained optimization problems.
Multilayer Nonnegative Matrix Factorization Using Projected Gradient Approaches
TLDR
This paper presents and compares the performance of additive algorithms based on three different variations of a projected gradient approach, and demonstrates that this approach (a multilayer system with projected gradient algorithms) can usually give much better performance than standard multiplicative algorithms, especially if the data are ill-conditioned, badly scaled, and/or the number of observations is only slightly greater than the number of nonnegative hidden components.
Algorithms for Non-negative Matrix Factorization
TLDR
Two different multiplicative algorithms for non-negative matrix factorization are analyzed and one algorithm can be shown to minimize the conventional least squares error while the other minimizes the generalized Kullback-Leibler divergence.
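The least-squares variant of the Lee-Seung multiplicative updates summarized above has a compact closed form: each factor is rescaled elementwise by a ratio of nonnegative matrix products, so nonnegativity is preserved automatically. A short NumPy sketch (the function name and the small `eps` guard against division by zero are assumptions, not part of the original algorithm statement):

```python
import numpy as np

def mu_nmf(V, r, n_iter=200, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for the least-squares NMF
    objective ||V - W H||_F^2. Each update multiplies the current
    factor elementwise by a nonnegative ratio, so W and H stay >= 0."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # H <- H .* (W'V) ./ (W'WH)
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # W <- W .* (VH') ./ (WHH')
    return W, H
```

As the convergence references on this page discuss (e.g. Lin's 2007 IEEE TNN paper above), these updates decrease the objective monotonically but need not reach a stationary point without modification.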
Optimality, computation, and interpretation of nonnegative matrix factorizations
TLDR
The theoretical Kuhn-Tucker optimality condition is described in explicit form, and a number of numerical techniques, old and new, are suggested for the nonnegative matrix factorization problem.
Accelerating the Lee-Seung Algorithm for Nonnegative Matrix Factorization
TLDR
A variation of one of the Lee-Seung algorithms with a notably improved performance is presented and it is shown that algorithms of this type do not necessarily converge to local minima.
Nonnegative Matrix and Tensor Factorization [Lecture Notes]
TLDR
Multiplicative algorithms are not necessarily the best approaches for NMF, especially if data representations are not very redundant or sparse; much better performance can be achieved using the FP-ALS, IPC, and QN methods.
Nonnegative Matrix Factorization with Gaussian Process Priors
TLDR
A general method for including prior knowledge in a nonnegative matrix factorization (NMF), based on Gaussian process priors, to find NMF decompositions that agree with prior knowledge of the distribution of the factors, such as sparseness, smoothness, and symmetries.
Computing non-negative tensor factorizations
TLDR
An approach for computing the NTF of a dataset is described that relies only on iterative linear-algebra techniques and is comparable in cost to the non-negative matrix factorization (NMF).