New Algorithms for Non-Negative Matrix Factorization in Applications to Blind Source Separation


In this paper we develop several algorithms for non-negative matrix factorization (NMF) applied to blind (or semi-blind) source separation (BSS), where the sources are, in general, statistically dependent and additional constraints are imposed, such as non-negativity, sparsity, smoothness, lower complexity, or better predictability. We express the non-negativity constraints using a wide class of loss (cost) functions, which leads to an extended class of multiplicative algorithms with regularization. The proposed relaxed forms of the NMF algorithms converge faster while enforcing the desired constraints, and the effects of the various regularization terms and constraints are shown explicitly. The scope of the results is broad, since the discussed loss functions include a large number of useful cost functions, such as the weighted Euclidean distance, relative entropy, Kullback-Leibler divergence, and the generalized Hellinger, Pearson, and Neyman distances.
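To make the setting concrete, the sketch below shows the classic multiplicative NMF updates of Lee and Seung for the (generalized) Kullback-Leibler divergence, one member of the loss-function family discussed in the abstract. This is a baseline illustration only, not the paper's regularized or relaxed variants; the function name and parameters are my own.

```python
import numpy as np

def nmf_kl_multiplicative(V, rank, n_iter=200, eps=1e-9, seed=0):
    """Factor V ≈ W @ H with non-negative W, H by minimizing the
    generalized Kullback-Leibler divergence D(V || WH) using the
    standard Lee-Seung multiplicative update rules (a sketch; the
    paper's algorithms add regularization terms on top of this)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps   # non-negative initialization
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # H <- H * (W^T (V / WH)) / (W^T 1): multiplicative, so
        # non-negativity is preserved automatically.
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.sum(axis=0, keepdims=True).T + eps)
        # W <- W * ((V / WH) H^T) / (1 H^T)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (H.sum(axis=1, keepdims=True).T + eps)
    return W, H

# Synthetic non-negative mixture: 2 "sources" observed in 20 channels.
rng = np.random.default_rng(1)
V = rng.random((20, 2)) @ rng.random((2, 30))
W, H = nmf_kl_multiplicative(V, rank=2)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because the updates are multiplicative, factors initialized non-negative stay non-negative without any projection step, which is the property the extended algorithm class in the paper builds on.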

DOI: 10.1109/ICASSP.2006.1661352


