Nonnegative Matrix Factorization with the Itakura-Saito Divergence: With Application to Music Analysis

@article{Fvotte2009NonnegativeMF,
  title={Nonnegative Matrix Factorization with the Itakura-Saito Divergence: With Application to Music Analysis},
  author={C{\'e}dric F{\'e}votte and Nancy Bertin and Jean-Louis Durrieu},
  journal={Neural Computation},
  year={2009},
  volume={21},
  pages={793-830}
}
This letter presents theoretical, algorithmic, and experimental results about nonnegative matrix factorization (NMF) with the Itakura-Saito (IS) divergence. We describe how IS-NMF is underlaid by a well-defined statistical model of superimposed gaussian components and is equivalent to maximum likelihood estimation of variance parameters. This setting can accommodate regularization constraints on the factors through Bayesian priors. In particular, inverse-gamma and gamma Markov chain priors are… 
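To make the cost being minimized concrete, below is a minimal NumPy sketch of the widely used multiplicative updates for IS-NMF of a power spectrogram V ≈ WH. It illustrates the IS objective only, not the SAGE/EM or Bayesian-regularized algorithms developed in the letter; the function name, matrix names, and iteration count are illustrative assumptions.

```python
# Minimal IS-NMF sketch: multiplicative updates for D_IS(V | W @ H).
# Illustrative only; not the letter's SAGE/EM or Bayesian-regularized algorithms.
import numpy as np

def is_nmf(V, K, n_iter=200, eps=1e-12, seed=0):
    """Factorize a nonnegative power spectrogram V (F x N) into W (F x K) and H (K x N)
    by approximately minimizing the Itakura-Saito divergence."""
    F, N = V.shape
    rng = np.random.default_rng(seed)
    W = rng.random((F, K)) + eps
    H = rng.random((K, N)) + eps
    for _ in range(n_iter):
        V_hat = W @ H + eps
        # Update activations: ratio of the negative and positive parts of the gradient.
        H *= (W.T @ (V / V_hat**2)) / (W.T @ (1.0 / V_hat))
        V_hat = W @ H + eps
        # Update dictionary with the symmetric rule.
        W *= ((V / V_hat**2) @ H.T) / ((1.0 / V_hat) @ H.T)
        # Remove the scale ambiguity: normalize columns of W, compensate in H.
        scale = W.sum(axis=0, keepdims=True) + eps
        W /= scale
        H *= scale.T
    return W, H

# Usage (hypothetical): W, H = is_nmf(np.abs(stft_of_signal)**2, K=10)
```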
Towards Complex Nonnegative Matrix Factorization with the Beta-Divergence
  • P. Magron, T. Virtanen
  • Computer Science
    2018 16th International Workshop on Acoustic Signal Enhancement (IWAENC)
  • 2018
TLDR
This paper introduces the beta-divergence in a heuristic fashion within a phase-aware probabilistic model, and demonstrates its potential for an audio source separation task, where it outperforms previous complex NMF approaches.
A tempering approach for Itakura-Saito non-negative matrix factorization. With application to music transcription
TLDR
The aim of this paper is to propose a tempering scheme that favors convergence of IS-NMF to global minima, based on NMF with the beta-divergence, where the shape parameter beta acts as a temperature parameter.
Algorithms for nonnegative matrix factorization with the beta-divergence
This paper describes algorithms for nonnegative matrix factorization (NMF) with the beta-divergence (beta-NMF). The beta-divergence is a family of cost functions parametrized by a single shape parameter β…
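For orientation, a standard definition of the β-divergence family behind these cost functions is given below (supplied here for reference, not quoted from the entry above); the limiting cases recover the divergences appearing throughout this list:

$$
d_\beta(x \mid y) =
\begin{cases}
\dfrac{1}{\beta(\beta-1)}\left(x^{\beta} + (\beta-1)\,y^{\beta} - \beta\, x\, y^{\beta-1}\right), & \beta \in \mathbb{R}\setminus\{0,1\},\\[1ex]
x \log\dfrac{x}{y} - x + y, & \beta = 1 \quad \text{(generalized Kullback-Leibler)},\\[1ex]
\dfrac{x}{y} - \log\dfrac{x}{y} - 1, & \beta = 0 \quad \text{(Itakura-Saito)},
\end{cases}
$$

with β = 2 giving half the squared Euclidean distance, and the NMF cost obtained by summing $d_\beta\big(v_{fn} \mid [WH]_{fn}\big)$ over all entries of V.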
Enforcing Harmonicity and Smoothness in Bayesian Non-Negative Matrix Factorization Applied to Polyphonic Music Transcription
TLDR
Bayesian NMF with harmonicity and temporal continuity constraints is shown to outperform other standard NMF-based transcription systems, providing a meaningful mid-level representation of the data.
Algorithms for Nonnegative Matrix Factorization with the β-Divergence
This letter describes algorithms for nonnegative matrix factorization (NMF) with the β-divergence (β-NMF). The β-divergence is a family of cost functions parameterized by a single shape parameter β…
Itakura-Saito Nonnegative Factorizations of the Power Spectrogram for Music Signal Decomposition
TLDR
This chapter gives evidence of the relevance of factorizing the power spectrogram with the Itakura-Saito (IS) divergence, and discusses extensions of NMF to the multichannel case, for both instantaneous and convolutive recordings, possibly underdetermined.
Nonnegative matrix factorizations as probabilistic inference in composite models
TLDR
This paper describes multiplicative, Expectation-Maximization, Markov chain Monte Carlo, and Variational Bayes algorithms for the NMF problem, and aims at providing statistical insights into NMF.
Kullback-Leibler Divergence for Nonnegative Matrix Factorization
TLDR
It is shown that using the KL-divergence takes the normalization structure into account in a very natural way and brings improvements for nonnegative matrix factorization: the gradients of the normalized KL-divergence are well scaled and thus lead to a new projected gradient method for NMF that runs faster or yields a better approximation than three other widely used NMF algorithms.
Automatic Relevance Determination in Nonnegative Matrix Factorization with the β-Divergence
  • V. Tan, C. Févotte
  • Computer Science
    IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2013
TLDR
A Bayesian model based on automatic relevance determination (ARD) in which the columns of the dictionary matrix and the rows of the activation matrix are tied together through a common scale parameter in their prior is proposed.

References

Showing 1-10 of 61 references
Algorithms for Nonnegative Matrix Factorization with the β-Divergence
This letter describes algorithms for nonnegative matrix factorization (NMF) with the β-divergence (β-NMF). The β-divergence is a family of cost functions parameterized by a single shape parameter β…
Csiszár's Divergences for Non-negative Matrix Factorization: Family of New Algorithms
TLDR
A wide class of loss (cost) functions for non-negative matrix factorization (NMF) is discussed, and several novel algorithms with improved efficiency and robustness to noise and outliers are derived and applied to blind (or semi-blind) source separation.
Generalized Nonnegative Matrix Approximations with Bregman Divergences
TLDR
This paper makes algorithmic progress by modeling and solving (using multiplicative updates) new generalized NNMA problems that minimize Bregman divergences between the input matrix and its low-rank approximation.
Extended SMART Algorithms for Non-negative Matrix Factorization
TLDR
A family of new extended SMART (Simultaneous Multiplicative Algebraic Reconstruction Technique) algorithms for non-negative matrix factorization (NMF) is derived, with improved efficiency and convergence rate, and can be applied to various distributions of data and additive noise.
Nonnegative matrix factorization with constrained second-order optimization
Multichannel Nonnegative Matrix Factorization in Convolutive Mixtures for Audio Source Separation
  • A. Ozerov, C. Févotte
  • Computer Science
    IEEE Transactions on Audio, Speech, and Language Processing
  • 2010
We consider inference in a general data-driven object-based model of multichannel audio data, assumed generated as a possibly underdetermined convolutive mixture of source signals. We work in the…
Algorithms for Non-negative Matrix Factorization
TLDR
Two different multiplicative algorithms for non-negative matrix factorization are analyzed and one algorithm can be shown to minimize the conventional least squares error while the other minimizes the generalized Kullback-Leibler divergence.
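For orientation, the two multiplicative update rules analyzed there are usually written as follows for V ≈ WH (standard forms from the NMF literature, supplied here and not quoted from the entry above):

$$
\text{Euclidean:}\quad
H \leftarrow H \odot \frac{W^{\top} V}{W^{\top} W H},
\qquad
W \leftarrow W \odot \frac{V H^{\top}}{W H H^{\top}},
$$

$$
\text{Generalized KL:}\quad
H \leftarrow H \odot \frac{W^{\top}\!\big(V \oslash (WH)\big)}{W^{\top}\mathbf{1}},
\qquad
W \leftarrow W \odot \frac{\big(V \oslash (WH)\big) H^{\top}}{\mathbf{1}\, H^{\top}},
$$

where ⊙ and ⊘ denote elementwise multiplication and division, the fraction bars are likewise elementwise, and $\mathbf{1}$ is an all-ones matrix of the same size as V.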
Orthogonal nonnegative learning for sparse feature extraction and approximate combinatorial optimization
TLDR
It is shown how the multiplicative update rules obtained by using the proposed ONL principle can find a nonnegative and highly orthogonal matrix for an approximated graph partitioning problem.
A Generalized Divergence Measure for Nonnegative Matrix Factorization
TLDR
A parametric generalization of the two different multiplicative update rules for nonnegative matrix factorization by Lee and Seung (2001) is shown to lead to locally optimal solutions of the nonnegative matrix factorization problem with this new cost function.
Space-alternating generalized expectation-maximization algorithm
TLDR
The paper describes the space-alternating generalized EM (SAGE) method, which updates the parameters sequentially by alternating between several small hidden-data spaces defined by the algorithm designer; it proves that the sequence of estimates monotonically increases the penalized-likelihood objective, derives asymptotic convergence rates, and provides sufficient conditions for monotone convergence in norm.