Approximating the Kullback Leibler Divergence Between Gaussian Mixture Models

@article{Hershey2007ApproximatingTK,
  title={Approximating the Kullback Leibler Divergence Between Gaussian Mixture Models},
  author={John R. Hershey and Peder A. Olsen},
  journal={2007 IEEE International Conference on Acoustics, Speech and Signal Processing - ICASSP '07},
  year={2007},
  volume={4},
  pages={IV-317--IV-320}
}
The Kullback Leibler (KL) divergence is a widely used tool in statistics and pattern recognition. The KL divergence between two Gaussian mixture models (GMMs) is frequently needed in the fields of speech and image recognition. Unfortunately, the KL divergence between two GMMs is not analytically tractable, nor does any efficient computational algorithm exist. Some techniques cope with this problem by replacing the KL divergence with other functions that can be computed efficiently. We introduce…
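One of the baseline approximations the paper compares against is Monte Carlo sampling: draw samples from one mixture and average the log-density ratio. The sketch below, a minimal illustration and not the paper's own code, estimates KL(f || g) for two 1-D GMMs with NumPy; the function names (`gmm_logpdf`, `kl_monte_carlo`) and the example mixtures are hypothetical.

```python
import numpy as np

def gmm_logpdf(x, weights, means, stds):
    # Log density of a 1-D Gaussian mixture at points x, via log-sum-exp
    # for numerical stability.
    x = np.asarray(x, dtype=float)[:, None]
    log_comp = (np.log(weights)
                - 0.5 * np.log(2.0 * np.pi * stds**2)
                - (x - means)**2 / (2.0 * stds**2))
    m = log_comp.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(log_comp - m).sum(axis=1, keepdims=True))).ravel()

def gmm_sample(n, weights, means, stds, rng):
    # Ancestral sampling: pick a component, then draw from its Gaussian.
    comp = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[comp], stds[comp])

def kl_monte_carlo(f, g, n=100_000, seed=0):
    # KL(f || g) ~= (1/n) * sum_i [log f(x_i) - log g(x_i)],  x_i ~ f
    rng = np.random.default_rng(seed)
    x = gmm_sample(n, *f, rng)
    return float(np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g)))

# Hypothetical example mixtures: (weights, means, stds)
f = (np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0]))
g = (np.array([1.0]), np.array([0.0]), np.array([2.0]))
print(kl_monte_carlo(f, g))
```

The estimate converges at the usual O(1/sqrt(n)) Monte Carlo rate, which is exactly why the paper investigates cheaper deterministic approximations.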
Highly Influential: this paper has highly influenced 52 other papers.
Highly Cited: this paper has 619 citations.

Citations

Publications citing this paper.
Showing 1-10 of 368 extracted citations

Security in Voice Authentication

Highly Influenced

Information theoretic novelty detection

Pattern Recognition • 2010
Highly Influenced

A combination of Gaussian Mixture Model and Support Vector Machine for speaker verification

2017 IEEE International Symposium on Medical Measurements and Applications (MeMeA) • 2017
Highly Influenced

Database and Expert Systems Applications

Lecture Notes in Computer Science • 2017
Highly Influenced

Deep neural network-guided unit selection synthesis

2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) • 2016
Highly Influenced

Jointly Informative Feature Selection Made Tractable by Gaussian Modeling

Journal of Machine Learning Research • 2016
Highly Influenced

Gaussian Mixture Models Reduction by Variational Maximum Mutual Information

IEEE Transactions on Signal Processing • 2015
Highly Influenced

Accounting for Price Dependencies in Simultaneous Sealed-Bid Auctions

AAAI Workshop: Trading Agent Design and Analysis • 2013
Highly Influenced

620 Citations

Citations per Year: Semantic Scholar estimates that this publication has 620 citations based on the available data.

References

Publications referenced by this paper.
Showing 1-10 of 15 references

Average divergence distance as a statistical discrimination measure for hidden Markov models

IEEE Transactions on Audio, Speech, and Language Processing • 2006

Fast approximation of Kullback-Leibler distance for dependence trees and hidden Markov models

Minh N. Do
IEEE Signal Processing Letters • 2003

Theory and practice of acoustic confusability

Computer Speech & Language • 2002

A general method for approximating nonlinear transformations of probability distributions

Simon Julier, Jeffrey K. Uhlmann
Technical report • 1996
