Approximating the Kullback Leibler Divergence Between Gaussian Mixture Models

@article{Hershey2007ApproximatingTK,
  title={Approximating the Kullback Leibler Divergence Between Gaussian Mixture Models},
  author={John R. Hershey and Peder A. Olsen},
  journal={2007 IEEE International Conference on Acoustics, Speech and Signal Processing - ICASSP '07},
  year={2007},
  volume={4},
  pages={IV-317-IV-320}
}
The Kullback Leibler (KL) divergence is a widely used tool in statistics and pattern recognition. The KL divergence between two Gaussian mixture models (GMMs) is frequently needed in the fields of speech and image recognition. Unfortunately the KL divergence between two GMMs is not analytically tractable, nor does any efficient computational algorithm exist. Some techniques cope with this problem by replacing the KL divergence with other functions that can be computed efficiently. We introduce…
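The brute-force baseline against which such approximations are judged is Monte Carlo sampling: draw samples from the first mixture f and average log f(x) − log g(x). A minimal sketch for one-dimensional mixtures, assuming NumPy (the function names `gmm_logpdf`, `gmm_sample`, and `mc_kl` are illustrative, not taken from the paper):

```python
import numpy as np

def gmm_logpdf(x, weights, means, stds):
    """Log-density of a 1-D Gaussian mixture at the points x."""
    x = np.asarray(x)[:, None]                      # shape (n, 1) vs (k,) components
    log_comp = (
        -0.5 * ((x - means) / stds) ** 2
        - np.log(stds * np.sqrt(2.0 * np.pi))
        + np.log(weights)
    )
    # log-sum-exp over components for numerical stability
    m = log_comp.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(log_comp - m).sum(axis=1, keepdims=True))).ravel()

def gmm_sample(n, weights, means, stds, rng):
    """Draw n samples: pick a component by weight, then sample that Gaussian."""
    comps = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[comps], stds[comps])

def mc_kl(f, g, n=100_000, seed=0):
    """Monte Carlo estimate of KL(f || g), with f and g as (weights, means, stds)."""
    rng = np.random.default_rng(seed)
    x = gmm_sample(n, *f, rng)
    return np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g))
```

This estimator converges to the true divergence as n grows, but needs many samples for a stable answer; that cost is exactly what motivates the cheaper closed-form approximations the paper studies.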

Citations

Publications citing this paper.
SHOWING 1-10 OF 440 CITATIONS, ESTIMATED 24% COVERAGE

Security in Voice Authentication

  • Cites background & methods; highly influenced

Information theoretic novelty detection

  • Pattern Recognition
  • 2010
  • Cites methods, results & background; highly influenced

Variational Memory Encoder-Decoder

  • Cites background & methods; highly influenced

A combination of Gaussian Mixture Model and Support Vector Machine for speaker verification

  • 2017 IEEE International Symposium on Medical Measurements and Applications (MeMeA)
  • 2017
  • Cites methods & background; highly influenced

Database and Expert Systems Applications

  • Lecture Notes in Computer Science
  • 2017
  • Cites methods & background; highly influenced

CITATION STATISTICS

  • 66 highly influenced citations

  • Averaged 45 citations per year over the last 3 years

  • 2% increase in citations per year in 2018 over 2017

References

Publications referenced by this paper.
SHOWING 1-10 OF 15 REFERENCES

Fast approximation of Kullback-Leibler distance for dependence trees and hidden Markov models

Minh N. Do
  • IEEE Signal Processing Letters, vol. 10, no. 4, pp. 115-118, April 2003.
  • 2003

A general method for approximating nonlinear transformations of probability distributions

Simon Julier, Jeffrey K. Uhlmann
  • Tech. report
  • 1996
