Approximating the Kullback Leibler Divergence Between Gaussian Mixture Models

@article{Hershey2007ApproximatingTK,
  title={Approximating the Kullback Leibler Divergence Between Gaussian Mixture Models},
  author={J. Hershey and P. Olsen},
  journal={2007 IEEE International Conference on Acoustics, Speech and Signal Processing - ICASSP '07},
  year={2007},
  volume={4},
  pages={IV-317--IV-320}
}
The Kullback Leibler (KL) divergence is a widely used tool in statistics and pattern recognition. The KL divergence between two Gaussian mixture models (GMMs) is frequently needed in the fields of speech and image recognition. Unfortunately the KL divergence between two GMMs is not analytically tractable, nor does any efficient computational algorithm exist. Some techniques cope with this problem by replacing the KL divergence with other functions that can be computed efficiently. We introduce…
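The intractability described in the abstract is usually side-stepped by Monte Carlo estimation, the baseline the paper compares its approximations against: draw samples from f and average the log-density ratio. A minimal sketch for univariate GMMs — the `(weights, means, stds)` tuple representation and the example parameter values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def gmm_logpdf(x, weights, means, stds):
    """Log-density of a univariate Gaussian mixture, computed stably
    via the log-sum-exp trick over the mixture components."""
    x = np.asarray(x, dtype=float)[:, None]
    log_comp = (-0.5 * ((x - means) / stds) ** 2
                - np.log(stds * np.sqrt(2 * np.pi)) + np.log(weights))
    m = log_comp.max(axis=1)
    return m + np.log(np.exp(log_comp - m[:, None]).sum(axis=1))

def gmm_sample(n, weights, means, stds):
    """Draw n samples: pick a component, then sample from that Gaussian."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

def kl_monte_carlo(f, g, n=200_000):
    """D(f||g) ~ (1/n) sum_i [log f(x_i) - log g(x_i)], x_i drawn from f."""
    x = gmm_sample(n, *f)
    return float(np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g)))

# Example mixtures (illustrative parameters)
f = (np.array([0.3, 0.7]), np.array([-1.0, 2.0]), np.array([0.5, 1.0]))
g = (np.array([0.5, 0.5]), np.array([0.0, 1.5]), np.array([1.0, 1.0]))
print(kl_monte_carlo(f, g))
```

The estimate converges at the usual O(1/sqrt(n)) Monte Carlo rate, which is exactly the cost the cheaper closed-form approximations surveyed below try to avoid.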
Variational Kullback-Leibler divergence for Hidden Markov models
Two variational approximations are introduced to efficiently compute the KL divergence and Bhattacharyya divergence between two mixture models, by reducing them to the divergences between the mixture components.
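The variational approximation these works build on expresses the mixture divergence through the pairwise component KLs, D_var(f‖g) = Σ_a π_a log[ Σ_{a′} π_{a′} e^{−D(f_a‖f_{a′})} / Σ_b ω_b e^{−D(f_a‖g_b)} ], each inner D being a closed-form Gaussian KL. A sketch for univariate mixtures, again under the assumed `(weights, means, stds)` tuple representation:

```python
import numpy as np

def gauss_kl(m1, s1, m2, s2):
    """Closed-form KL divergence between two univariate Gaussians."""
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

def kl_variational(f, g):
    """Variational approximation D_var(f||g) between two univariate GMMs,
    each given as a (weights, means, stds) tuple."""
    pw, pm, ps = f
    qw, qm, qs = g
    # Pairwise component KLs via broadcasting; rows index f's components.
    d_ff = gauss_kl(pm[:, None], ps[:, None], pm[None, :], ps[None, :])
    d_fg = gauss_kl(pm[:, None], ps[:, None], qm[None, :], qs[None, :])
    num = np.exp(-d_ff) @ pw   # sum_a' pi_a'  * exp(-D(f_a || f_a'))
    den = np.exp(-d_fg) @ qw   # sum_b  omega_b * exp(-D(f_a || g_b))
    return float(pw @ np.log(num / den))

# Example mixtures (illustrative parameters)
f = (np.array([0.3, 0.7]), np.array([-1.0, 2.0]), np.array([0.5, 1.0]))
g = (np.array([0.5, 0.5]), np.array([0.0, 1.5]), np.array([1.0, 1.0]))
print(kl_variational(f, g))
```

Unlike Monte Carlo, this costs only O(|f|·|g|) Gaussian KL evaluations, and it is exactly zero when the two mixtures coincide.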
Lower and upper bounds for approximation of the Kullback-Leibler divergence between Gaussian Mixture Models
Lower and upper bounds for the KL divergence are proposed, which lead to a new approximation and interesting insights into previously proposed approximations; these bounds are also used to validate assumptions on the models.
Accelerated Monte Carlo for Kullback-Leibler divergence between Gaussian mixture models
This work shows how to accelerate Monte-Carlo sampling using variational approximations of the KL divergence using control variates and importance sampling, and can achieve improvements in accuracy equivalent to using a factor of 30 times more samples.
Comparison of approximation methods to Kullback–Leibler divergence between Gaussian mixture models for satellite image retrieval
As a probabilistic distance between two probability density functions, Kullback–Leibler divergence is widely used in many applications, such as image retrieval and change detection.
Comparison of Kullback-Leibler divergence approximation methods between Gaussian mixture models for satellite image retrieval
  • S. Cui, M. Datcu
  • Mathematics, Computer Science
  • 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS)
  • 2015
Seven methods for approximating the Kullback-Leibler divergence between two Gaussian mixture models are compared for satellite image retrieval: the Monte Carlo method, matched bound approximation, product of Gaussians, variational method, unscented transformation, Gaussian approximation, and min-Gaussian approximation.
Closed-form information-theoretic divergences for statistical mixtures
  • F. Nielsen
  • Computer Science, Mathematics
  • Proceedings of the 21st International Conference on Pattern Recognition (ICPR2012)
  • 2012
This paper states sufficient conditions on the mixture distribution family so that these novel non-KL statistical divergences between any two such mixtures can be expressed in generic closed-form formulas.
Upper and lower bounds for approximation of the Kullback-Leibler divergence between Hidden Markov models
Two novel methods for approximating the KL divergence between left-to-right transient HMMs are proposed: one is a product approximation that can be calculated recursively without introducing extra parameters; the other is based on upper and lower bounds of the KL divergence.
Comix: Joint estimation and lightspeed comparison of mixture models
A sub-class of mixture models is proposed in which the component parameters are shared across a set of mixtures and the only degree of freedom is each mixture's weight vector; this structure allows the design of extremely fast versions of existing dissimilarity measures between mixtures.
Texture Retrieval Using Cauchy-Schwarz Divergence and Generalized Gaussian Mixtures
This paper introduces the Cauchy-Schwarz divergence (CSD) in the context of texture retrieval, and proposes the CSD as a similarity measure between two MoGGs.
Consistency issues in Gaussian Mixture Models reduction algorithms
The importance of the choice of the dissimilarity measure and the issue of consistency of all steps of a reduction algorithm with the chosen measure are discussed and compared.

References

An efficient image similarity measure based on approximations of KL-divergence between two gaussian mixtures
Two new methods for approximating the Kullback-Leibler (KL) divergence between two mixtures of Gaussians are presented, based on matching between the Gaussian elements of the two Gaussian mixture densities.
Fast approximation of Kullback-Leibler distance for dependence trees and hidden Markov models
  • M. Do
  • Mathematics, Computer Science
  • IEEE Signal Processing Letters
  • 2003
A fast algorithm to approximate the Kullback-Leibler distance (KLD) between two dependence tree models is presented, which offers a saving of hundreds of times in computational complexity compared to the commonly used Monte Carlo method.
Average divergence distance as a statistical discrimination measure for hidden Markov models
The notion of average divergence distance (ADD) is proposed as a statistical discrimination measure between two HMMs, considering the transient behavior of these models, and it is shown that ADD provides a coherent way to evaluate the discrimination dissimilarity between acoustic models.
Theory and practice of acoustic confusability
This paper defines two alternatives to the familiar perplexity statistic, respectively acoustic perplexity and the synthetic acoustic word error rate, and shows how to compute these statistics by effectively synthesizing a large acoustic corpus.
An Introduction to Variational Methods for Graphical Models
This paper presents a tutorial introduction to the use of variational methods for inference and learning in graphical models (Bayesian networks and Markov random fields), and describes a general framework for generating variational transformations based on convex duality.
An efficient integrated gender detection scheme and time mediated averaging of gender dependent acoustic models
This paper shows how to discover which phonemes are inherently similar for male and female speakers and how to efficiently share this information between gender-dependent GMMs; a highly accurate and computationally efficient gender detection scheme is also suggested.
Computation of channel capacity and rate-distortion functions
  • R. Blahut
  • Mathematics, Computer Science
  • IEEE Trans. Inf. Theory
  • 1972
A simple algorithm for computing channel capacity is suggested that consists of a mapping from the set of channel input probability vectors into itself such that the sequence of probability vectors generated by successive applications of the mapping converges to the vector that achieves the capacity of the given channel.
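The fixed-point iteration this reference describes fits in a few lines. The sketch below assumes a strictly positive transition matrix; the binary-symmetric-channel example is ours, not from the paper:

```python
import numpy as np

def blahut_arimoto(P, iters=200):
    """Channel capacity (in bits) of a discrete memoryless channel with
    strictly positive transition matrix P[x, y] = p(y|x), computed by the
    Blahut-Arimoto fixed-point iteration on the input distribution."""
    n_in = P.shape[0]
    r = np.full(n_in, 1.0 / n_in)              # start from the uniform input
    for _ in range(iters):
        q = r[:, None] * P                      # joint p(x, y)
        q /= q.sum(axis=0, keepdims=True)       # posterior q(x|y)
        # Update: r(x) proportional to exp( sum_y p(y|x) log q(x|y) )
        log_r = np.sum(P * np.log(q), axis=1)
        r = np.exp(log_r - log_r.max())
        r /= r.sum()
    p_y = r @ P                                 # output distribution at the fixed point
    return float(np.sum(r[:, None] * P * np.log2(P / p_y)))

# Binary symmetric channel, crossover 0.1: capacity = 1 - H2(0.1), about 0.531 bits
P = np.array([[0.9, 0.1], [0.1, 0.9]])
print(blahut_arimoto(P))
```

Each sweep alternates between the posterior q(x|y) induced by the current input distribution and the input distribution that maximizes mutual information against that posterior, which is exactly the mapping described above.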
A General Method for Approximating Nonlinear Transformations of Probability Distributions
In this paper we describe a new approach for generalised nonlinear filtering. We show that the technique is more accurate, more stable, and far easier to implement than an extended Kalman filter. Several…
An algorithm for computing the capacity of arbitrary discrete memoryless channels
  • S. Arimoto
  • Mathematics, Computer Science
  • IEEE Trans. Inf. Theory
  • 1972
A systematic and iterative method of computing the capacity of arbitrary discrete memoryless channels is presented and a few inequalities that give upper and lower bounds on the capacity are derived.
Elements of Information Theory
The author examines the role of entropy, inequality, and randomness in the design and construction of codes in a rapidly changing environment.