Kullback–Leibler divergence

Known as: KL divergence, KL distance, Kullback divergence
In probability theory and information theory, the Kullback–Leibler divergence, also called discrimination information (the name preferred by Kullback… 
Source: Wikipedia
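
For reference, the standard definition (well-known, not quoted from the excerpt above): for probability distributions P and Q over a common discrete space,

D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)},

with the sum replaced by an integral over densities in the continuous case.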

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2017
Lattice-free maximum mutual information (LFMMI) was recently proposed as a mixture of the ideas of hidden-Markov-model-based… 
2013
We derive a closed form solution for the Kullback-Leibler divergence between two Weibull distributions. These notes are meant as… 
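The closed form itself is not quoted in the snippet, so rather than guess at the paper's expression, here is a minimal Python sketch that estimates D_KL between two Weibull distributions by Monte Carlo; the parameter values k1, lam1, k2, lam2 are arbitrary illustrations, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def weibull_logpdf(x, k, lam):
    # log of the Weibull(shape=k, scale=lam) density for x > 0:
    # f(x) = (k/lam) * (x/lam)**(k-1) * exp(-(x/lam)**k)
    return np.log(k / lam) + (k - 1) * np.log(x / lam) - (x / lam) ** k

# Illustrative parameters only, not from the paper.
k1, lam1 = 2.0, 1.0   # distribution P
k2, lam2 = 1.5, 2.0   # distribution Q

# Monte Carlo estimate of D_KL(P || Q) = E_P[log p(X) - log q(X)].
x = lam1 * rng.weibull(k1, size=1_000_000)   # samples from P
kl_est = np.mean(weibull_logpdf(x, k1, lam1) - weibull_logpdf(x, k2, lam2))
print(f"D_KL(P || Q) ~= {kl_est:.4f}")

An estimate like this is a useful sanity check against any closed-form expression: the two should agree to within Monte Carlo error.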
2011
One of the main challenges in non-native speech recognition is how to handle acoustic variability present in multi-accented non… 
2010
We consider model-based reinforcement learning in finite Markov Decision Processes (MDPs), focussing on so-called optimistic… 
2009
Performance of wavelet thresholding methods for speech enhancement is dependent on estimating an exact threshold value in the… 
2008
Kullback–Leibler (KL) divergence is widely used as a measure of dissimilarity between two probability distributions; however, the… 
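The snippet breaks off before stating its caveat, but the best-known limitation of KL as a dissimilarity measure is that it is not symmetric (and fails the triangle inequality), so it is not a metric. A small Python illustration with arbitrarily chosen distributions:

import numpy as np

def kl(p, q):
    # Discrete D_KL(p || q); assumes q[i] > 0 wherever p[i] > 0.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.7, 0.2, 0.1]   # arbitrary example distributions
q = [0.3, 0.4, 0.3]
print(kl(p, q), kl(q, p))   # the two directions differ: KL is not symmetric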
2007
The Kullback-Leibler (K-L) divergence rate is a natural extension of the familiar K-L divergence between probability vectors, to… 
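The snippet is cut off, but for context, the divergence rate between two random processes P and Q with n-dimensional marginals p^{(n)} and q^{(n)} is standardly defined as the limit

\bar{D}(P \,\|\, Q) = \lim_{n \to \infty} \frac{1}{n} \, D_{\mathrm{KL}}\big(p^{(n)} \,\|\, q^{(n)}\big),

when that limit exists.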
2006
We introduce a new EM framework in which it is possible not only to optimize the model parameters but also the number of model… 
Highly Cited
1995
Information measures with respect to spatial locations and scales of objects in an image are important to image processing and…