
Kullback–Leibler divergence

Known as: KL-divergence, KL-distance, Kullback divergence
In probability theory and information theory, the Kullback–Leibler divergence, also called discrimination information (the name preferred by Kullback… 
Wikipedia
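For concreteness, the standard discrete-case definition is D(P‖Q) = Σᵢ pᵢ log(pᵢ/qᵢ). A minimal sketch in Python (function name and example distributions are illustrative, not from any of the papers below):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    Standard definition: sum_i p_i * log(p_i / q_i).
    Assumes q_i > 0 wherever p_i > 0; terms with p_i == 0 contribute 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # small positive value
print(kl_divergence(p, p))  # 0.0 -- a distribution has zero divergence from itself
```

By Gibbs' inequality the result is always non-negative, and zero exactly when the two distributions coincide.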

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2013
We derive a closed form solution for the Kullback-Leibler divergence between two Weibull distributions. These notes are meant as… 
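The snippet does not quote the closed form itself, so rather than reproduce it, here is a hedged Monte Carlo sketch of the same quantity — the KL divergence between two Weibull distributions — estimated as E_P[log p(X) − log q(X)] (parameterization and helper names are my own, not the paper's):

```python
import math
import random

def weibull_pdf(x, scale, shape):
    # Weibull density f(x) = (k/lam) * (x/lam)**(k-1) * exp(-(x/lam)**k)
    return (shape / scale) * (x / scale) ** (shape - 1) * math.exp(-((x / scale) ** shape))

def kl_weibull_mc(scale1, shape1, scale2, shape2, n=100_000, seed=0):
    # Monte Carlo estimate of D(P || Q) = E_P[log p(X) - log q(X)],
    # sampling X from the first Weibull distribution.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.weibullvariate(scale1, shape1)  # alpha = scale, beta = shape
        total += math.log(weibull_pdf(x, scale1, shape1)) - math.log(weibull_pdf(x, scale2, shape2))
    return total / n
```

A closed-form solution like the one the paper derives would give this value exactly; the estimator above is just a way to check such a formula numerically.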
2013
Salient object detection is a computer vision technique that filters out redundant visual information and considers potentially… 
2011
One of the main challenges in non-native speech recognition is how to handle acoustic variability present in multi-accented non… 
2010
We consider model-based reinforcement learning in finite Markov Decision Processes (MDPs), focussing on so-called optimistic… 
2008
In this paper, we define a similarity measure between images in the context of (indexing and) retrieval. We use the Kullback… 
2008
Kullback Leibler (KL) divergence is widely used as a measure of dissimilarity between two probability distributions; however, the… 
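The abstract is cut off before naming the drawback it addresses; one limitation that is frequently raised in this context (and that I am certain of, whatever this particular paper's focus) is that KL divergence is asymmetric, so it is not a true distance. A short check, with Jeffreys divergence as one standard symmetrization:

```python
import math

def kl(p, q):
    # D(p || q) = sum_i p_i * log(p_i / q_i); assumes q_i > 0 wherever p_i > 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p, q = [0.9, 0.1], [0.5, 0.5]
print(kl(p, q))  # ~0.368
print(kl(q, p))  # ~0.511 -- differs from the above: KL is not symmetric

def jeffreys(p, q):
    # Jeffreys divergence: a simple symmetrized variant, one of several in the literature
    return kl(p, q) + kl(q, p)
```

Because kl(p, q) ≠ kl(q, p) in general, applications that need a symmetric dissimilarity either symmetrize as above or use a related measure such as the Jensen-Shannon divergence.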
2007
Human gene expression can be regulated at four levels: differential gene transcription, selective nuclear RNA (nRNA) processing… 
2007
Divergence measures are widely used tools in statistics and pattern recognition. The Kullback-Leibler (KL) divergence between two… 
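The sentence is truncated before saying which pair of distributions is meant. As background: between two single (univariate) Gaussians the KL divergence has a well-known closed form, D(N(μ₁,σ₁²) ‖ N(μ₂,σ₂²)) = ln(σ₂/σ₁) + (σ₁² + (μ₁−μ₂)²)/(2σ₂²) − ½, sketched below (this is the standard textbook formula, not necessarily this paper's contribution):

```python
import math

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    # Closed-form KL divergence D(N(mu1, sigma1^2) || N(mu2, sigma2^2))
    # for univariate Gaussians with standard deviations sigma1, sigma2 > 0.
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

print(kl_gaussian(0.0, 1.0, 0.0, 1.0))  # 0.0 for identical Gaussians
print(kl_gaussian(0.0, 1.0, 1.0, 1.0))  # 0.5 -- mean shift of 1 at unit variance
```

For mixtures of Gaussians, by contrast, no closed form exists, which is why approximations are an active topic in this literature.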
Highly Cited
1995
Information measures with respect to spatial locations and scales of objects in an image are important to image processing and…