Kullback–Leibler divergence

Also known as: KL divergence, KL distance, Kullback divergence
In probability theory and information theory, the Kullback–Leibler divergence, also called discrimination information (the name preferred by Kullback), is a measure of how one probability distribution differs from a second, reference probability distribution. (Source: Wikipedia)
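
For discrete distributions P and Q on the same support, the divergence is D_KL(P‖Q) = Σ_i P(i) log(P(i)/Q(i)); it is non-negative, zero only when P = Q, and not symmetric. A minimal Python sketch of the discrete case (illustrative code, not drawn from any of the papers below):

    import numpy as np

    def kl_divergence(p, q):
        """D_KL(P || Q) for discrete distributions given as probability vectors.

        Assumes p and q share a support and q > 0 wherever p > 0; terms with
        p[i] == 0 contribute nothing by the usual 0 * log 0 = 0 convention.
        """
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    # The divergence is asymmetric: D(P||Q) != D(Q||P) in general.
    p = [0.5, 0.4, 0.1]
    q = [1/3, 1/3, 1/3]
    print(kl_divergence(p, q), kl_divergence(q, p))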

Topic mentions per year

[Chart: mentions per year, 1957–2018; vertical axis 0–600]

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.

Review, 2014
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up…
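
The Rényi divergence of order α (α > 0, α ≠ 1) between discrete distributions can be written as D_α(P‖Q) = (1/(α−1)) log Σ_i p_i^α q_i^(1−α), and it recovers the Kullback-Leibler divergence in the limit α → 1. A small sketch of that limit (illustrative code, not taken from the paper):

    import numpy as np

    def renyi_divergence(p, q, alpha):
        """Rényi divergence of order alpha (alpha > 0, alpha != 1) between
        discrete distributions p and q, assumed to have full support."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

    def kl_divergence(p, q):
        p, q = np.asarray(p, float), np.asarray(q, float)
        return np.sum(p * np.log(p / q))

    p, q = [0.5, 0.4, 0.1], [0.2, 0.3, 0.5]
    # As alpha -> 1 the Rényi divergence approaches the KL divergence.
    for alpha in (0.5, 0.9, 0.999):
        print(alpha, renyi_divergence(p, q, alpha))
    print("KL:", kl_divergence(p, q))
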
2012
In this paper we study distributionally robust optimization (DRO) problems where the ambiguity set of the probability…
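
When the ambiguity set is a KL ball of radius ε around a reference distribution P, the worst-case expected loss admits a well-known one-dimensional dual, sup over {Q : KL(Q‖P) ≤ ε} of E_Q[ℓ] equals inf over α > 0 of { α log E_P[exp(ℓ/α)] + α ε }, a consequence of the Donsker–Varadhan variational formula. The sketch below evaluates this dual on an empirical sample; it illustrates the general identity and is not necessarily the formulation studied in this paper.

    import numpy as np
    from scipy.optimize import minimize_scalar

    def worst_case_expectation(losses, eps):
        """Dual value of sup_{Q : KL(Q || P_n) <= eps} E_Q[loss], where P_n is
        the empirical distribution of `losses`:
        inf_{alpha > 0} alpha * log E[exp(loss / alpha)] + alpha * eps."""
        losses = np.asarray(losses, float)

        def dual(log_alpha):
            alpha = np.exp(log_alpha)              # keeps alpha > 0
            z = losses / alpha
            # numerically stable log-mean-exp
            lme = np.log(np.mean(np.exp(z - z.max()))) + z.max()
            return alpha * lme + alpha * eps

        return minimize_scalar(dual).fun           # unconstrained in log(alpha)

    rng = np.random.default_rng(0)
    samples = rng.normal(loc=1.0, scale=2.0, size=10_000)
    # Larger than the plain sample mean (about 1.0), as robustness requires.
    print(worst_case_expectation(samples, eps=0.1))
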
2008
We present a method for estimating the KL divergence between continuous densities and we prove it converges almost surely…
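
A common nonparametric construction for this task (in the spirit of, though not necessarily identical to, the estimator analysed in this paper) uses nearest-neighbour distances: with samples X_1, …, X_n ~ p and Y_1, …, Y_m ~ q in R^d, take D̂(p‖q) = (d/n) Σ_i log(ν_i/ρ_i) + log(m/(n−1)), where ρ_i is the distance from X_i to its nearest other X and ν_i its distance to the nearest Y. A sketch:

    import numpy as np
    from scipy.spatial import cKDTree

    def knn_kl_estimate(x, y):
        """1-nearest-neighbour estimate of D_KL(p || q) from samples x ~ p, y ~ q.
        x: (n, d) array, y: (m, d) array; assumes continuous data (no ties)."""
        x, y = np.atleast_2d(x), np.atleast_2d(y)
        n, d = x.shape
        m = y.shape[0]
        rho = cKDTree(x).query(x, k=2)[0][:, 1]   # nearest *other* point in x
        nu = cKDTree(y).query(x, k=1)[0]          # nearest point in y
        return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

    rng = np.random.default_rng(1)
    x = rng.normal(0.0, 1.0, size=(5000, 1))
    y = rng.normal(1.0, 1.0, size=(5000, 1))
    # True KL between N(0,1) and N(1,1) is 0.5.
    print(knn_kl_estimate(x, y))
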
2008
A method of detecting changes or anomalies in periodic information-carrying signals or any other sets of data using Kullback…
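
As a rough illustration of the general idea (a generic sketch, not the specific method of this paper): histogram a reference window of the signal, histogram each later window on the same bins, and use the KL divergence from the reference as an anomaly score.

    import numpy as np

    def kl_hist(p_counts, q_counts, smoothing=1e-6):
        """KL divergence between two histograms after normalisation;
        a small additive smoothing avoids log(0) on empty bins."""
        p = np.asarray(p_counts, float) + smoothing
        q = np.asarray(q_counts, float) + smoothing
        p, q = p / p.sum(), q / q.sum()
        return float(np.sum(p * np.log(p / q)))

    def anomaly_scores(signal, window, bins):
        """Score each non-overlapping window of `signal` against the first one."""
        edges = np.histogram_bin_edges(signal, bins=bins)
        ref, _ = np.histogram(signal[:window], bins=edges)
        scores = []
        for start in range(window, len(signal) - window + 1, window):
            win, _ = np.histogram(signal[start:start + window], bins=edges)
            scores.append(kl_hist(win, ref))
        return np.array(scores)

    rng = np.random.default_rng(2)
    t = np.arange(10_000)
    signal = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)
    signal[6000:6500] += 1.5          # injected level shift
    print(anomaly_scores(signal, window=500, bins=30).round(3))
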
Highly Cited, 2007
The Akaike information criterion (AIC) is a widely used tool for model selection. AIC is derived as an asymptotically unbiased…
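
Concretely, AIC = 2k − 2 ln L̂, where k is the number of fitted parameters and L̂ is the maximised likelihood; among candidate models, the one with the smallest AIC is preferred. A minimal sketch comparing two two-parameter fits (an illustration of the criterion, not the paper's experiments):

    import numpy as np
    from scipy import stats

    def aic(log_likelihood, k):
        """Akaike information criterion: 2k - 2 * ln(maximised likelihood)."""
        return 2 * k - 2 * log_likelihood

    rng = np.random.default_rng(3)
    data = rng.normal(loc=2.0, scale=1.5, size=500)

    # Candidate 1: Gaussian, 2 fitted parameters (location, scale).
    mu, sigma = stats.norm.fit(data)
    ll_norm = stats.norm.logpdf(data, mu, sigma).sum()

    # Candidate 2: Laplace, 2 fitted parameters (location, scale).
    loc, scale = stats.laplace.fit(data)
    ll_lap = stats.laplace.logpdf(data, loc, scale).sum()

    print("AIC normal :", aic(ll_norm, k=2))
    print("AIC laplace:", aic(ll_lap, k=2))   # lower AIC => preferred model
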
Highly Cited, 2007
The Kullback-Leibler (KL) divergence is a widely used tool in statistics and pattern recognition. The KL divergence between two…
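
When the two distributions do not admit a closed-form divergence (Gaussian mixtures are the classic example), a standard baseline is the Monte Carlo estimate D(f‖g) ≈ (1/N) Σ_i [log f(x_i) − log g(x_i)] with x_i drawn from f. A sketch on two one-dimensional Gaussian mixtures (an illustrative setting, not necessarily the one treated in this paper):

    import numpy as np
    from scipy import stats
    from scipy.special import logsumexp

    def mixture_logpdf(x, weights, means, stds):
        """Log-density of a 1-D Gaussian mixture at the points x."""
        comp = np.stack([stats.norm.logpdf(x, m, s) + np.log(w)
                         for w, m, s in zip(weights, means, stds)])
        return logsumexp(comp, axis=0)

    def mixture_sample(rng, n, weights, means, stds):
        idx = rng.choice(len(weights), size=n, p=weights)
        return rng.normal(np.asarray(means)[idx], np.asarray(stds)[idx])

    def mc_kl(rng, f, g, n=100_000):
        """Monte Carlo estimate of D_KL(f || g): mean of log f(x) - log g(x), x ~ f."""
        x = mixture_sample(rng, n, *f)
        return float(np.mean(mixture_logpdf(x, *f) - mixture_logpdf(x, *g)))

    rng = np.random.default_rng(4)
    f = ([0.6, 0.4], [-1.0, 2.0], [0.5, 1.0])   # weights, means, std devs
    g = ([0.5, 0.5], [0.0, 2.5], [1.0, 1.0])
    print(mc_kl(rng, f, g))
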
Highly Cited, 2004
In this work, we provide a computable expression for the Kullback-Leibler divergence rate lim_{n→∞} (1/n) D(p…
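
For finite-state, time-homogeneous Markov sources with transition matrices P and Q, where P is irreducible with stationary distribution π and Q_ij > 0 wherever P_ij > 0, the divergence rate reduces to Σ_i π_i Σ_j P_ij log(P_ij / Q_ij). The sketch below computes that stationary special case (not the paper's more general expression):

    import numpy as np

    def stationary_distribution(P):
        """Stationary distribution of an irreducible row-stochastic matrix P."""
        vals, vecs = np.linalg.eig(P.T)
        pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
        return pi / pi.sum()

    def kl_rate(P, Q):
        """KL divergence rate between stationary Markov chains with transition
        matrices P and Q: sum_i pi_i sum_j P_ij * log(P_ij / Q_ij)."""
        P, Q = np.asarray(P, float), np.asarray(Q, float)
        pi = stationary_distribution(P)
        log_ratio = np.zeros_like(P)
        mask = P > 0
        log_ratio[mask] = np.log(P[mask] / Q[mask])
        return float(np.sum(pi[:, None] * P * log_ratio))

    P = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    Q = np.array([[0.7, 0.3],
                  [0.3, 0.7]])
    print(kl_rate(P, Q))
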
Highly Cited, 2003
Over the last years significant efforts have been made to develop kernels that can be applied to sequence data such as DNA, text…
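
One common way to turn the (asymmetric) KL divergence into a kernel-style similarity for such data is to symmetrise and exponentiate it, k(p, q) = exp(−a [D(p‖q) + D(q‖p)]), applied to smoothed probability profiles (e.g. n-gram frequencies of a sequence). The snippet above does not say which construction this paper uses; the sketch below only illustrates this generic form.

    import numpy as np

    def kl(p, q):
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def symmetric_kl_kernel(counts_p, counts_q, a=1.0, smoothing=1e-6):
        """exp(-a * (D(p||q) + D(q||p))) on smoothed, normalised count vectors.
        Symmetric and bounded in (0, 1]; positive definiteness is not guaranteed
        in general and typically has to be checked or enforced for SVM use."""
        p = np.asarray(counts_p, float) + smoothing
        q = np.asarray(counts_q, float) + smoothing
        p, q = p / p.sum(), q / q.sum()
        return np.exp(-a * (kl(p, q) + kl(q, p)))

    # Toy example: bigram counts of two short sequences (hypothetical data).
    counts_1 = np.array([5, 0, 2, 9, 1], float)
    counts_2 = np.array([4, 1, 3, 8, 0], float)
    print(symmetric_kl_kernel(counts_1, counts_2))
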
2002
We propose a texture similarity measure based on the Kullback-Leibler divergence between gamma distributions (KLGamma). We…
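
Part of the appeal of a gamma model here is that the divergence has a closed form: for shapes a1, a2 and rates b1, b2, D_KL(Gamma(a1, b1) ‖ Gamma(a2, b2)) = (a1 − a2) ψ(a1) − ln Γ(a1) + ln Γ(a2) + a2 ln(b1/b2) + a1 (b2 − b1)/b1, with ψ the digamma function. A sketch (rate parameterisation assumed; the paper's exact parameterisation may differ):

    import numpy as np
    from scipy.special import gammaln, digamma

    def kl_gamma(a1, b1, a2, b2):
        """D_KL( Gamma(shape=a1, rate=b1) || Gamma(shape=a2, rate=b2) )."""
        return ((a1 - a2) * digamma(a1)
                - gammaln(a1) + gammaln(a2)
                + a2 * np.log(b1 / b2)
                + a1 * (b2 - b1) / b1)

    print(kl_gamma(2.0, 1.0, 2.0, 1.0))   # identical distributions -> 0
    print(kl_gamma(2.0, 1.0, 3.0, 0.5))   # asymmetric in general
    print(kl_gamma(3.0, 0.5, 2.0, 1.0))
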
Highly Cited, 2000
We define a new distance measure, the resistor-average distance, between two probability distributions that is closely related to…
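
The name suggests the two directed KL divergences combined like resistors in parallel, R(p, q) = D(p‖q) D(q‖p) / (D(p‖q) + D(q‖p)); the sketch below assumes that reading (consult the paper for the exact definition).

    import numpy as np

    def kl(p, q):
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def resistor_average(p, q):
        """Parallel-resistor (harmonic-style) combination of the two directed
        KL divergences; symmetric, and zero only when both divergences are zero."""
        d_pq, d_qp = kl(p, q), kl(q, p)
        if d_pq == 0.0 and d_qp == 0.0:
            return 0.0                    # identical distributions
        return (d_pq * d_qp) / (d_pq + d_qp)

    p, q = [0.7, 0.2, 0.1], [0.3, 0.4, 0.3]
    print(kl(p, q), kl(q, p), resistor_average(p, q))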