
Kullback–Leibler divergence

Known as: KL divergence, KL distance, Kullback divergence
In probability theory and information theory, the Kullback–Leibler divergence, also called discrimination information (the name preferred by Kullback…
Wikipedia
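
For discrete distributions the divergence is D_KL(P ‖ Q) = Σ_x P(x) log(P(x)/Q(x)). A minimal sketch of that sum (the function name kl_divergence and the zero-handling conventions are ours):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_x P(x) * log(P(x) / Q(x)) for discrete distributions.

    Terms with p == 0 contribute 0 by convention; any q == 0 where p > 0
    makes the divergence infinite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    if np.any(q[mask] == 0):
        return np.inf
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # > 0, and != kl_divergence(q, p): KL is asymmetric
```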

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2011
This letter describes algorithms for nonnegative matrix factorization (NMF) with the β-divergence (β-NMF). The β-divergence is a…
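
The β-divergence interpolates between the Itakura–Saito (β = 0), generalized KL (β = 1), and squared Euclidean (β = 2) costs. A sketch of the plain heuristic multiplicative updates for β-NMF; the letter's actual algorithms (e.g. exponent choices with convergence guarantees) may differ:

```python
import numpy as np

def beta_nmf(V, rank, beta=1.0, n_iter=200, seed=0):
    """Heuristic multiplicative updates for NMF under the beta-divergence.

    beta=2: squared Euclidean, beta=1: generalized KL, beta=0: Itakura-Saito.
    (This is the plain heuristic form, not necessarily the cited letter's
    exact variant.)
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + 1e-3
    H = rng.random((rank, n)) + 1e-3
    for _ in range(n_iter):
        WH = W @ H
        H *= (W.T @ (WH ** (beta - 2) * V)) / (W.T @ WH ** (beta - 1))
        WH = W @ H
        W *= ((WH ** (beta - 2) * V) @ H.T) / (WH ** (beta - 1) @ H.T)
    return W, H

V = np.abs(np.random.default_rng(1).random((20, 30)))
W, H = beta_nmf(V, rank=5, beta=1.0)
```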
Highly Cited
2008
We present a method for estimating the KL divergence between continuous densities and we prove it converges almost surely…
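
A common nonparametric route to such sample-based estimation is the nearest-neighbor estimator D̂ = (d/n) Σ_i log(ν_i/ρ_i) + log(m/(n−1)), where ρ_i and ν_i are nearest-neighbor distances within and across the two samples; whether this matches the paper's exact construction is an assumption. A sketch:

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_estimate(x, y):
    """1-nearest-neighbor estimate of D_KL(P || Q) from samples x ~ P, y ~ Q.

    rho_i: distance from x_i to its nearest neighbor among the other x's;
    nu_i:  distance from x_i to its nearest neighbor among the y's.
    """
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]
    rho = cKDTree(x).query(x, k=2)[0][:, 1]  # k=2 to skip the point itself
    nu = cKDTree(y).query(x, k=1)[0]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(5000, 1))
y = rng.normal(0.5, 1.0, size=(5000, 1))
print(knn_kl_estimate(x, y))  # true value: 0.5**2 / 2 = 0.125
```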
Highly Cited
2007
The Kullback–Leibler (KL) divergence is a widely used tool in statistics and pattern recognition. The KL divergence between two…
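
For some familiar families the KL divergence has a closed form, e.g. between two univariate Gaussians; when it does not, a standard fallback (not necessarily the paper's method) is the Monte Carlo estimate E_{x~p}[log p(x) − log q(x)]. A sketch of both:

```python
import numpy as np

def kl_gauss(mu1, s1, mu2, s2):
    """Closed-form D_KL(N(mu1, s1^2) || N(mu2, s2^2))."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2) ** 2) / (2 * s2**2) - 0.5

# Monte Carlo fallback for densities without a closed form:
# D_KL(p || q) = E_{x~p}[log p(x) - log q(x)].
rng = np.random.default_rng(0)
mu1, s1, mu2, s2 = 0.0, 1.0, 1.0, 2.0
x = rng.normal(mu1, s1, size=200_000)
logp = -0.5 * ((x - mu1) / s1) ** 2 - np.log(s1 * np.sqrt(2 * np.pi))
logq = -0.5 * ((x - mu2) / s2) ** 2 - np.log(s2 * np.sqrt(2 * np.pi))
print(np.mean(logp - logq), kl_gauss(mu1, s1, mu2, s2))  # both ~ 0.443
```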
Highly Cited
2005
This paper presents a unifying view of message-passing algorithms as methods to approximate a complex Bayesian network by a…
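
A recurring primitive in such divergence-minimization views of message passing (e.g. expectation propagation) is projecting an intractable distribution onto a tractable family by minimizing KL(p ‖ q); over a Gaussian family this reduces to moment matching. A sketch of that standard result, with a hypothetical truncated-Gaussian target:

```python
import numpy as np

# Minimizing KL(p || q) over a Gaussian family q reduces to moment matching:
# the optimal q carries p's mean and variance. This projection is the basic
# step in expectation-propagation-style message passing.
grid = np.linspace(-10, 10, 4001)
dx = grid[1] - grid[0]
p = np.exp(-0.5 * grid**2) * (grid > 0)  # a truncated (non-Gaussian) target
p /= p.sum() * dx                        # normalize on the grid

mean = np.sum(grid * p) * dx
var = np.sum((grid - mean) ** 2 * p) * dx
print(mean, var)  # N(mean, var) is the KL(p||q)-minimizing Gaussian
```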
Highly Cited
2004
A wide variety of distortion functions, such as squared Euclidean distance, Mahalanobis distance, Itakura-Saito distance and…
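
These all arise as Bregman divergences D_φ(x, y) = φ(x) − φ(y) − ⟨∇φ(y), x − y⟩ for suitable convex φ. A minimal sketch (helper names are ours):

```python
import numpy as np

def bregman(phi, grad_phi, x, y):
    """Bregman divergence D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

x = np.array([0.2, 0.5, 0.3])
y = np.array([0.4, 0.4, 0.2])

# phi(x) = ||x||^2      -> squared Euclidean distance
print(bregman(lambda v: v @ v, lambda v: 2 * v, x, y))
# phi(x) = sum x log x  -> (generalized) KL divergence
print(bregman(lambda v: np.sum(v * np.log(v)),
              lambda v: np.log(v) + 1, x, y))
# phi(x) = -sum log x   -> Itakura-Saito distance
print(bregman(lambda v: -np.sum(np.log(v)), lambda v: -1 / v, x, y))
```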
Highly Cited
2003
Over the last few years, significant efforts have been made to develop kernels that can be applied to sequence data such as DNA, text…
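
One simple way to build such a kernel from KL divergence is to symmetrize and exponentiate it, K(p, q) = exp(−a [D(p‖q) + D(q‖p)]); treating this as the cited paper's exact construction is an assumption. A sketch:

```python
import numpy as np

def kl(p, q):
    return np.sum(p * np.log(p / q))

def kl_kernel(p, q, a=1.0):
    """Similarity from a symmetrized KL divergence:
    exp(-a * (KL(p||q) + KL(q||p))).

    Exponentiated-divergence kernels are not positive definite in general,
    so SVM use typically needs care (e.g. checking the Gram matrix).
    """
    return np.exp(-a * (kl(p, q) + kl(q, p)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl_kernel(p, p), kl_kernel(p, q))  # 1.0 for identical inputs, < 1 otherwise
```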
Highly Cited
2003
We introduce a Kullback–Leibler (1968)-type distance between spectral density functions of stationary stochastic processes and…
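
The snippet does not reproduce the exact definition, but divergences of the form ∫ (f log(f/g) − f + g) dθ/2π are the natural generalized-KL candidates for (possibly unnormalized) spectral densities; the sketch below is illustrative only, not the paper's definition:

```python
import numpy as np

def spectral_kl(f, g, theta):
    """KL-type divergence between spectral densities on a frequency grid:
    integral of (f*log(f/g) - f + g) dtheta / (2*pi). The generalized-KL
    correction terms (-f + g) keep it valid for unnormalized densities.
    """
    integrand = f * np.log(f / g) - f + g
    return np.sum(integrand) * (theta[1] - theta[0]) / (2 * np.pi)

theta = np.linspace(-np.pi, np.pi, 2001)
# Power spectra of two AR(1) processes x_t = a * x_{t-1} + e_t:
spec = lambda a: 1.0 / np.abs(1 - a * np.exp(-1j * theta)) ** 2
print(spectral_kl(spec(0.5), spec(0.3), theta))
```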
Highly Cited
2001
We describe an information-theoretic paradigm for analysis of ecological data, based on Kullback–Leibler information, that is an…
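
In this literature, Kullback–Leibler information typically enters model selection through AIC, which estimates relative expected KL information loss across candidate models. A standard sketch of AIC and Akaike weights, with hypothetical fit values:

```python
import numpy as np

def aic(log_likelihood, k):
    """Akaike's information criterion: an estimator of relative expected
    Kullback-Leibler information loss for a fitted model with k parameters."""
    return -2 * log_likelihood + 2 * k

# Akaike weights: evidence for each model relative to the candidate set.
log_liks = np.array([-120.3, -118.9, -118.7])  # hypothetical fitted values
ks = np.array([2, 3, 5])                       # parameter counts
aics = aic(log_liks, ks)
delta = aics - aics.min()
weights = np.exp(-delta / 2) / np.sum(np.exp(-delta / 2))
print(weights)
```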
Highly Cited
2001
We define a new distance measure, the resistor-average distance, between two probability distributions that is closely related to…
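
The name reflects how the two directed KL divergences are combined, the way parallel resistors are: 1/R(p, q) = 1/D(p‖q) + 1/D(q‖p). A sketch:

```python
import numpy as np

def kl(p, q):
    return np.sum(p * np.log(p / q))

def resistor_average(p, q):
    """Resistor-average distance: the two directed KL divergences combined
    like parallel resistors, 1/R = 1/KL(p||q) + 1/KL(q||p)."""
    a, b = kl(p, q), kl(q, p)
    return a * b / (a + b)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(resistor_average(p, q))  # symmetric in p and q, unlike KL itself
```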
Highly Cited
1991
Jianhua Lin, IEEE Trans. Inf. Theory, 1991. Corpus ID: 12121632
A novel class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known…
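
The best-known member of this class is the Jensen–Shannon divergence, JSD(P, Q) = ½ KL(P‖M) + ½ KL(Q‖M) with M = (P + Q)/2, which is symmetric and always finite. A sketch:

```python
import numpy as np

def kl(p, q):
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def jsd(p, q):
    """Jensen-Shannon divergence: symmetric, bounded by log 2, and finite
    even where p and q have different supports (unlike KL itself)."""
    m = (p + q) / 2
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.5, 0.5, 0.0])
q = np.array([0.0, 0.5, 0.5])
print(jsd(p, q), jsd(q, p))  # equal; KL in either direction would be infinite
```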