
Kullback–Leibler divergence

Known as: KL divergence, KL distance, Kullback divergence
In probability theory and information theory, the Kullback–Leibler divergence, also called discrimination information (the name preferred by Kullback… 
— Wikipedia
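To make the definition above concrete, the following is a minimal sketch of the discrete KL divergence D(P ‖ Q) = Σ pᵢ log(pᵢ/qᵢ) in plain Python; the function name `kl_divergence` and the example distributions are illustrative, not from any paper listed below.

```python
import math

def kl_divergence(p, q):
    """Discrete KL divergence D(P || Q) in nats.

    p, q: sequences of probabilities over the same outcomes.
    Terms with p_i == 0 contribute 0 (by convention 0 * log 0 = 0);
    any outcome with p_i > 0 but q_i == 0 makes the divergence infinite.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:
            if qi == 0:
                return math.inf
            total += pi * math.log(pi / qi)
    return total

# KL divergence is asymmetric: D(P || Q) != D(Q || P) in general,
# which is why it is a "divergence" rather than a metric distance.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))
print(kl_divergence(q, p))
```

Note that D(P ‖ P) = 0 and D(P ‖ Q) ≥ 0 always (Gibbs' inequality), but symmetry and the triangle inequality fail, so KL is not a true distance despite the alias "KL distance" above.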

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2020
We propose a method to fuse posterior distributions learned from heterogeneous datasets. Our algorithm relies on a mean field… 
Highly Cited
2019
In this paper, in order to improve the Student's t-matching accuracy, a novel Kullback-Leibler divergence (KLD) minimization… 
2019
The Kullback–Leibler divergence (KLD) between two multivariate generalized Gaussian distributions (MGGDs) is a fundamental tool… 
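The snippet above concerns KL divergence between multivariate generalized Gaussian distributions, which generally lacks a simple closed form. For intuition, the special case of two univariate Gaussians does have a well-known closed form, sketched below; the function name `kl_gaussian` is illustrative and not taken from the cited paper.

```python
import math

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """Closed-form KL divergence D(N1 || N2) between univariate
    Gaussians N(mu1, sigma1^2) and N(mu2, sigma2^2):

        log(s2/s1) + (s1^2 + (mu1 - mu2)^2) / (2 s2^2) - 1/2
    """
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

# Identical distributions diverge by 0; shifting the mean by one
# standard deviation (with equal variances) gives 1/2 nat.
print(kl_gaussian(0.0, 1.0, 0.0, 1.0))
print(kl_gaussian(0.0, 1.0, 1.0, 1.0))
```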
2018
Complex simulator-based models usually have intractable likelihood functions, rendering the likelihood-based inference methods… 
Highly Cited
2017
Two geometrical structures have been extensively studied for a manifold of probability distributions. One is based on the Fisher… 
Highly Cited
2015
This paper deals with the problem of fault detection and diagnosis of induction motor based on motor current signature analysis… 
Highly Cited
2014
The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies the proximity of two… 
Highly Cited
2010
We consider model-based reinforcement learning in finite Markov Decision Processes (MDPs), focussing on so-called optimistic… 
Highly Cited
2008
Clustering is a useful technique that organizes a large quantity of unordered text documents into a small number of meaningful… 
Highly Cited
2005
This paper presents a unifying view of message-passing algorithms, as methods to approximate a complex Bayesian network by a…