Kullback–Leibler divergence
Known as: KL divergence, KL distance, Kullback divergence
In probability theory and information theory, the Kullback–Leibler divergence, also called discrimination information (the name preferred by Kullback…
(Source: Wikipedia)
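For discrete distributions P and Q over the same support, the divergence is D_KL(P ∥ Q) = Σ_x P(x) log(P(x)/Q(x)). A minimal sketch in plain Python (the example distributions are illustrative, not taken from any of the papers listed below):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as equal-length
    lists of probabilities. Terms with p_i == 0 contribute 0 by the
    convention 0 * log(0/q) = 0; q_i must be > 0 wherever p_i > 0."""
    total = 0.0
    for p_i, q_i in zip(p, q):
        if p_i > 0:
            total += p_i * math.log(p_i / q_i)
    return total

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # non-negative; 0 only when p == q
print(kl_divergence(p, p))  # 0.0
```

Note that D_KL is not symmetric: kl_divergence(p, q) and kl_divergence(q, p) generally differ, which is why it is a divergence rather than a metric (despite the "KL distance" alias above).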
Related topics (50 relations)
Autoencoder
Bayesian information criterion
Biclustering
Boltzmann machine
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
2020
Model Fusion with Kullback-Leibler Divergence
Sebastian Claici, M. Yurochkin, S. Ghosh, J. Solomon
International Conference on Machine Learning, 2020. Corpus ID: 220496729
We propose a method to fuse posterior distributions learned from heterogeneous datasets. Our algorithm relies on a mean field…
Highly Cited, 2019
A Novel Kullback–Leibler Divergence Minimization-Based Adaptive Student's t-Filter
Yulong Huang, Yonggang Zhang, J. Chambers
IEEE Transactions on Signal Processing, 2019. Corpus ID: 203046124
In this paper, in order to improve the Student's t-matching accuracy, a novel Kullback-Leibler divergence (KLD) minimization…
2019
Kullback–Leibler Divergence Between Multivariate Generalized Gaussian Distributions
N. Bouhlel, Ali Dziri
IEEE Signal Processing Letters, 2019. Corpus ID: 165090190
The Kullback–Leibler divergence (KLD) between two multivariate generalized Gaussian distributions (MGGDs) is a fundamental tool…
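Closed-form KLD expressions like the one this paper studies also exist in simpler settings. For two univariate Gaussians N(μ₁, σ₁²) and N(μ₂, σ₂²), the well-known closed form is D_KL = log(σ₂/σ₁) + (σ₁² + (μ₁−μ₂)²)/(2σ₂²) − 1/2. A sketch of that special case (not the paper's MGGD result):

```python
import math

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """Closed-form D_KL(N(mu1, sigma1^2) || N(mu2, sigma2^2))
    for univariate Gaussians; requires sigma1, sigma2 > 0."""
    return (math.log(sigma2 / sigma1)
            + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
            - 0.5)

print(kl_gaussian(0.0, 1.0, 0.0, 1.0))  # 0.0: identical distributions
print(kl_gaussian(0.0, 1.0, 1.0, 1.0))  # 0.5: unit mean shift at unit variance
```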
2018
Approximate Bayesian Computation with Kullback-Leibler Divergence as Data Discrepancy
Bai Jiang
International Conference on Artificial…, 2018. Corpus ID: 4899389
Complex simulator-based models usually have intractable likelihood functions, rendering the likelihood-based inference methods…
Highly Cited, 2017
Information geometry connecting Wasserstein distance and Kullback–Leibler divergence via the entropy-relaxed transportation problem
S. Amari, Ryo Karakida, Masafumi Oizumi
Information Geometry, 2017. Corpus ID: 22432345
Two geometrical structures have been extensively studied for a manifold of probability distributions. One is based on the Fisher…
Highly Cited, 2015
Electric Motor Fault Detection and Diagnosis by Kernel Density Estimation and Kullback–Leibler Divergence Based on Stator Current Measurements
A. Giantomassi, F. Ferracuti, S. Iarlori, G. Ippoliti, S. Longhi
IEEE Transactions on Industrial Electronics…, 2015. Corpus ID: 206705353
This paper deals with the problem of fault detection and diagnosis of induction motor based on motor current signature analysis…
Highly Cited, 2014
Notes on Kullback-Leibler Divergence and Likelihood
Jonathon Shlens
arXiv.org, 2014. Corpus ID: 14553804
The Kullback-Leibler (KL) divergence is a fundamental equation of information theory that quantifies the proximity of two…
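The standard link between KL divergence and likelihood is the decomposition of cross-entropy: for data distributed as p and a model q, H(p, q) = H(p) + D_KL(p ∥ q), so maximizing expected log-likelihood is equivalent to minimizing the divergence from p to q. A numerical sketch of that identity (the distributions are illustrative, not drawn from the paper):

```python
import math

p = [0.5, 0.3, 0.2]  # "true" data distribution (illustrative)
q = [0.4, 0.4, 0.2]  # model distribution (illustrative)

entropy = -sum(pi * math.log(pi) for pi in p)                      # H(p)
cross_entropy = -sum(pi * math.log(qi) for pi, qi in zip(p, q))    # H(p, q)
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))           # D_KL(p || q)

# Cross-entropy decomposes as H(p) + D_KL(p || q); since H(p) is fixed,
# minimizing cross-entropy (maximizing likelihood) minimizes the KL term.
print(abs(cross_entropy - (entropy + kl)) < 1e-12)  # True
```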
Highly Cited, 2010
Optimism in reinforcement learning and Kullback-Leibler divergence
S. Filippi, O. Cappé, Aurélien Garivier
Allerton Conference on Communication, Control…, 2010. Corpus ID: 3579832
We consider model-based reinforcement learning in finite Markov Decision Processes (MDPs), focussing on so-called optimistic…
Highly Cited, 2008
Similarity Measures for Text Document Clustering
A. Huang
2008. Corpus ID: 8616082
Clustering is a useful technique that organizes a large quantity of unordered text documents into a small number of meaningful…
Highly Cited, 2005
Divergence measures and message passing
T. Minka
2005. Corpus ID: 7585417
This paper presents a unifying view of message-passing algorithms, as methods to approximate a complex Bayesian network by a…