
Kullback–Leibler divergence

Known as: KL-divergence, KL-distance, Kullback divergence
In probability theory and information theory, the Kullback–Leibler divergence, also called discrimination information (the name preferred by Kullback… 
Wikipedia
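As a concrete illustration of the definition above, the discrete KL divergence D(P‖Q) = Σᵢ pᵢ log(pᵢ/qᵢ) can be computed directly. The sketch below is an illustrative implementation (the function name and example distributions are our own, not from any paper listed here); it follows the convention that terms with pᵢ = 0 contribute zero.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D(P || Q) = sum_i p_i * log(p_i / q_i).

    Assumes p and q are probability vectors over the same support,
    with q_i > 0 wherever p_i > 0; terms with p_i == 0 contribute 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # 0 * log(0/q) is taken as 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: a fair coin vs. a heavily biased coin.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))
print(kl_divergence(q, p))
```

Note that D(P‖Q) ≠ D(Q‖P) in general: the KL divergence is not symmetric and does not satisfy the triangle inequality, which is why it is a "divergence" rather than a true distance.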

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2020
We propose a method to fuse posterior distributions learned from heterogeneous datasets. Our algorithm relies on a mean field… 
2019
The Kullback–Leibler divergence (KLD) between two multivariate generalized Gaussian distributions (MGGDs) is a fundamental tool… 
2019
This paper addresses the linear and log-linear fusion approaches to multitarget density fusion which yield arithmetic average (AA… 
2018
Complex simulator-based models usually have intractable likelihood functions, rendering the likelihood-based inference methods… 
Highly Cited
2017
Two geometrical structures have been extensively studied for a manifold of probability distributions. One is based on the Fisher… 
Highly Cited
2015
This paper deals with the problem of fault detection and diagnosis of induction motor based on motor current signature analysis… 
Highly Cited
2013
In a variety of applications it is important to extract information from a probability measure $\mu$ on an infinite dimensional… 
Highly Cited
2010
We consider model-based reinforcement learning in finite Markov Decision Processes (MDPs), focussing on so-called optimistic… 
Highly Cited
2007
When there are a number of stochastic models in the form of probability distributions, one needs to integrate them. Mixtures of… 
Highly Cited
2007
Partial occlusion is one of the key issues in the face recognition community. To resolve the problem of partial occlusion…