
Differential entropy

Known as: Continuous entropy 
Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Shannon to extend the… 
Wikipedia
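Shannon's continuous analogue of entropy, referenced above, is h(f) = -∫ f(x) ln f(x) dx. A minimal sketch of that definition (the Gaussian example, the integration bounds, and all function names here are illustrative choices, not taken from any of the listed papers):

```python
import math

def gaussian_pdf(x, sigma):
    # Density of a zero-mean normal distribution with standard deviation sigma.
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def differential_entropy_numeric(pdf, lo, hi, n=100_000):
    # Midpoint-rule approximation of h(f) = -integral of f(x) * ln f(x) dx.
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        p = pdf(x)
        if p > 0.0:
            total -= p * math.log(p) * dx
    return total

sigma = 2.0
# Known closed form for a Gaussian: h = 0.5 * ln(2 * pi * e * sigma^2).
closed_form = 0.5 * math.log(2 * math.pi * math.e * sigma * sigma)
numeric = differential_entropy_numeric(lambda x: gaussian_pdf(x, sigma), -40.0, 40.0)
print(closed_form, numeric)
```

The numerical estimate should agree with the closed form to several decimal places, since the bounds ±40 cover 20 standard deviations of the chosen density.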

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2018
Estimating how uncertain an AI system is in its predictions is important to improve the safety of such systems. Uncertainty in… 
Review
2014
Quantifying and assessing changes in biological diversity are central aspects of many ecological studies, yet accurate methods of… 
Highly Cited
2013
EEG-based emotion recognition has been studied for a long time. In this paper, a new effective EEG feature named differential… 
Highly Cited
2008
For many practical probability density representations such as for the widely used Gaussian mixture densities, an analytic… 
Highly Cited
2003
We present a conceptually simple method for hierarchical clustering of data called mutual information clustering (MIC) algorithm… 
Review
1997
We assume that H(f) is well-defined and is finite. The concept of differential entropy was introduced in Shannon’s original paper… 
Highly Cited
1997
We derive a first-order approximation of the density of maximum entropy for a continuous 1-D random variable, given a number of… 
Review
1996
Review
1991
Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the… 
Highly Cited
1978
A table is given of differential entropies for various continuous probability distributions. The formulas, some of which are new…
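The 1978 paper above tabulates closed-form differential entropies for standard continuous distributions. A few well-known entries (in nats) can be reproduced directly; the function and parameter names below are my own, not the paper's notation:

```python
import math

def entropy_uniform(a, b):
    # Uniform on [a, b]: h = ln(b - a). Note this is negative when b - a < 1,
    # a standard way differential entropy differs from discrete entropy.
    return math.log(b - a)

def entropy_exponential(rate):
    # Exponential with rate parameter lambda: h = 1 - ln(lambda).
    return 1.0 - math.log(rate)

def entropy_gaussian(sigma):
    # Normal with standard deviation sigma: h = 0.5 * ln(2 * pi * e * sigma^2).
    return 0.5 * math.log(2 * math.pi * math.e * sigma * sigma)

print(entropy_uniform(0.0, 1.0))    # ln(1) = 0.0 exactly
print(entropy_exponential(1.0))
print(entropy_gaussian(1.0))
```

For instance, the standard normal gives 0.5 · ln(2πe) ≈ 1.4189 nats, while a uniform distribution on an interval shorter than 1 has negative differential entropy.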