
Entropy (information theory)

Known as: Shannon's entropy, Weighted entropy, Informational entropy 
In information theory, systems are modeled by a transmitter, channel, and receiver. The transmitter produces messages that are sent through the… 
Wikipedia
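As a minimal sketch of the quantity this topic covers (not drawn from any paper below), Shannon entropy measures the average uncertainty of a discrete source in bits, H(X) = -Σ p(x) log₂ p(x):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution.

    Zero-probability outcomes contribute nothing, by the usual
    convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
# A certain outcome carries none.
print(shannon_entropy([1.0]))       # → -0.0 (i.e. zero)
```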

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2007
  • X. Qu
  • Technometrics
  • 2007
  • Corpus ID: 35520774
Chapter 3 deals with probability distributions, discrete and continuous densities, distribution functions, bivariate… 
Highly Cited
2006
Rough set theory is a relatively new mathematical tool for use in computer applications in circumstances that are characterized… 
Highly Cited
2004
Rough set theory is a relatively new mathematical tool for use in computer applications in circumstances which are characterized… 
Highly Cited
2002
This paper analyzes the information view of rough set theory and compares it with the algebra view of rough set theory. Some… 
Review
1997
We assume that H(f) is well-defined and is finite. The concept of differential entropy was introduced in Shannon’s original paper… 
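Differential entropy, mentioned in this review, replaces the sum over outcomes with an integral over a density. As an illustrative sketch (not taken from the paper), the Gaussian case has the well-known closed form h = ½ ln(2πeσ²) nats:

```python
import math

def gaussian_differential_entropy(sigma):
    """Differential entropy (in nats) of a Gaussian with std. dev. sigma."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# Standard normal: h = 0.5 * ln(2*pi*e)
print(round(gaussian_differential_entropy(1.0), 4))  # → 1.4189
```

Unlike discrete entropy, this quantity can be negative (e.g. for very small sigma), which is one reason the review treats H(f) with care.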
Highly Cited
1992
Adapted waveform analysis uses a library of orthonormal bases and an efficiency functional to match a basis to a given signal or… 
Review
1991
Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the… 
Highly Cited
1991
  • Jianhua Lin
  • IEEE Transactions on Information Theory
  • 1991
  • Corpus ID: 12121632
A novel class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known… 
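The best-known member of the entropy-based divergence class Lin introduces is the Jensen–Shannon divergence. A hedged sketch, assuming base-2 logarithms: JSD(P, Q) = H(M) − (H(P) + H(Q))/2 with M = (P + Q)/2, which is symmetric and bounded by 1 bit:

```python
import math

def entropy_bits(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def jensen_shannon_divergence(p, q):
    """JSD(P, Q) = H((P+Q)/2) - (H(P) + H(Q)) / 2, in bits."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return entropy_bits(m) - (entropy_bits(p) + entropy_bits(q)) / 2

# Identical distributions diverge by 0; disjoint ones by exactly 1 bit.
print(jensen_shannon_divergence([1.0, 0.0], [0.0, 1.0]))  # → 1.0
print(jensen_shannon_divergence([0.5, 0.5], [0.5, 0.5]))  # → 0.0
```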
Highly Cited
1980
Jaynes's principle of maximum entropy and Kullback's principle of minimum cross-entropy (minimum directed divergence) are shown to… 
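The directed divergence minimized under Kullback's principle is the relative entropy (KL divergence). As a small sketch, in bits: D(P‖Q) = Σ p log₂(p/q), which is zero iff P = Q and is asymmetric in its arguments:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(P || Q) in bits; assumes q > 0 wherever p > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

print(round(kl_divergence([0.5, 0.5], [0.25, 0.75]), 4))  # → 0.2075
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))              # → 0.0
```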
Review
1979
1. Entropy and mutual information 2. Discrete memoryless channels and their capacity-cost functions 3. Discrete memoryless…