
Entropy (information theory)

Known as: Shannon's entropy, weighted entropy, informational entropy
In information theory, systems are modeled by a transmitter, channel, and receiver. The transmitter produces messages that are sent through the… (Wikipedia)
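As context for the papers below: the Shannon entropy of a discrete distribution p is H(p) = −Σᵢ pᵢ log₂ pᵢ. A minimal Python sketch (the function name is illustrative, not taken from any paper listed here):

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H(p) = -sum(p_i * log_base(p_i)); bits for base 2.

    Zero-probability outcomes contribute nothing (0 * log 0 := 0).
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
```

Entropy is maximized by the uniform distribution (log₂ n bits over n outcomes) and is zero for a deterministic outcome.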

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2007
  • X. Qu
  • Technometrics
  • 2007
  • Corpus ID: 35520774
Chapter 3 deals with probability distributions, discrete and continuous densities, distribution functions, bivariate…
Highly Cited
2004
Rough set theory is a relatively new mathematical tool for use in computer applications in circumstances which are characterized…
Review
1997
We assume that H(f) is well-defined and is finite. The concept of differential entropy was introduced in Shannon's original paper…
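Differential entropy replaces the sum with an integral, h(f) = −∫ f ln f; for a Gaussian with variance σ² the standard closed form is ½ ln(2πeσ²) nats. A sketch of that textbook formula (not code from the reviewed paper):

```python
import math

def gaussian_differential_entropy(sigma):
    """h(f) = 0.5 * ln(2 * pi * e * sigma^2) nats for N(mu, sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# Unlike discrete entropy, differential entropy can be negative
# for densities concentrated on a narrow range.
print(gaussian_differential_entropy(0.1) < 0)  # → True
```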
Highly Cited
1994
In statistical physics, useful notions of entropy are defined with respect to some coarse-graining procedure over a…
Highly Cited
1992
Adapted waveform analysis uses a library of orthonormal bases and an efficiency functional to match a basis to a given signal or…
  • Figures 2, 3, 5–7
Review
1991
Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the…
Highly Cited
1991
  • J. Lin
  • IEEE Trans. Inf. Theory
  • 1991
  • Corpus ID: 12121632
A novel class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known…
  • Figures 1–2
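Lin's class of entropy-based divergences is commonly associated with what is now called the Jensen–Shannon divergence: the entropy of the mixture minus the mean of the entropies. A sketch, assuming base-2 logarithms and equal-length distributions:

```python
import math

def _entropy(p):
    """Shannon entropy in bits, skipping zero-probability terms."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def jensen_shannon_divergence(p, q):
    """JSD(p, q) = H((p + q) / 2) - (H(p) + H(q)) / 2.

    Symmetric and bounded by 1 bit, unlike the Kullback-Leibler divergence.
    """
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return _entropy(m) - (_entropy(p) + _entropy(q)) / 2

# Disjoint distributions reach the 1-bit upper bound.
print(jensen_shannon_divergence([1.0, 0.0], [0.0, 1.0]))  # → 1.0
```

Because it stays finite even when the supports of p and q differ, the Jensen–Shannon divergence is often preferred over KL divergence for comparing empirical distributions.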
Highly Cited
1986
  • B. Kosko
  • Inf. Sci.
  • 1986
  • Corpus ID: 205003915
A new nonprobabilistic entropy measure is introduced in the context of fuzzy sets or messages. Fuzzy units, or fits…
Highly Cited
1980
Jaynes's principle of maximum entropy and Kullback's principle of minimum cross-entropy (minimum directed divergence) are shown to…
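Kullback's directed divergence, D(p‖q) = Σᵢ pᵢ log(pᵢ/qᵢ), is the quantity minimized under the cross-entropy principle. A minimal sketch (base 2; names illustrative, assuming qᵢ > 0 wherever pᵢ > 0):

```python
import math

def kl_divergence(p, q):
    """Directed divergence D(p || q) = sum p_i * log2(p_i / q_i).

    Asymmetric in p and q; equals zero exactly when p == q.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Identical distributions have zero divergence.
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))  # → 0.0
```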
Review
1977
1. Entropy and mutual information 2. Discrete memoryless channels and their capacity-cost functions 3. Discrete memoryless…