Entropy (information theory)
Known as: Shannon's entropy, Weighted entropy, Informational entropy
In information theory, systems are modeled by a transmitter, channel, and receiver. The transmitter produces messages that are sent through the…
Source: Wikipedia
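For a discrete source, Shannon entropy H(X) = -Σ p(x) log₂ p(x) measures the average information per message produced by the transmitter. A minimal sketch (not from any of the listed papers), assuming the message distribution is given as a list of probabilities:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.469
```

Entropy is maximized by the uniform distribution and drops to zero for a deterministic source.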
Related topics (50 relations)
A Mathematical Theory of Communication
Anti-information
Complex network
Cross entropy
Broader (1)
Information theory
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited · 2007
Multivariate Data Analysis
X. Qu. Technometrics, 2007. Corpus ID: 35520774
Chapter 3 deals with probability distributions, discrete and continuous densities, distribution functions, bivariate…
Highly Cited · 2006
Information entropy, rough entropy and knowledge granulation in incomplete information systems
Jiye Liang, Z. Shi, Deyu Li, M. J. Wierman. International Journal of General Systems, 2006. Corpus ID: 15943419
Rough set theory is a relatively new mathematical tool for use in computer applications in circumstances that are characterized…
Highly Cited · 2004
The Information Entropy, Rough Entropy And Knowledge Granulation In Rough Set Theory
Jiye Liang, Zhongzhi Shi. Int. J. Uncertain. Fuzziness Knowl. Based Syst., 2004. Corpus ID: 44637939
Rough set theory is a relatively new mathematical tool for use in computer applications in circumstances which are characterized…
Highly Cited · 2002
Decision Table Reduction based on Conditional Information Entropy
Wang Guo. 2002. Corpus ID: 123732606
This paper analyzes the information view of rough set theory and compares it with the algebra view of rough set theory. Some…
Review · 1997
Nonparametric entropy estimation. An overview
J. Beirlant, E. Dudewicz, L. Györfi, I. Denes. 1997. Corpus ID: 5722994
We assume that H(f) is well-defined and is finite. The concept of differential entropy was introduced in Shannon's original paper…
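The survey above concerns estimating the differential entropy H(f) = -∫ f log f of an unknown density from i.i.d. samples. A minimal histogram plug-in sketch, one of the classical estimator families such surveys cover (illustrative only, not the paper's method):

```python
import math
import random

def histogram_entropy(samples, bins=30):
    """Plug-in estimate of differential entropy H(f) = -∫ f log f (in nats)
    using a histogram density estimate over the sample range."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in samples:
        i = min(int((x - lo) / width), bins - 1)  # clamp the max sample into the last bin
        counts[i] += 1
    n = len(samples)
    # H ≈ -sum over occupied bins of p_k * log(p_k / width)
    return -sum((c / n) * math.log(c / (n * width)) for c in counts if c)

random.seed(0)
data = [random.gauss(0, 1) for _ in range(10000)]
# True differential entropy of N(0, 1) is 0.5 * log(2 * pi * e) ≈ 1.419 nats
print(histogram_entropy(data))
```

Plug-in estimators like this are consistent but biased for finite samples; the bin count trades bias against variance.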
Highly Cited · 1992
Entropy-based algorithms for best basis selection
R. Coifman, M. Wickerhauser. IEEE Transactions on Information Theory, 1992. Corpus ID: 546882
Adapted waveform analysis uses a library of orthonormal bases and an efficiency functional to match a basis to a given signal or…
Review · 1991
Elements of Information Theory
T. Cover, Joy A. Thomas. 1991. Corpus ID: 190432
Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the…
Highly Cited · 1991
Divergence measures based on the Shannon entropy
Jianhua Lin. IEEE Transactions on Information Theory, 1991. Corpus ID: 12121632
A novel class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known…
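The best-known member of Lin's class is what is now called the Jensen-Shannon divergence: the entropy of the mixture minus the mean of the entropies. A minimal sketch, assuming two distributions on the same finite support:

```python
import math

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def jensen_shannon(p, q):
    """JS divergence in bits: H((P+Q)/2) - (H(P) + H(Q)) / 2.
    Symmetric and always finite, unlike the Kullback-Leibler divergence."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return entropy(m) - (entropy(p) + entropy(q)) / 2

print(jensen_shannon([1.0, 0.0], [0.0, 1.0]))  # 1.0 (disjoint supports, maximal)
print(jensen_shannon([0.5, 0.5], [0.5, 0.5]))  # 0.0 (identical distributions)
```

Finiteness even on disjoint supports is the practical advantage over KL divergence, which is infinite whenever q is zero where p is not.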
Highly Cited · 1980
Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy
J. Shore, Rodney W. Johnson. IEEE Transactions on Information Theory, 1980. Corpus ID: 6404642
Jaynes's principle of maximum entropy and Kullback's principle of minimum cross-entropy (minimum directed divergence) are shown to…
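The quantity minimized under Kullback's principle is the directed divergence D(P‖Q) = Σ p log(p/q) from an updated distribution P to a prior Q. A minimal sketch of the quantity itself, with hypothetical example distributions (the axiomatic derivation in the paper is not reproduced here):

```python
import math

def kl_divergence(p, q):
    """Directed (Kullback-Leibler) divergence D(P||Q) = sum p * log2(p/q), in bits.
    Requires q > 0 wherever p > 0."""
    return sum(a * math.log2(a / b) for a, b in zip(p, q) if a > 0)

prior = [0.25, 0.25, 0.25, 0.25]   # uniform prior Q
posterior = [0.4, 0.3, 0.2, 0.1]   # hypothetical constrained update P
print(kl_divergence(posterior, prior))
```

D(P‖Q) is zero exactly when P = Q, so minimizing it selects the distribution satisfying the new constraints that departs least from the prior.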
Review · 1979
Theory of Information and Coding
R. McEliece. 1979. Corpus ID: 60000416
1. Entropy and mutual information 2. Discrete memoryless channels and their capacity-cost functions 3. Discrete memoryless…