Entropy (information theory)

Known as: Shannon's entropy, Weighted entropy, Informational entropy 
In information theory, systems are modeled by a transmitter, channel, and receiver. The transmitter produces messages that are sent through the… 
Source: Wikipedia
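
The snippet above cuts off before stating the quantity itself. As a minimal illustrative sketch (the function name shannon_entropy is ours, not from the source), Shannon entropy of a symbol sequence is H(X) = -Σ p(x) log₂ p(x), in bits per symbol:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy H(X) = -sum p(x) * log2 p(x), in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A uniform 4-symbol alphabet carries 2 bits per symbol.
print(shannon_entropy("abcd"))  # 2.0
print(shannon_entropy("aaab"))  # ~0.811 (skewed distribution, lower entropy)
```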

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2017
We present a new approach to learn compressible representations in deep architectures with an end-to-end training strategy. Our… 
Highly Cited
2009
Information plays an important role in our understanding of the physical world. Hence we propose an entropic measure of… 
Highly Cited
2007
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information… 
Highly Cited
2006
The performance of ad hoc networks depends on cooperation and trust among distributed nodes. To enhance security in ad hoc… 
Review
2001
This paper presents a survey of different types of fuzzy information measures. A number of schemes have been proposed to… 
Highly Cited
2001
Energy, entropy and exergy concepts come from thermodynamics and are applicable to all fields of science and engineering… 
Review
1989
The definition of Shannon's entropy in the context of information theory is critically examined and some of its applications to… 
Highly Cited
1986