Entropy power inequality
Known as: EPI
In mathematics, the entropy power inequality is a result in information theory that relates the so-called "entropy power" of random variables. It…
(Source: Wikipedia)
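For reference, a standard statement of the inequality (the usual formulation; h denotes differential entropy): for independent random vectors X and Y in R^n with densities,

\[
  N(X+Y) \;\geq\; N(X) + N(Y),
  \qquad
  N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
\]

with equality if and only if X and Y are Gaussian with proportional covariance matrices.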
Related topics (11 relations)
A Mathematical Theory of Communication
Conditional entropy
Differential entropy
Entropy (information theory)
…
Broader (1): Information theory
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
A Strong Entropy Power Inequality
T. Courtade. IEEE Transactions on Information Theory, 2018. Corpus ID: 3988506
When one of the random summands is Gaussian, we sharpen the entropy power inequality (EPI) in terms of the strong data processing…
Variants of the Entropy Power Inequality
S. Bobkov, Arnaud Marsiglietti. IEEE Transactions on Information Theory, 2017. Corpus ID: 585318
An extension of the entropy power inequality to the form N_r^α(X+Y) ≥ N_…
[Review] Forward and Reverse Entropy Power Inequalities in Convex Geometry
M. Madiman, J. Melbourne, Peng Xu. ArXiv, 2016. Corpus ID: 1825252
The entropy power inequality, which plays a fundamental role in information theory and probability, may be seen as an analogue of…
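The analogue alluded to in the truncated abstract is, in the standard telling, the Brunn-Minkowski inequality of convex geometry; with |·| denoting volume in R^n and A + B the Minkowski sum, the parallel reads

\[
  |A + B|^{1/n} \;\geq\; |A|^{1/n} + |B|^{1/n}
  \qquad\longleftrightarrow\qquad
  N(X + Y) \;\geq\; N(X) + N(Y).
\]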
On Rényi Entropy Power Inequalities
Eshed Ram, I. Sason. IEEE Transactions on Information Theory, 2016. Corpus ID: 206985119
This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum S_n = Σ_{k=1}^n…
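For context, the Rényi differential entropy of order r that underlies these R-EPIs is standardly defined by

\[
  h_r(X) \;=\; \frac{1}{1-r}\,\log \int_{\mathbb{R}^n} f_X(x)^r \, dx,
  \qquad r > 0,\ r \neq 1,
\]

which recovers the usual differential entropy as r → 1. The normalization of the associated Rényi entropy power N_r varies across papers; one common convention is N_r(X) = e^{2 h_r(X)/n}.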
[Highly Cited] Beyond the Entropy Power Inequality, via Rearrangements
Liyao Wang, M. Madiman. IEEE Transactions on Information Theory, 2013. Corpus ID: 8487940
A lower bound on the Rényi differential entropy of a sum of independent random vectors is demonstrated in terms of…
[Highly Cited] Information Theoretic Proofs of Entropy Power Inequalities
O. Rioul. IEEE Transactions on Information Theory, 2007. Corpus ID: 2764882
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information…
[Highly Cited] A simple proof of the entropy-power inequality
S. Verdú, Dongning Guo. IEEE Transactions on Information Theory, 2006. Corpus ID: 10040346
This correspondence gives a simple proof of Shannon's entropy-power inequality (EPI) using the relationship between mutual…
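The truncated snippet refers to the relationship between mutual information and minimum mean-square error (MMSE); the underlying I-MMSE formula of Guo, Shamai, and Verdú (2005), for Z standard Gaussian independent of X, reads

\[
  \frac{d}{d\,\mathrm{snr}}\, I\bigl(X;\ \sqrt{\mathrm{snr}}\,X + Z\bigr)
  \;=\; \frac{1}{2}\,\mathrm{mmse}(X, \mathrm{snr}).
\]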
[Review] Information theoretic inequalities
A. Dembo, T. Cover, Joy A. Thomas. IEEE Transactions on Information Theory, 1991. Corpus ID: 845669
The role of inequalities in information theory is reviewed, and the relationship of these inequalities to inequalities in other…
[Review] Elements of Information Theory
T. Cover, Joy A. Thomas. 1991. Corpus ID: 190432
Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the…
[Highly Cited] A new entropy power inequality
M. H. M. Costa. IEEE Transactions on Information Theory, 1985. Corpus ID: 40175871
A strengthened version of Shannon's entropy power inequality for the case where one of the random vectors involved is Gaussian is…
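Costa's strengthening is equivalently stated as concavity of the entropy power under the addition of Gaussian noise: for Z standard Gaussian independent of X,

\[
  t \;\longmapsto\; N\bigl(X + \sqrt{t}\,Z\bigr)
  \quad \text{is concave on } t \geq 0 .
\]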