Entropy power inequality
Known as: EPI
In mathematics, the entropy power inequality is a result in information theory that relates the so-called "entropy power" of independent random variables. It…
(Source: Wikipedia)
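As a concrete illustration (not taken from any of the listed papers), the scalar EPI states N(X+Y) ≥ N(X) + N(Y) for independent X and Y, where N(X) = e^{2h(X)}/(2πe) and h is differential entropy. The sketch below assumes this standard definition and checks two textbook cases: Gaussians (equality, since N equals the variance and variances of independent summands add) and standard uniforms (strict inequality, using the closed-form entropy h = 1/2 of the triangular density of the sum).

```python
import math

def entropy_power(h, n=1):
    """Entropy power N(X) = exp(2h/n) / (2*pi*e) of an n-dimensional random vector."""
    return math.exp(2.0 * h / n) / (2.0 * math.pi * math.e)

def gaussian_entropy(var):
    """Differential entropy of a scalar Gaussian: h = 0.5*ln(2*pi*e*var)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

# Gaussian case: N(X) = var, and variances of independent summands add,
# so the EPI holds with equality.
var_x, var_y = 2.0, 3.0
lhs = entropy_power(gaussian_entropy(var_x + var_y))               # N(X+Y)
rhs = entropy_power(gaussian_entropy(var_x)) + entropy_power(gaussian_entropy(var_y))
print(abs(lhs - rhs) < 1e-12)   # True: equality for Gaussians

# Uniform case: X, Y ~ Uniform[0,1] have h = 0; X+Y has the triangular
# density on [0,2], whose differential entropy is exactly 1/2.
lhs_u = entropy_power(0.5)      # N(X+Y) = 1/(2*pi)
rhs_u = 2 * entropy_power(0.0)  # N(X) + N(Y) = 1/(pi*e)
print(lhs_u > rhs_u)            # True: strict inequality for non-Gaussians
```

The Gaussian equality case is what makes the Gaussian extremal in the EPI; any non-Gaussian pair with the same marginal entropies, like the uniforms above, gives a strictly larger entropy power for the sum.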
Related topics (11 relations)
A Mathematical Theory of Communication
Conditional entropy
Differential entropy
Entropy (information theory)
…

Broader (1)
Information theory
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
A Strong Entropy Power Inequality
T. Courtade. IEEE Transactions on Information Theory, 2018. Corpus ID: 3988506
When one of the random summands is Gaussian, we sharpen the entropy power inequality (EPI) in terms of the strong data processing…
Quantitative Stability of the Entropy Power Inequality
T. Courtade, M. Fathi, A. Pananjady. IEEE Transactions on Information Theory, 2018. Corpus ID: 49721623
We establish quantitative stability results for the entropy power inequality (EPI). Specifically, we show that if uniformly log…
Rényi entropy power inequality and a reverse
Jiange Li. arXiv.org, 2017. Corpus ID: 7851932
This paper is twofold. In the first part, we present a refinement of the Rényi Entropy Power Inequality (EPI) recently obtained…
Variants of the Entropy Power Inequality
S. Bobkov, Arnaud Marsiglietti. IEEE Transactions on Information Theory, 2016. Corpus ID: 585318
An extension of the entropy power inequality to the form $N_{r}^{\alpha}(X+Y) \geq N_…$
Entropy power inequalities for qudits
K. Audenaert, N. Datta, M. Ozols. arXiv.org, 2015. Corpus ID: 14203863
Shannon's entropy power inequality (EPI) can be viewed as a statement of concavity of an entropic function of a continuous random…
Entropy-power inequality for weighted entropy
Y. Suhov, S. Y. Sekeh. arXiv.org, 2015. Corpus ID: 14221303
We analyse an analog of the entropy-power inequality for the weighted entropy. In particular, we discuss connections with…
Highly Cited
Entropy Power Inequality for the Rényi Entropy
S. Bobkov, G. Chistyakov. IEEE Transactions on Information Theory, 2015. Corpus ID: 1266063
The classical entropy power inequality is extended to the Rényi entropy. We also discuss the question of the existence of the…
Highly Cited
Beyond the Entropy Power Inequality, via Rearrangements
Liyao Wang, M. Madiman. IEEE Transactions on Information Theory, 2013. Corpus ID: 8487940
A lower bound on the Rényi differential entropy of a sum of independent random vectors is demonstrated in terms of…
Highly Cited
The Entropy Power Inequality for Quantum Systems
R. König, Graeme Smith. IEEE Transactions on Information Theory, 2012. Corpus ID: 7081072
When two independent analog signals X and Y are added together, giving Z = X + Y, the entropy of Z, H(Z), is not a simple function of…
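The point in the abstract above, that the entropy of a sum is not determined by the entropies of the summands alone, can be checked with a small example (my own, not from the paper, and assuming the standard closed-form differential entropies): two input pairs with identical marginal entropies h(X) = h(Y) = 0 yield different entropies for X + Y.

```python
import math

# Pair 1: X, Y ~ Uniform[0,1], each with h = ln(1) = 0.
# Their sum has the triangular density on [0,2], with h(X+Y) = 1/2 exactly.
h_sum_uniform = 0.5

# Pair 2: X, Y Gaussian with variance 1/(2*pi*e), chosen so h = 0 as well.
# The sum is Gaussian with doubled variance, so h(X+Y) = 0.5*ln(2).
var_g = 1.0 / (2 * math.pi * math.e)
h_sum_gauss = 0.5 * math.log(2 * math.pi * math.e * 2 * var_g)

print(round(h_sum_gauss, 4))          # 0.3466
print(h_sum_uniform > h_sum_gauss)    # True: same inputs entropies, different sums
```

The Gaussian pair gives the smaller sum entropy, consistent with the EPI's characterization of Gaussians as extremal.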
Highly Cited
Information Theoretic Proofs of Entropy Power Inequalities
O. Rioul. IEEE Transactions on Information Theory, 2007. Corpus ID: 2764882
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information…