
Entropy power inequality

Known as: EPI 
In mathematics, the entropy power inequality is a result in information theory that relates to the so-called "entropy power" of random variables. It…
Source: Wikipedia
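
For reference, the classical statement behind this topic reads as follows (the standard form, stated from general knowledge rather than quoted from any paper below): for independent random vectors $X$ and $Y$ in $\mathbb{R}^n$ with densities,

  N(X+Y) \;\ge\; N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},

where $h$ denotes differential entropy; equality holds if and only if $X$ and $Y$ are Gaussian with proportional covariance matrices.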

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2018
When one of the random summands is Gaussian, we sharpen the entropy power inequality (EPI) in terms of the strong data processing… 
2018
We establish quantitative stability results for the entropy power inequality (EPI). Specifically, we show that if uniformly log… 
2017
The contribution of this paper is twofold. In the first part, we present a refinement of the Rényi Entropy Power Inequality (EPI) recently obtained…
2016
An extension of the entropy power inequality to the form $N_r^\alpha(X+Y) \geq N_…$
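Results in this family typically take the α-modified shape below; this is a reconstruction of the general form for context, not the paper's exact statement or exponent conditions:

  N_r^\alpha(X+Y) \;\ge\; N_r^\alpha(X) + N_r^\alpha(Y),

where $N_r$ denotes the Rényi entropy power of order $r$ and $\alpha > 0$ is a suitable exponent depending on $r$.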
2015
Shannon’s entropy power inequality (EPI) can be viewed as a statement of concavity of an entropic function of a continuous random… 
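The concavity referred to is presumably Costa's classical result, stated here for context rather than quoted from the paper: with $Z$ a standard Gaussian independent of $X$,

  t \mapsto N(X + \sqrt{t}\, Z) \text{ is concave on } t \ge 0,

which strengthens the EPI along the heat semigroup.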
2015
We analyse an analog of the entropy-power inequality for the weighted entropy. In particular, we discuss connections with… 
Highly Cited
2015
The classical entropy power inequality is extended to the Rényi entropy. We also discuss the question of the existence of the… 
Highly Cited
2013
A lower bound on the Rényi differential entropy of a sum of independent random vectors is demonstrated in terms of… 
Highly Cited
2012
When two independent analog signals, X and Y, are added together giving Z = X + Y, the entropy of Z, H(Z), is not a simple function of…
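As a minimal numerical sketch of this point (an illustration, not code from the paper): for independent Gaussians the entropy power of a variable equals its variance, so the EPI holds with equality. The helper names gaussian_entropy and entropy_power below are hypothetical.

import numpy as np

# Entropy power of an n-dimensional random vector X with differential
# entropy h(X) in nats:  N(X) = exp(2 h(X) / n) / (2 pi e).
# For a univariate Gaussian, N(X) equals its variance, so the EPI
# N(X+Y) >= N(X) + N(Y) holds with equality.

def gaussian_entropy(v):
    """Differential entropy (nats) of a univariate N(0, v)."""
    return 0.5 * np.log(2.0 * np.pi * np.e * v)

def entropy_power(h, n=1):
    """Entropy power of an n-dimensional variable with entropy h nats."""
    return np.exp(2.0 * h / n) / (2.0 * np.pi * np.e)

vx, vy = 1.5, 2.3                      # variances of independent X and Y
hz = gaussian_entropy(vx + vy)         # X + Y ~ N(0, vx + vy)

lhs = entropy_power(hz)                                    # N(X + Y)
rhs = entropy_power(gaussian_entropy(vx)) + \
      entropy_power(gaussian_entropy(vy))                  # N(X) + N(Y)

print(f"N(X+Y) = {lhs:.6f}, N(X) + N(Y) = {rhs:.6f}")      # equal for Gaussians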
Highly Cited
2007
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information…