
Entropy power inequality

Known as: EPI 
In mathematics, the entropy power inequality is a result in information theory that relates to the so-called "entropy power" of random variables. It…
Source: Wikipedia
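
For reference, a standard statement (not part of the page snippet): for an n-dimensional random vector X with a density and differential entropy h(X), the entropy power is

    N(X) = \frac{1}{2\pi e} \, e^{2h(X)/n},

and Shannon's EPI asserts that for independent random vectors X and Y,

    N(X + Y) \ge N(X) + N(Y).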

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2018
When one of the random summands is Gaussian, we sharpen the entropy power inequality (EPI) in terms of the strong data processing…
2016
We tighten the entropy power inequality (EPI) when one of the random summands is Gaussian. Our strengthening is closely related…
2016
This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum S_n = Σ_{k=1}^{n}…
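
As background for the Rényi variants in the entries above and below (a standard definition, not taken from the papers): the Rényi differential entropy of order α (α > 0, α ≠ 1) of a density f is

    h_\alpha(X) = \frac{1}{1-\alpha} \log \int f(x)^\alpha \, dx,

which recovers the usual differential entropy h(X) in the limit α → 1.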
Highly Cited
2015
The classical entropy power inequality is extended to the Rényi entropy. We also discuss the question of the existence of the…
2014
A lower bound on the Rényi differential entropy of a sum of independent random vectors is demonstrated in terms of…
2014
When two independent analog signals X and Y are added together, giving Z = X + Y, the entropy of Z, H(Z), is not a simple function of…
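
A concrete special case, for orientation (standard fact, not quoted from the paper): when both summands are Gaussian the entropy of the sum does have a simple closed form. If X ~ N(0, σ_X²) and Y ~ N(0, σ_Y²) are independent, then

    H(Z) = \frac{1}{2} \log\!\bigl( 2\pi e (\sigma_X^2 + \sigma_Y^2) \bigr),

and the EPI holds with equality; for non-Gaussian summands it supplies only a lower bound on H(Z).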
Highly Cited
2011
  • Olivier Rioul
  • IEEE Transactions on Information Theory
  • 2011
  • Corpus ID: 2764882
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information…
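
In the scalar case the EPI admits an equivalent exponential form, the version many proofs manipulate (a standard reformulation, stated here as background):

    e^{2h(X+Y)} \ge e^{2h(X)} + e^{2h(Y)},

equivalently h(X+Y) \ge \tfrac{1}{2} \log\bigl( e^{2h(X)} + e^{2h(Y)} \bigr) for independent X and Y.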
Highly Cited
2006
This correspondence gives a simple proof of Shannon's entropy-power inequality (EPI) using the relationship between mutual…
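
The mutual-information relationship alluded to in the truncated snippet is plausibly of the I-MMSE type; one standard identity of this kind, given here as background rather than as this paper's content, is

    \frac{d}{d\gamma} I\bigl( X; \sqrt{\gamma}\, X + W \bigr) = \frac{1}{2} \, \mathrm{mmse}(X, \gamma), \qquad W \sim \mathcal{N}(0, 1),

where mmse(X, γ) denotes the minimum mean-square error in estimating X from the noisy observation.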
Highly Cited
1985
A strengthened version of Shannon's entropy power inequality for the case where one of the random vectors involved is Gaussian is…
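
A well-known strengthening of this kind, stated as background (the truncated snippet does not identify the exact theorem), is the concavity of entropy power under Gaussian perturbation: for X independent of a standard Gaussian vector Z, the map

    t \mapsto N\bigl( X + \sqrt{t}\, Z \bigr)

is concave on t ≥ 0, which in particular implies the EPI when one summand is Gaussian.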
Highly Cited
1984
The entropy power inequality states that the effective variance (entropy power) of the sum of two independent random variables is…
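
To see why entropy power reads as an "effective variance" (standard computation, not from the paper): for a Gaussian X ~ N(0, σ²),

    h(X) = \frac{1}{2} \log(2\pi e \sigma^2), \qquad N(X) = \frac{1}{2\pi e} e^{2h(X)} = \sigma^2,

so the entropy power of any random variable is the variance of the Gaussian having the same differential entropy.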