
2018

When one of the random summands is Gaussian, we sharpen the entropy power inequality (EPI) in terms of the strong data processing…

2016

We tighten the entropy power inequality (EPI) when one of the random summands is Gaussian. Our strengthening is closely related…

2016

This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum S_n = Σ_{k=1}^n …

Highly Cited, 2015

The classical entropy power inequality is extended to the Rényi entropy. We also discuss the question of the existence of the…

2014

A lower bound on the Rényi differential entropy of a sum of independent random vectors is demonstrated in terms of…

2014

When two independent analog signals X and Y are added together, giving Z = X + Y, the entropy of Z, H(Z), is not a simple function of…

Highly Cited, 2011

While most useful information-theoretic inequalities can be deduced from the basic properties of entropy or mutual information…

Highly Cited, 2006

This correspondence gives a simple proof of Shannon's entropy power inequality (EPI) using the relationship between mutual…

Highly Cited, 1985

A strengthened version of Shannon's entropy power inequality for the case where one of the random vectors involved is Gaussian is…

Highly Cited, 1984

The entropy power inequality states that the effective variance (entropy power) of the sum of two independent random variables is…
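As background to the entries above: the entropy power of an n-dimensional random vector X is N(X) = exp(2h(X)/n) / (2πe), and Shannon's EPI states N(X+Y) ≥ N(X) + N(Y) for independent X, Y, with equality when both are Gaussian. A minimal numeric sanity check of that equality case (not taken from any of the listed papers; one-dimensional Gaussians, where the entropy power equals the variance):

```python
import math

def gaussian_entropy_power(var):
    # Differential entropy of a 1-D Gaussian: h(X) = 0.5 * log(2*pi*e*var).
    h = 0.5 * math.log(2 * math.pi * math.e * var)
    # Entropy power: N(X) = exp(2*h(X)) / (2*pi*e); for a Gaussian this is var.
    return math.exp(2 * h) / (2 * math.pi * math.e)

# Independent Gaussians X ~ N(0, 2) and Y ~ N(0, 3); Z = X + Y has variance 5.
n_x = gaussian_entropy_power(2.0)
n_y = gaussian_entropy_power(3.0)
n_z = gaussian_entropy_power(2.0 + 3.0)

# EPI: N(X+Y) >= N(X) + N(Y), with equality in the all-Gaussian case.
assert abs(n_z - (n_x + n_y)) < 1e-9
```

For non-Gaussian summands the inequality is strict, which is what the sharpened and Rényi variants above quantify.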