
2018

When one of the random summands is Gaussian, we sharpen the entropy power inequality (EPI) in terms of the strong data processing…

2017

An extension of the entropy power inequality to the form $N_{r}^\alpha (X+Y) \geq N_…$

Review

2016

The entropy power inequality, which plays a fundamental role in information theory and probability, may be seen as an analogue of…

2016

This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum $S_{n} = \sum_{k=1}^{n}…$

Highly Cited

2015

The classical entropy power inequality is extended to the Rényi entropy. We also discuss the question of the existence of the…

Highly Cited

2014

When two independent analog signals, X and Y, are added together, giving Z = X + Y, the entropy of Z, H(Z), is not a simple function of…
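The inequality these papers study, in its standard Shannon form (stated here for context; this statement is not part of any quoted snippet), bounds the entropy power of a sum of independent random vectors:

```latex
% Entropy power of a random vector X in R^n with differential entropy h(X):
N(X) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)}
% Shannon's entropy power inequality, for independent X and Y:
N(X + Y) \ge N(X) + N(Y)
```

Equality holds when X and Y are Gaussian with proportional covariance matrices, which is why several of the results above single out the Gaussian case.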

Highly Cited

2011

While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information…

Review

1991

Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the…

Review

1991

The role of inequalities in information theory is reviewed, and the relationship of these inequalities to inequalities in other…

Highly Cited

1985

Highly Cited

1985

A strengthened version of Shannon's entropy power inequality for the case where one of the random vectors involved is Gaussian is…
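Since Gaussians are the equality case of the entropy power inequality, the inequality can be checked numerically for that case using only closed-form entropies. A minimal sketch (assuming scalar Gaussians and differential entropy in nats; the function names are illustrative, not from any of the papers above):

```python
import math

def gaussian_entropy(var):
    # Differential entropy (in nats) of a scalar Gaussian N(0, var):
    # h = 0.5 * ln(2 * pi * e * var)
    return 0.5 * math.log(2 * math.pi * math.e * var)

def entropy_power(h):
    # Scalar entropy power: N = exp(2h) / (2 * pi * e)
    return math.exp(2 * h) / (2 * math.pi * math.e)

var_x, var_y = 1.5, 2.5
# The sum of independent Gaussians is Gaussian with summed variances.
n_x = entropy_power(gaussian_entropy(var_x))
n_y = entropy_power(gaussian_entropy(var_y))
n_z = entropy_power(gaussian_entropy(var_x + var_y))

# For a Gaussian, entropy power equals variance, so N(Z) = N(X) + N(Y) exactly.
print(n_x, n_y, n_z)
```

For non-Gaussian summands the inequality is strict, which is the gap the strengthened and Rényi versions above quantify.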