
2018

When one of the random summands is Gaussian, we sharpen the entropy power inequality (EPI) in terms of the strong data processing…
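
For reference, the classical Shannon EPI that such works sharpen states, for independent random vectors $X$ and $Y$ in $\mathbb{R}^n$ with differential entropies $h(X)$ and $h(Y)$:

    N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}, \qquad N(X+Y) \;\geq\; N(X) + N(Y),

with equality if and only if $X$ and $Y$ are Gaussian with proportional covariance matrices.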

2017

An extension of the entropy power inequality to the form $N_{r}^{\alpha}(X+Y) \geq N_{\ldots}$ …
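
Here $N_{r}$ denotes a Rényi entropy power. Under one common normalization (an assumption, since conventions for the constant differ across papers), the order-$r$ Rényi differential entropy and entropy power of a density $f_X$ on $\mathbb{R}^n$ are

    h_{r}(X) = \frac{1}{1-r} \log \int f_X(x)^{r}\, dx, \qquad N_{r}(X) = e^{2 h_{r}(X)/n},

with the Shannon quantities recovered as $r \to 1$.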

Review

2016

The entropy power inequality, which plays a fundamental role in information theory and probability, may be seen as an analogue of…
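
As a quick numerical illustration of the classical inequality stated above (a sketch; the distributions are chosen for convenience and are not taken from any of the papers listed here), in Python:

    import math

    # Scalar entropy power: N(X) = exp(2 h(X)) / (2 * pi * e).
    def entropy_power(h):
        return math.exp(2.0 * h) / (2.0 * math.pi * math.e)

    # Differential entropy of a Gaussian with variance var.
    def gaussian_entropy(var):
        return 0.5 * math.log(2.0 * math.pi * math.e * var)

    # Gaussian case: N(X) equals the variance, so the EPI holds with
    # equality for independent Gaussians (variances add).
    n_x = entropy_power(gaussian_entropy(1.0))
    n_y = entropy_power(gaussian_entropy(2.0))
    n_sum = entropy_power(gaussian_entropy(3.0))
    assert abs(n_sum - (n_x + n_y)) < 1e-12

    # Strict inequality: X, Y ~ Uniform[0, 1] have h = 0, and X + Y is
    # triangular on [0, 2] with h = 1/2, so N(X+Y) / (N(X)+N(Y)) = e/2 > 1.
    assert entropy_power(0.5) > 2.0 * entropy_power(0.0)
    print("EPI checks passed")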

2016

This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum $S_n = \sum_{k=1}^{n} \ldots$ …
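
R-EPIs of this kind are typically stated in the form

    N_{r}\left(\sum_{k=1}^{n} X_k\right) \;\geq\; c \sum_{k=1}^{n} N_{r}(X_k),

for independent summands $X_1, \dots, X_n$, where the constant $c > 0$ depends on the Rényi order $r$. The precise constant is what such papers improve; its value is omitted here since the snippet is truncated.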

Highly Cited

2013

A lower bound on the Rényi differential entropy of a sum of independent random vectors is demonstrated in terms of…

Highly Cited

2007

While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information…

Highly Cited

2006

This correspondence gives a simple proof of Shannon's entropy-power inequality (EPI) using the relationship between mutual…
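
The truncated sentence most plausibly refers to the relationship between mutual information and minimum mean-square error (MMSE) in Gaussian noise (an assumption on my part, based on the visible text). In scalar form that relation reads

    \frac{d}{d\gamma}\, I\!\left(X;\, \sqrt{\gamma}\, X + Z\right) = \frac{1}{2}\,\mathrm{mmse}(\gamma),

where $Z$ is standard Gaussian, independent of $X$, and $\mathrm{mmse}(\gamma)$ is the minimum mean-square error of estimating $X$ from $\sqrt{\gamma}\,X + Z$.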

Review

1991

The role of inequalities in information theory is reviewed, and the relationship of these inequalities to inequalities in other…

Review

1991

Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the…

Highly Cited

1985

A strengthened version of Shannon's entropy power inequality for the case where one of the random vectors involved is Gaussian is…
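
A well-known strengthening of this type is the concavity of entropy power under Gaussian perturbation (stated here as a sketch, since the snippet is truncated): for $Z$ standard Gaussian and independent of $X$,

    t \;\longmapsto\; N\!\left(X + \sqrt{t}\, Z\right) \quad \text{is concave in } t \geq 0,

a result commonly described as a strengthened form of Shannon's EPI for the case where one of the summands is Gaussian.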