# Minimum MS. E. Gerber’s Lemma

```bibtex
@article{Ordentlich2015MinimumME,
  title   = {Minimum MS. E. Gerber's Lemma},
  author  = {Or Ordentlich and Ofer Shayevitz},
  journal = {IEEE Transactions on Information Theory},
  year    = {2015},
  volume  = {61},
  pages   = {5883-5891}
}
```

Mrs. Gerber's Lemma lower bounds the entropy at the output of a binary symmetric channel in terms of the entropy of the input process. In this paper, we lower bound the output entropy via a different measure of input uncertainty, pertaining to the minimum mean squared error prediction cost of the input process. We show that in many cases our bound is tighter than the one obtained from Mrs. Gerber's Lemma. As an application, we evaluate the bound for binary hidden Markov processes, and obtain…
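For concreteness, the scalar form of Mrs. Gerber's Lemma for a BSC with crossover probability $p$ states $H(Y) \geq h(h^{-1}(H(X)) \star p)$, where $h$ is the binary entropy function and $a \star p = a(1-p) + (1-a)p$. A minimal Python sketch of this bound (function names are ours, not from the paper):

```python
import math

def h(a):
    """Binary entropy in bits; h(0) = h(1) = 0."""
    if a <= 0.0 or a >= 1.0:
        return 0.0
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a)

def h_inv(v):
    """Inverse of h restricted to [0, 1/2], computed by bisection."""
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = (lo + hi) / 2
        if h(mid) < v:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def star(a, p):
    """Binary convolution: a * p = a(1-p) + (1-a)p."""
    return a * (1 - p) + (1 - a) * p

def mgl_bound(Hx, p):
    """Mrs. Gerber's Lemma lower bound on H(Y) given H(X) = Hx."""
    return h(star(h_inv(Hx), p))
```

For an i.i.d. Bernoulli($a$) input the bound is tight: the output is Bernoulli($a \star p$), so $H(Y) = h(a \star p)$, which equals `mgl_bound(h(a), p)`.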

## 3 Citations

Novel lower bounds on the entropy rate of binary hidden Markov processes

- Mathematics, Computer Science
- 2016 IEEE International Symposium on Information Theory (ISIT)
- 2016

Recently, Samorodnitsky proved a strengthened version of Mrs. Gerber's Lemma, where the output entropy of a binary symmetric channel is bounded in terms of the average entropy of the input projected…

Bounds on Information Combining with Quantum Side Information

- Computer Science, Mathematics
- 2018 IEEE International Symposium on Information Theory (ISIT)
- 2018

This work applies the bounds to polar coding for classical-quantum channels and shows that even non-stationary channels polarize, that the blocklength required to approach the symmetric capacity scales at most sub-exponentially in the gap to capacity, and that, under the lower-bound conjecture, a blocklength polynomial in the gap suffices.

Bounds on Information Combining With Quantum Side Information

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2018

This work investigates the problem in the setting where quantum side information is available, which has been recognized as a hard setting for entropy power inequalities, and presents a non-trivial, and close to optimal, lower bound on the combined entropy.

## References

Showing 1-10 of 17 references

Entropy inequalities for discrete channels

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1974

The sharp lower bound f(x) on the per-symbol output entropy for a given per-symbol input entropy x is determined for stationary discrete memoryless channels; it is the lower convex envelope of the…

The Entropy Power Inequality and Mrs. Gerber's Lemma for Groups of Order 2n

- Computer Science, Mathematics
- IEEE Trans. Inf. Theory
- 2014

An entropy power inequality for random variables taking values in a group of order $2^{n}$ is obtained; it turns out that $f_{G}(x,y)$ is convex in $x$ for fixed $y$ and, by symmetry, convex in $y$ for fixed $x$. This is a generalization of Mrs. Gerber's Lemma.
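In the binary case ($G = \mathbb{Z}_2$) the function reduces to $f(x,y) = h(h^{-1}(x) \star h^{-1}(y))$ with $a \star b = a(1-b) + (1-a)b$, and the convexity claim can be sanity-checked numerically. A rough midpoint-convexity check on a grid (an illustration, not a proof; helper names are ours):

```python
import math

def h(a):
    # binary entropy in bits
    if a <= 0.0 or a >= 1.0:
        return 0.0
    return -a * math.log2(a) - (1 - a) * math.log2(1 - a)

def h_inv(v):
    # inverse of h on [0, 1/2], by bisection
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = (lo + hi) / 2
        if h(mid) < v:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def f(x, y):
    # f(x, y) = h(h^{-1}(x) * h^{-1}(y)): the Z_2 special case of f_G
    a, b = h_inv(x), h_inv(y)
    return h(a * (1 - b) + (1 - a) * b)

def midpoint_convex_in_x(y, xs, tol=1e-9):
    # verify f((x1+x2)/2, y) <= (f(x1,y) + f(x2,y))/2 over all grid pairs
    return all(
        f((x1 + x2) / 2, y) <= (f(x1, y) + f(x2, y)) / 2 + tol
        for x1 in xs for x2 in xs
    )
```

For example, `midpoint_convex_in_x(0.5, [i/20 for i in range(21)])` holds on this grid, consistent with the convexity statement above.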

Extension of an entropy property for binary input memoryless symmetric channels

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1989

Using the interpretation of entropy as a measure of order and randomness, the authors deduce that output sequences of memoryless symmetric channels induced by binary inputs are of a higher degree of randomness if the redundancy of the input binary sequence is spread in memory rather than in one-dimensional asymmetry.

Asymptotic filtering and entropy rate of a hidden Markov process in the rare transitions regime

- Mathematics, Computer Science
- Proceedings. International Symposium on Information Theory, 2005. ISIT 2005.
- 2005

This paper restricts its attention to the case of a two state Markov chain that is corrupted by a binary symmetric channel, and uses this approach to obtain tight estimates of the entropy rate of the process in the rare transitions regime.
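Entropy rates of such hidden Markov processes have no closed form, but they can be estimated by simulation: by the Shannon-McMillan-Breiman theorem, $-\frac{1}{n}\log_2 P(Y_1^n) \to \bar{H}(Y)$, and $P(Y_1^n)$ factors through the forward (filtering) recursion. A rough Monte Carlo sketch for a binary symmetric Markov chain with flip probability `q` observed through a BSC(`p`) (our own illustration, not code from the paper):

```python
import math
import random

def hmm_entropy_rate_mc(q, p, n=200000, seed=0):
    """Monte Carlo estimate of the entropy rate (bits/symbol) of a
    binary symmetric Markov chain (flip prob q) observed through a
    BSC(p), via -(1/n) log2 P(y_1..y_n) and the forward recursion."""
    rng = random.Random(seed)
    x = rng.randint(0, 1)      # hidden state
    pi = 0.5                   # P(x_t = 1 | y_1..y_{t-1}), stationary start
    log_p = 0.0
    for _ in range(n):
        # advance the hidden chain, then emit through the BSC
        if rng.random() < q:
            x ^= 1
        y = x ^ (1 if rng.random() < p else 0)
        # predictive probability of the new observation
        p1 = pi * (1 - p) + (1 - pi) * p   # P(y_t = 1 | past)
        py = p1 if y == 1 else 1 - p1
        log_p += math.log2(py)
        # filtering update: P(x_t = 1 | y_1..y_t)
        post = pi * ((1 - p) if y == 1 else p) / py
        # prediction step through the Markov transition
        pi = post * (1 - q) + (1 - post) * q
    return -log_p / n
```

Sanity checks: for `q = 0.5` the output is i.i.d. uniform and the estimate is exactly 1 bit/symbol, while for `p = 0` the estimate converges to the chain's entropy rate $h(q)$.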

A theorem on the entropy of certain binary sequences and applications-II

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1973

A theorem concerning the entropy of a certain sequence of binary random variables is established and this result is applied to the solution of three problems in multi-user communication.

A binary analog to the entropy-power inequality

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1990

When $(Y_n)$ are independent and identically distributed, this reduces to Mrs. Gerber's Lemma of A.D. Wyner and J. Ziv (1973).

Entropy Rate for Hidden Markov Chains with rare transitions

- Mathematics, Computer Science
- ArXiv
- 2010

We consider Hidden Markov Chains obtained by passing a Markov Chain with rare transitions through a noisy memoryless channel. We obtain asymptotic estimates for the entropy of the resulting Hidden…

Elements of Information Theory

- Engineering, Computer Science
- 1991

The author examines the role of entropy, inequalities, and randomness in the design and construction of codes in a rapidly changing environment.

The common information of two dependent random variables

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1975

The main result of the paper is contained in two theorems which show that $C(X;Y)$ is i) the minimum $R_0$ such that a sequence of independent copies of $(X,Y)$ can be efficiently encoded into three binary streams $W_0, W_1, W_2$ with rates $R_0, R_1, R_2$.

Network Information Theory

- Computer Science
- 2011

Key topics covered include successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying, and asynchronous and random access channels.