Minimum MSE Gerber’s Lemma

@article{Ordentlich2015MinimumME,
  title={Minimum MSE Gerber’s Lemma},
  author={Or Ordentlich and Ofer Shayevitz},
  journal={IEEE Transactions on Information Theory},
  year={2015},
  volume={61},
  pages={5883-5891}
}
Mrs. Gerber's Lemma lower bounds the entropy at the output of a binary symmetric channel in terms of the entropy of the input process. In this paper, we lower bound the output entropy via a different measure of input uncertainty, pertaining to the minimum mean squared error prediction cost of the input process. We show that in many cases our bound is tighter than the one obtained from Mrs. Gerber's Lemma. As an application, we evaluate the bound for binary hidden Markov processes, and obtain… 
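For context, the classical statement being referenced is due to Wyner and Ziv (1973): writing $h$ for the binary entropy function, $h^{-1}$ for its inverse on $[0, 1/2]$, and $a \star p = a(1-p) + (1-a)p$ for binary convolution, Mrs. Gerber's Lemma asserts that the output $Y^n$ of a binary symmetric channel with crossover probability $p$ and input $X^n$ satisfies

  $\frac{1}{n} H(Y^n) \ge h\big(h^{-1}\big(\frac{1}{n} H(X^n)\big) \star p\big)$.

(This is the standard textbook form; the paper's new MMSE-based bound is not reproduced here.)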
Novel lower bounds on the entropy rate of binary hidden Markov processes
  • Or Ordentlich
  • Mathematics, Computer Science
    2016 IEEE International Symposium on Information Theory (ISIT)
  • 2016
Recently, Samorodnitsky proved a strengthened version of Mrs. Gerber's Lemma, where the output entropy of a binary symmetric channel is bounded in terms of the average entropy of the input projected on a random subset of coordinates.
Bounds on Information Combining with Quantum Side Information
  • C. Hirche, D. Reeb
  • Computer Science, Mathematics
    2018 IEEE International Symposium on Information Theory (ISIT)
  • 2018
TLDR
This work applies the bounds to polar coding for classical-quantum channels and shows that even non-stationary channels polarize; the blocklength required to approach the symmetric capacity scales at most sub-exponentially in the gap to capacity, and under the conjectured lower bound a blocklength polynomial in the gap suffices.
Bounds on Information Combining With Quantum Side Information
  • C. Hirche, D. Reeb
  • Mathematics, Computer Science
    IEEE Transactions on Information Theory
  • 2018
TLDR
This work investigates the problem in the setting where quantum side information is available, which has been recognized as a hard setting for entropy power inequalities, and presents a non-trivial, close-to-optimal lower bound on the combined entropy.

References

Entropy inequalities for discrete channels
The sharp lower bound f(x) on the per-symbol output entropy for a given per-symbol input entropy x is determined for stationary discrete memoryless channels; it is the lower convex envelope of the corresponding bound for a single channel use.
The Entropy Power Inequality and Mrs. Gerber's Lemma for Groups of Order $2^n$
TLDR
An entropy power inequality is obtained for random variables taking values in a group of order $2^{n}$; it turns out that $f_{G}(x,y)$ is convex in $x$ for fixed $y$ and, by symmetry, convex in $y$ for fixed $x$, which generalizes Mrs. Gerber's Lemma.
Extension of an entropy property for binary input memoryless symmetric channels
TLDR
Using the interpretation of entropy as a measure of order and randomness, the authors deduce that output sequences of memoryless symmetric channels driven by binary inputs exhibit a higher degree of randomness when the redundancy of the binary input sequence resides in memory rather than in one-dimensional asymmetry.
Asymptotic filtering and entropy rate of a hidden Markov process in the rare transitions regime
TLDR
This paper restricts its attention to the case of a two-state Markov chain corrupted by a binary symmetric channel, and uses this approach to obtain tight estimates of the entropy rate of the process in the rare-transitions regime.
A theorem on the entropy of certain binary sequences and applications-II
  • A. Wyner
  • Mathematics, Computer Science
    IEEE Trans. Inf. Theory
  • 1973
TLDR
A theorem concerning the entropy of a certain sequence of binary random variables is established and this result is applied to the solution of three problems in multi-user communication.
A binary analog to the entropy-power inequality
TLDR
When $(Y_n)$ are independent identically distributed, this reduces to Mrs. Gerber's Lemma from A.D. Wyner and J. Ziv (1973).
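To make this function concrete, here is a minimal numerical sketch (assuming Python with NumPy and SciPy; all function names are illustrative and not from either paper) that evaluates the Mrs. Gerber's Lemma bound $h(h^{-1}(x) \star p)$:

  # Minimal sketch of the Mrs. Gerber's Lemma function f(x, p) = h(h^{-1}(x) * p).
  # Illustrative only; this is not code from either paper.
  import numpy as np
  from scipy.optimize import brentq

  def binary_entropy(a):
      # Binary entropy h(a) in bits, with h(0) = h(1) = 0.
      if a <= 0.0 or a >= 1.0:
          return 0.0
      return -a * np.log2(a) - (1 - a) * np.log2(1 - a)

  def inverse_binary_entropy(x):
      # h is strictly increasing on [0, 1/2], so the inverse is unique there.
      return brentq(lambda a: binary_entropy(a) - x, 0.0, 0.5)

  def mgl_bound(x, p):
      # Lower bound on the per-symbol output entropy of a BSC(p)
      # when the per-symbol input entropy is x (in bits).
      a = inverse_binary_entropy(x)
      return binary_entropy(a * (1 - p) + (1 - a) * p)

  print(mgl_bound(0.5, 0.1))  # input entropy 0.5 bit, crossover 0.1 -> about 0.70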
Entropy Rate for Hidden Markov Chains with rare transitions
We consider Hidden Markov Chains obtained by passing a Markov Chain with rare transitions through a noisy memoryless channel. We obtain asymptotic estimates for the entropy of the resulting Hidden Markov Chain.
Elements of Information Theory
TLDR
The authors examine the role of entropy, inequalities, and randomness in the design and construction of codes.
The common information of two dependent random variables
  • A. Wyner
  • Mathematics, Computer Science
    IEEE Trans. Inf. Theory
  • 1975
TLDR
The main result of the paper is contained in two theorems which show that $C(X;Y)$ is i) the minimum $R_0$ such that a sequence of independent copies of $(X, Y)$ can be efficiently encoded into three binary streams $W_0, W_1, W_2$ with rates $R_0, R_1, R_2$.
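In standard notation, the quantity studied here is Wyner's common information, $C(X;Y) = \min I(X,Y;W)$, with the minimum taken over all auxiliary variables $W$ such that $X - W - Y$ forms a Markov chain (i.e., $X$ and $Y$ are conditionally independent given $W$).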
Network Information Theory
TLDR
Key topics covered include successive cancellation and superposition coding, MIMO wireless communication, network coding, cooperative relaying, and asynchronous and random-access channels.