# Communication requirements for generating correlated random variables

@article{Cuff2008CommunicationRF, title={Communication requirements for generating correlated random variables}, author={Paul W. Cuff}, journal={2008 IEEE International Symposium on Information Theory}, year={2008}, pages={1393-1397} }

Two familiar notions of correlation are re-discovered as extreme operating points for simulating a discrete memoryless channel, in which a channel output is generated based only on a description of the channel input. Wyner's "common information" coincides with the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This work characterizes the optimal…
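The two extreme operating points described in the abstract can be summarized in information-theoretic notation (a sketch; here R_0 denotes the rate of common randomness and R(R_0) the minimum description rate, following the abstract rather than the paper's exact notation):

```latex
% No common randomness: the rate needed is Wyner's common information.
% Unlimited common randomness: the rate drops to Shannon's mutual information.
R(0) = C(X;Y), \qquad \lim_{R_0 \to \infty} R(R_0) = I(X;Y),
\qquad \text{with } I(X;Y) \le C(X;Y).
```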

## 78 Citations

### Distributed Channel Synthesis

- Computer Science
- IEEE Transactions on Information Theory
- 2013

This paper characterizes the optimal tradeoff between the amount of common randomness used and the required rate of description and generalizes and strengthens a soft covering lemma, known in the literature for its role in quantifying the resolvability of a channel.
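The soft covering lemma mentioned above can be stated in one common form (a sketch; exact constants and regularity conditions are in the cited paper):

```latex
% Soft covering: a random codebook C of rate R, with 2^{nR} codewords
% U^n(m) drawn i.i.d. from p_U and passed through the channel p_{V|U},
% induces an output distribution close to the i.i.d. target:
\mathbb{E}_{\mathcal{C}}
\left\| P_{V^n \mid \mathcal{C}} - \prod_{i=1}^{n} p_V \right\|_{TV}
\to 0 \quad \text{as } n \to \infty, \qquad \text{whenever } R > I(U;V).
```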

### Minimal public communication for maximum rate secret key generation

- Computer Science
- 2011 IEEE International Symposium on Information Theory Proceedings
- 2011

It is argued that optimum rate secret key generation is linked inherently to Wyner's notion of common information between two dependent random variables, and the minimum rate of interactive public communication required to generate an optimum rate secret key is characterized in terms of a variant of this notion of common information.

### Output Constrained Lossy Source Coding With Limited Common Randomness

- Computer Science
- IEEE Transactions on Information Theory
- 2015

This paper studies a Shannon-theoretic version of the generalized distribution preserving quantization problem where a stationary and memoryless source is encoded subject to a distortion constraint…

### Sufficient conditions for the equality of exact and Wyner common information

- Computer Science
- 2016 54th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
- 2016

Though the conditions are implicit, this work proves the equality of Wyner and exact common information for the generalized binary Z-source, generalized erasure source and the noisy typewriter source by establishing that these sources meet either of these conditions.

### Appendix B: Computational complexity of reverse channel coding

- Computer Science
- 2020

No polynomial-time algorithm exists that communicates a sample by simulating a large number of random variables Zn ∼ p and then identifying an index N∗ such that ZN∗ is distributed according to q, at least approximately in the total variation sense.
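The scheme whose complexity is being discussed (draw many candidates from p, transmit only the index of one that looks like a sample from q) can be sketched in a few lines. The function name and the likelihood-ratio selection rule below are illustrative assumptions, not the specific algorithm analyzed in the appendix:

```python
import random

def select_index(p, q, n, rng):
    """Draw Z_1..Z_n i.i.d. from the proposal p, then pick an index N*
    with probability proportional to the importance weight q(z)/p(z).
    For large n the selected sample Z_{N*} is approximately distributed
    according to the target q; only the index N* must be communicated."""
    symbols = list(p)
    zs = rng.choices(symbols, weights=[p[s] for s in symbols], k=n)
    weights = [q[z] / p[z] for z in zs]
    r = rng.random() * sum(weights)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if acc >= r:
            return i, zs[i]
    return n - 1, zs[-1]

# Toy check: the target q is skewed while the proposal p is uniform.
p = {0: 0.5, 1: 0.5}
q = {0: 0.8, 1: 0.2}
rng = random.Random(0)
freq = sum(select_index(p, q, 500, rng)[1] == 0 for _ in range(2000)) / 2000
```

The obstruction discussed in the appendix is that the number of candidates n needed for an accurate match generally grows exponentially, which is why no polynomial-time variant of this idea is available.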

### The Quantum Reverse Shannon Theorem and Resource Tradeoffs for Simulating Quantum Channels

- Computer Science
- IEEE Transactions on Information Theory
- 2014

The amounts of communication and auxiliary resources needed in both the classical and quantum cases, the tradeoffs among them, and the loss of simulation efficiency when auxiliary resources are absent or insufficient are established.

### Algorithms for the Communication of Samples

- Computer Science
- ICML
- 2022

This work introduces ordered random coding (ORC) which uses a simple trick to reduce the coding cost of previous approaches and describes a hybrid coding scheme which uses dithered quantization to more efficiently communicate samples from distributions with bounded support.
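The dithered-quantization idea mentioned in the abstract can be sketched as follows; the step size, function names, and the uniform dither below are illustrative assumptions, not the paper's actual scheme:

```python
import random

def dithered_quantize(x, step, dither):
    """Encoder: both parties share a dither u ~ Uniform(-step/2, step/2);
    only the integer index k is communicated."""
    return round((x + dither) / step)

def dithered_reconstruct(k, step, dither):
    """Decoder: subtracting the same dither makes the reconstruction error
    bounded by step/2 and, in distribution, independent of x."""
    return k * step - dither

# The shared dither plays the role of common randomness: both sides
# generate it from a shared seed, so it costs no communication.
rng = random.Random(1)
step = 0.25
errors = []
for _ in range(1000):
    x = rng.uniform(-10.0, 10.0)
    u = rng.uniform(-step / 2, step / 2)
    x_hat = dithered_reconstruct(dithered_quantize(x, step, u), step, u)
    errors.append(x_hat - x)
max_abs = max(abs(e) for e in errors)
```

The connection to the surveyed paper's theme is that the shared dither is exactly a form of common randomness available to encoder and decoder at no description cost.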

### Quantum channels and memory effects

- Computer Science
- 2014

The study of memory effects in quantum channels is a fertile ground where interesting novel phenomena emerge at the intersection of quantum information theory and other branches of physics.

### Quantum-to-classical rate distortion coding

- Computer Science
- ArXiv
- 2012

A single-letter formula is derived for the minimum rate of classical communication needed for quantum-to-classical rate distortion coding in this setting, in which a sender Alice has many copies of a quantum information source.

### Simulation of a Channel With Another Channel

- Computer Science
- IEEE Transactions on Information Theory
- 2017

This paper fully characterizes when a binary symmetric channel can be simulated from a binary erasure channel when there is no shared randomness, and introduces a notion of "channel diameter," which is shown to be additive and to satisfy a data processing inequality.

## References


### Approximation Theory of Output Statistics

- Computer Science, Mathematics
- Proceedings, IEEE International Symposium on Information Theory
- 1993

The notion of resolvability of a channel is introduced, defined as the number of random bits required per channel use in order to generate an input that achieves arbitrarily accurate approximation of the output statistics for any given input process, and a general formula is obtained which holds regardless of the channel memory structure.

### Entanglement-assisted capacity of a quantum channel and the reverse Shannon theorem

- Computer Science
- IEEE Trans. Inf. Theory
- 2002

In the classical analog of entanglement-assisted communication - communication over a discrete memoryless channel (DMC) between parties who share prior random information - one parameter is sufficient: in the presence of prior shared random information, all DMCs of equal capacity can simulate one another with unit asymptotic efficiency.

### The common information of two dependent random variables

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1975

The main result of the paper is contained in two theorems which show that C(X;Y) is i) the minimum R_0 such that a sequence of independent copies of (X, Y) can be efficiently encoded into three binary streams W_0, W_1, W_2 with rates R_0, R_1, R_2.
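Wyner's common information referred to here has the standard characterization (a sketch of the usual statement; W ranges over auxiliary variables making X and Y conditionally independent):

```latex
% Wyner's common information of a pair (X, Y):
C(X;Y) = \min_{\substack{p(w \mid x,y) \,:\, X - W - Y}} I(X,Y;\, W),
\qquad I(X;Y) \le C(X;Y) \le \min\{H(X),\, H(Y)\}.
```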

### Capacity of Coordinated Actions

- Computer Science
- 2007 IEEE International Symposium on Information Theory
- 2007

This work poses the problem of coordinating actions among many nodes via distributed communication, and solves most three-node problems, though one remains open.

### Common Information is Far Less Than Mutual Information

- Problems of Control and Info. Theory, vol. 2, pp. 149-162, 1973.
- 1973
