# Coordination Using Individually Shared Randomness

```bibtex
@article{Kurri2018CoordinationUI,
  title   = {Coordination Using Individually Shared Randomness},
  author  = {Gowtham R. Kurri and V. Prabhakaran and A. Sarwate},
  journal = {2018 IEEE International Symposium on Information Theory (ISIT)},
  year    = {2018},
  pages   = {2550-2554}
}
```

Two processors output correlated sequences using the help of a coordinator with whom they individually share independent randomness. For the case of unlimited shared randomness, we characterize the rate of communication required from the coordinator to the processors over a broadcast link. We also give an achievable trade-off between the communication and shared randomness rates.
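To make the setup concrete, here is a minimal toy sketch in Python, not the paper's actual coding scheme: a coordinator shares an independent uniform bit string with each processor (the "individually shared randomness"), samples target correlated bit pairs, and broadcasts one-time-padded versions of them; each processor can unmask only its own component. The function name, the pad-based construction, and the example joint distribution are all illustrative assumptions.

```python
import random

def coordinate(n, p_joint):
    # Individually shared randomness: the coordinator shares s1 with
    # processor 1 and s2 with processor 2 (independent uniform bits).
    s1 = [random.randint(0, 1) for _ in range(n)]
    s2 = [random.randint(0, 1) for _ in range(n)]

    # The coordinator samples target correlated bits (x, y) from the
    # desired joint distribution, one-time-pads each component with the
    # corresponding shared randomness, and broadcasts both pads.
    broadcast = []
    for _ in range(n):
        x, y = random.choices(list(p_joint),
                              weights=list(p_joint.values()))[0]
        broadcast.append((x ^ s1[len(broadcast)], y ^ s2[len(broadcast)]))

    # Each processor unmasks only its own component of the broadcast.
    out1 = [m1 ^ s1[i] for i, (m1, _) in enumerate(broadcast)]
    out2 = [m2 ^ s2[i] for i, (_, m2) in enumerate(broadcast)]
    return out1, out2

# Target joint distribution on bit pairs: P(x, y).
p = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}
a, b = coordinate(1000, p)
agree = sum(x == y for x, y in zip(a, b)) / len(a)
print(f"empirical agreement: {agree:.2f}")  # close to 0.8 for this joint
```

The one-time pads illustrate why individually shared randomness matters here: the broadcast message alone reveals neither processor's output, while the pair of outputs still follows the target joint distribution.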

#### 3 Citations

Coordination via Shared Randomness

- Computer Science
- 2019 IEEE Information Theory Workshop (ITW)
- 2019

We study a distributed sampling problem where a set of processors want to output correlated sequences of random variables with the help of a coordinator which has access to several independent…

Optimal Communication Rates and Combinatorial Properties for Distributed Simulation

- Mathematics
- 2019

We study the distributed simulation problem where $n$ players aim to generate the *same* sequence of random coin flips, where some subsets of the players share an independent common coin which can…

Optimal Communication Rates for Zero-Error Distributed Simulation under Blackboard Communication Protocols

- Computer Science, Mathematics
- ArXiv
- 2019

If the size-$k$ subsets with common coins contain a path-connected cluster of topologically connected components, this work proposes a communication scheme which achieves the optimal rate of communication.

#### References

Showing 1-10 of 12 references

Coordination Capacity

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2010

This work asks what dependence can be established among the nodes of a communication network given the communication constraints, and develops elements of a theory of cooperation and coordination in networks.

Assisted sampling of correlated sources

- Computer Science
- 2013 IEEE International Symposium on Information Theory
- 2013

This work studies a distributed sampling scenario in which two agents, each observing a component of a correlated source, must each generate a component of a second correlated source, thereby producing correlated outputs.

Channel Simulation via Interactive Communications

- Computer Science, Mathematics
- IEEE Transactions on Information Theory
- 2015

The problem of channel simulation via interactive communication, known as the coordination capacity, is studied in a two-terminal network. An exact computable characterization of the multiround problem is provided, employing the technique of output statistics of random binning recently developed by the authors.

Source coding for a simple network

- Computer Science
- 1974

This work considers the problem of source coding subject to a fidelity criterion for a simple network connecting a single source with two receivers via a common channel and two private channels and develops several upper and lower bounds that actually yield a portion of the desired region.

Distributed Channel Synthesis

- Mathematics, Computer Science
- IEEE Transactions on Information Theory
- 2013

This paper characterizes the optimal tradeoff between the amount of common randomness used and the required rate of description and generalizes and strengthens a soft covering lemma, known in the literature for its role in quantifying the resolvability of a channel.

Network Information Theory

- Computer Science
- 2001

A system with many senders and receivers contains many new elements in the communication problem: interference, cooperation and feedback. These are the issues that are the domain of network…

Achievability Proof via Output Statistics of Random Binning

- Computer Science, Mathematics
- IEEE Transactions on Information Theory
- 2014

A new and broadly applicable framework for establishing achievability results in network information theory problems that uses random binning arguments and is based on a duality between channel and source coding problems, which allows coordination and strong secrecy results to be proved.

Entanglement-assisted capacity of a quantum channel and the reverse Shannon theorem

- Mathematics, Physics
- IEEE Trans. Inf. Theory
- 2002

In the classical analog of entanglement-assisted communication, i.e., communication over a discrete memoryless channel (DMC) between parties who share prior random information, one parameter suffices: in the presence of prior shared random information, all DMCs of equal capacity can simulate one another with unit asymptotic efficiency.

Generating dependent random variables over networks

- Mathematics, Computer Science
- 2011 IEEE Information Theory Workshop
- 2011

New inner and outer bounds on the achievable rates for networks with two nodes are proved.

The common information of two dependent random variables

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 1975

The main result of the paper is contained in two theorems which show that $C(X;Y)$ is i) the minimum $R_0$ such that a sequence of independent copies of $(X,Y)$ can be efficiently encoded into three binary streams $W_0, W_1, W_2$ with rates $R_0, R_1, R_2$…
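For reference, the quantity $C(X;Y)$ discussed in this abstract is Wyner's common information, which admits the single-letter characterization

$$
C(X;Y) \;=\; \min_{\substack{P_{W \mid XY}\,:\; X - W - Y}} I(X,Y;\, W),
$$

where the minimum is over all auxiliary variables $W$ making $X$ and $Y$ conditionally independent given $W$ (the Markov chain $X - W - Y$).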