Coordination Using Individually Shared Randomness

@inproceedings{Kurri2018CoordinationUI,
  title={Coordination Using Individually Shared Randomness},
  author={Gowtham R. Kurri and Vinod M. Prabhakaran and Anand D. Sarwate},
  booktitle={2018 IEEE International Symposium on Information Theory (ISIT)},
  year={2018},
  pages={2550--2554}
}
Two processors output correlated sequences with the help of a coordinator with whom they individually share independent randomness. For the case of unlimited shared randomness, we characterize the rate of communication required from the coordinator to the processors over a broadcast link. We also give an achievable trade-off between the communication and shared randomness rates.
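To make the setting concrete, below is a minimal toy simulation in Python of the setup described above. It is only a sketch under simplifying assumptions (binary alphabets, a fixed target correlation, an uncompressed one-time-pad broadcast; the names coordinator and processor are invented here for illustration); the paper's actual schemes compress the broadcast message down to the characterized optimal rate.

import random

def coordinator(w1, w2, rng):
    """Toy coordinator: draws the target correlated pair sequence and
    broadcasts it, masking each component with the randomness shared
    with the corresponding processor (illustrative, uncompressed)."""
    broadcast = []
    for b1, b2 in zip(w1, w2):
        x = rng.randint(0, 1)
        y = x if rng.random() < 0.9 else 1 - x  # target: P(X = Y) = 0.9
        broadcast.append((x ^ b1, y ^ b2))
    return broadcast

def processor(broadcast, w, idx):
    """Each processor unmasks its own component of the common broadcast
    using its individually shared randomness."""
    return [pair[idx] ^ b for pair, b in zip(broadcast, w)]

rng = random.Random(0)
n = 100_000
w1 = [rng.randint(0, 1) for _ in range(n)]  # shared with processor 1 only
w2 = [rng.randint(0, 1) for _ in range(n)]  # shared with processor 2 only
msg = coordinator(w1, w2, rng)              # one broadcast seen by both
x_seq = processor(msg, w1, 0)
y_seq = processor(msg, w2, 1)
agreement = sum(x == y for x, y in zip(x_seq, y_seq)) / n
print(f"empirical P(X = Y) ~ {agreement:.3f}")  # close to 0.9

Masking each component with the pad known only to its intended processor is what makes the randomness individually shared: each processor recovers only its own component from the common broadcast.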
Coordination via Shared Randomness
We study a distributed sampling problem where a set of processors wants to output correlated sequences of random variables with the help of a coordinator which has access to several independent sources of randomness.
Optimal Communication Rates and Combinatorial Properties for Distributed Simulation
We study the distributed simulation problem where $n$ players aim to generate the same sequences of random coin flips, where some subsets of the players share an independent common coin that can be tossed multiple times.
Optimal Communication Rates for Zero-Error Distributed Simulation under Blackboard Communication Protocols
If the size-$k$ subsets with common coins contain a path-connected cluster of topologically connected components, this work proposes a communication scheme which achieves the optimal rate of communication.

References

SHOWING 1-10 OF 12 REFERENCES
Coordination Capacity
This work asks what dependence can be established among the nodes of a communication network given the communication constraints, and develops elements of a theory of cooperation and coordination in networks.
Assisted sampling of correlated sources
This work studies a distributed sampling scenario in which two agents, observing components of a correlated source, must each generate a component of a second correlated source so as to produce correlation in their outputs.
Channel Simulation via Interactive Communications
The problem of channel simulation via interactive communication (the coordination capacity) in a two-terminal network is studied; an exact computable characterization of the multi-round problem is provided using the technique of output statistics of random binning recently developed by the authors.
Source coding for a simple network
This work considers the problem of source coding subject to a fidelity criterion for a simple network connecting a single source with two receivers via a common channel and two private channels, and develops several upper and lower bounds that yield a portion of the desired region.
Distributed Channel Synthesis
  • P. Cuff
  • Mathematics, Computer Science
  • IEEE Transactions on Information Theory
  • 2013
This paper characterizes the optimal tradeoff between the amount of common randomness used and the required rate of description, and generalizes and strengthens a soft covering lemma, known in the literature for its role in quantifying the resolvability of a channel.
Network Information Theory
A system with many senders and receivers contains many new elements in the communication problem: interference, cooperation, and feedback. These are the issues that are the domain of network information theory.
Achievability Proof via Output Statistics of Random Binning
This work introduces a new and widely applicable framework for establishing achievability results in network information theory, based on random binning arguments and a duality between channel and source coding problems, which allows for proving coordination and strong secrecy results.
Entanglement-assisted capacity of a quantum channel and the reverse Shannon theorem
In the classical analog of entanglement-assisted communication, namely communication over a discrete memoryless channel (DMC) between parties who share prior random information, one parameter is sufficient: in the presence of prior shared randomness, all DMCs of equal capacity can simulate one another with unit asymptotic efficiency.
Generating dependent random variables over networks
New inner and outer bounds on the achievable rates for networks with two nodes are proved.
The common information of two dependent random variables
  • A. Wyner
  • Mathematics, Computer Science
  • IEEE Trans. Inf. Theory
  • 1975
The main result of the paper is contained in two theorems which show that $C(X;Y)$ is (i) the minimum $R_0$ such that a sequence of independent copies of $(X,Y)$ can be efficiently encoded into three binary streams $W_0, W_1, W_2$ with rates $R_0, R_1, R_2$, respectively.
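For context, the quantity in question is Wyner's common information, standardly defined as $C(X;Y) = \min I(X,Y;W)$, where the minimum is over all auxiliary variables $W$ such that $X - W - Y$ forms a Markov chain, i.e., such that $X$ and $Y$ are conditionally independent given $W$.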