• Corpus ID: 741979

# Maximum Entropy Functions: Approximate Gacs-Korner for Distributed Compression

@article{Salamatian2016MaximumEF,
title={Maximum Entropy Functions: Approximate Gacs-Korner for Distributed Compression},
author={Salman Salamatian and Asaf Cohen and Muriel M{\'e}dard},
journal={ArXiv},
year={2016},
volume={abs/1604.03877}
}
• Published 13 April 2016
• Computer Science, Mathematics
• ArXiv
Consider two correlated sources X and Y generated from a joint distribution $p_{X,Y}$. Their Gács-Körner common information, a measure of common information that exploits the combinatorial structure of the distribution $p_{X,Y}$, leads to a source decomposition that exhibits the latent common parts of X and Y. Using this source decomposition we construct an efficient distributed compression scheme, which can also be used in the network setting. Then, we relax the combinatorial…
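The combinatorial structure the abstract refers to can be made concrete: the Gács-Körner common part of $(X, Y)$ is the label of the connected component, in the bipartite graph whose edges are the pairs $(x, y)$ with $p(x, y) > 0$, and the common information is the entropy of that label. The following is a minimal illustrative sketch (my own, not the paper's code), under the assumption that the joint distribution is given as a dictionary:

```python
from collections import defaultdict
from math import log2

def gacs_korner(p_xy):
    """p_xy: dict mapping (x, y) -> probability.  Returns H(W), where W is
    the Gacs-Korner common part: the connected component of the bipartite
    graph with an edge (x, y) whenever p(x, y) > 0."""
    # Union-find over symbols, tagged to keep X- and Y-alphabets disjoint.
    parent = {}
    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    def union(a, b):
        parent[find(a)] = find(b)

    for (x, y), p in p_xy.items():
        if p > 0:
            union(('x', x), ('y', y))

    # Probability mass of each connected component = P(W = that component).
    mass = defaultdict(float)
    for (x, y), p in p_xy.items():
        mass[find(('x', x))] += p

    return -sum(p * log2(p) for p in mass.values() if p > 0)

# Example: X and Y deterministically share one bit (x in {0,2} tells which
# component we are in), so the common information is exactly 1 bit.
p = {(0, 0): 0.25, (0, 1): 0.25, (2, 2): 0.25, (2, 3): 0.25}
print(gacs_korner(p))  # -> 1.0
```

Note that this quantity is highly sensitive to the support of $p_{X,Y}$: adding a single low-probability edge can merge components, which is precisely the brittleness the paper's relaxation addresses.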
## 7 Citations


We design a low-complexity distributed compression scheme for computing arbitrary functions of sources with discrete alphabets. We use a helper-based method that extends the definition of the…
• Computer Science
2016 International Symposium on Information Theory and Its Applications (ISITA)
• 2016
This work introduces a novel notion of separation of source and network coding using Gács-Körner Common Information (CI), where the sufficient condition for this separation to hold depends on the source structure rather than the network topology.
• Computer Science
ArXiv
• 2022
The notion of common information is a variational relaxation of the Gács-Körner common information, which the authors recover as a special case, but is more amenable to optimization and can be approximated empirically using samples from the underlying distribution.
• Jonathan Ponniah
• Mathematics
2022 58th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
• 2022
The work of Witsenhausen explores conditions under which two non-interactive users, observing different coordinates of an i.i.d. random process, can reach asymptotic agreement. Witsenhausen considers…
• Computer Science
Entropy
• 2019
A surprising connection is revealed between third-order connected information, the two-way secret-key agreement rate, and synergy; the use of a consistent PID quantified using a secret-key agreement rate naturally induces a directional interpretation of the PID.
• Computer Science
IEEE Transactions on Information Forensics and Security
• 2020
This work considers a non-stochastic privacy-preserving problem in which an adversary aims to infer sensitive information from publicly accessible data without using statistics, and proposes a corresponding agglomerative clustering algorithm that converges to a locally optimal quantization solution.

## References

SHOWING 1-10 OF 12 REFERENCES

• Computer Science
2016 International Symposium on Information Theory and Its Applications (ISITA)
• 2016
This work introduces a novel notion of separation of source and network coding using Gács-Körner Common Information (CI), where the sufficient condition for this separation to hold depends on the source structure rather than the network topology.
• Computer Science
IEEE Trans. Inf. Theory
• 1973
The minimum number of bits per character $R_X$ and $R_Y$ needed to encode these sequences so that they can be faithfully reproduced under a variety of assumptions regarding the encoders and decoders is determined.
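The rate region described in this reference (Slepian-Wolf, 1973) is characterized by three entropy bounds: $R_X \geq H(X|Y)$, $R_Y \geq H(Y|X)$, and $R_X + R_Y \geq H(X,Y)$. A small illustrative sketch (my own, for a joint distribution given as a dictionary) computes these bounds:

```python
from math import log2

def sw_bounds(p_xy):
    """Return the Slepian-Wolf bounds (H(X|Y), H(Y|X), H(X,Y)) for a joint
    distribution p_xy given as a dict (x, y) -> probability."""
    def H(dist):
        return -sum(p * log2(p) for p in dist.values() if p > 0)
    # Marginals of X and Y.
    px, py = {}, {}
    for (x, y), p in p_xy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    h_xy = H(p_xy)
    # Conditional entropies via the chain rule: H(X|Y) = H(X,Y) - H(Y).
    return h_xy - H(py), h_xy - H(px), h_xy

# Doubly symmetric binary source with crossover probability 0.25.
p = {(0, 0): 0.375, (0, 1): 0.125, (1, 0): 0.125, (1, 1): 0.375}
hx_given_y, hy_given_x, hxy = sw_bounds(p)
```

For this symmetric example the corner points of the region are at rate pairs $(H(X|Y), H(Y)) = (h(0.25), 1)$ and its mirror image, where $h(\cdot)$ is the binary entropy function.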
• Computer Science
IEEE Trans. Inf. Theory
• 2003
The minimum zero-error asymptotic rate of transmission is shown to be the complementary graph entropy of an associated graph, and upper and lower bounds for this minimum rate are provided.
• Computer Science
IEEE Transactions on Information Theory
• 2014
The region of tension developed in this paper measures how well the dependence between a pair of random variables can be resolved by a piece of common information.
• Mathematics
2014 IEEE Information Theory Workshop (ITW 2014)
• 2014
It is shown that, under certain symmetry conditions, the principal inertia components play an important role in estimating one-bit functions of X, namely f(X), given an observation of Y, which naturally leads to the conjecture that the mutual information between f(X) and Y is maximized when all the principal inertia components have equal value.
A discrete random variable X is to be transmitted by means of a discrete signal such that the probability of error is exactly zero, and the problem is to minimize the signal's alphabet size.
• Computer Science
IEEE Transactions on Information Theory
• 2006
This correspondence considers the problem of distributed source coding of multiple sources over a network with multiple receivers and shows that the problem with two sources and two receivers is always separable.
The generalized random variables $(x, y)$ have a given joint distribution. Pairs $(x_i, y_i)$ are drawn independently. The observer of $(x_1, \cdots, x_n)$ and the observer of $(y_1, \cdots, y_n)$ …
In this chapter we deal with the following nonlinear fractional programming problem: $$P:\ \max_{x \in S} q(x) = (f(x) + \alpha)/(g(x) + \beta)$$ where $f, g: \mathbb{R}^n \to \mathbb{R}$ and $\alpha, \beta \in \mathbb{R}$ …
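The standard route to ratio objectives of this form is Dinkelbach's parametric method, which repeatedly solves the linearized problem $\max_{x \in S} f(x) - \lambda\, g(x)$ and updates $\lambda$ to the current ratio until the inner optimum hits zero. A toy sketch (my own illustration over a finite feasible set, not the chapter's algorithm; $\alpha$ and $\beta$ are folded into $f$ and $g$):

```python
def dinkelbach(S, f, g, tol=1e-12):
    """Maximize f(x)/g(x) over a finite set S (g > 0 on S) by Dinkelbach's
    parametric method.  Returns (optimal x, optimal ratio)."""
    lam = 0.0
    while True:
        # Inner problem: maximize the parametric objective f(x) - lam*g(x).
        x = max(S, key=lambda z: f(z) - lam * g(z))
        val = f(x) - lam * g(x)
        if abs(val) < tol:
            return x, lam  # at convergence, lam is the optimal ratio
        lam = f(x) / g(x)  # update lambda to the current ratio

# Maximize (x + 1) / x^2 over {1, ..., 5}: the optimum is x = 1, ratio 2.
S = [1, 2, 3, 4, 5]
x_opt, ratio = dinkelbach(S, f=lambda x: x + 1, g=lambda x: x * x)
```

The same update rule applies with any inner solver (e.g. an LP or combinatorial oracle) in place of the brute-force `max` over `S`.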
• D. Spielman
• Mathematics
48th Annual IEEE Symposium on Foundations of Computer Science (FOCS'07)
• 2007
This tutorial will try to provide some intuition as to why these eigenvectors and eigenvalues have combinatorial significance, and will survey some of their applications.
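One classical instance of the combinatorial significance alluded to in this reference: the sign pattern of the Laplacian eigenvector for the second-smallest eigenvalue (the Fiedler vector) recovers a graph's natural two-way partition. A small sketch of my own, using a graph of two triangles joined by a single edge:

```python
import numpy as np

# Two triangles {0,1,2} and {3,4,5} joined by the bridge edge (2, 3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6

# Build the graph Laplacian L = D - A.
L = np.zeros((n, n))
for i, j in edges:
    L[i, i] += 1
    L[j, j] += 1
    L[i, j] -= 1
    L[j, i] -= 1

eigvals, eigvecs = np.linalg.eigh(L)  # eigenvalues in ascending order
fiedler = eigvecs[:, 1]               # eigenvector of the 2nd-smallest eigenvalue

# The sign split of the Fiedler vector separates the two triangles.
part = [int(i) for i in np.flatnonzero(fiedler < 0)]
```

The smallest eigenvalue is always 0 (with the all-ones eigenvector), and the second-smallest, the algebraic connectivity, is small exactly when the graph has a sparse cut, which is what makes the sign split meaningful here.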