• Corpus ID: 741979

Maximum Entropy Functions: Approximate Gacs-Korner for Distributed Compression

@article{Salamatian2016MaximumEF,
  title={Maximum Entropy Functions: Approximate Gacs-Korner for Distributed Compression},
  author={Salman Salamatian and Asaf Cohen and Muriel M{\'e}dard},
  journal={ArXiv},
  year={2016},
  volume={abs/1604.03877}
}
Consider two correlated sources X and Y generated from a joint distribution $p_{X,Y}$. Their Gács-Körner Common Information, a measure of common information that exploits the combinatorial structure of the distribution $p_{X,Y}$, leads to a source decomposition that exhibits the latent common parts in X and Y. Using this source decomposition we construct an efficient distributed compression scheme, which can also be used efficiently in the network setting. Then, we relax the combinatorial…
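
For reference, the Gács-Körner common information invoked above is typically defined as the largest entropy of a random variable that both observers can compute deterministically from their own source (a standard formulation, stated for context rather than quoted from the paper):

$$C_{\mathrm{GK}}(X;Y) \;=\; \max_{f,\,g\,:\,f(X) = g(Y)\ \text{a.s.}} H\big(f(X)\big)$$

The maximizing f and g are constant on the connected components of the bipartite graph that the support of $p_{X,Y}$ induces on the alphabets of X and Y; this is the combinatorial structure the abstract alludes to.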


Applications of Common Information to Computing Functions

We design a low-complexity distributed compression scheme for computing arbitrary functions of sources with discrete alphabets. We use a helper-based method that extends the definition of the Gács-Körner common information…

Efficient coding for multi-source networks using Gács-Körner common information

This work introduces a novel notion of separation of source and network coding using Gács-Körner Common Information (CI), where the sufficient condition for this separation to hold depends on the source structure rather than the network topology.

Gacs-Korner Common Information Variational Autoencoder

The proposed notion of common information is a variational relaxation of the Gács-Körner common information, which the authors recover as a special case, but is more amenable to optimization and can be approximated empirically using samples from the underlying distribution.

On the Limits of Distributed Agreement between Correlated Sources

  • Jonathan Ponniah
  • Mathematics
    2022 58th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
  • 2022
The work of Witsenhausen explores conditions under which two non-interactive users observing different coordinates of an i.i.d. random process can reach asymptotic agreement. Witsenhausen considers…

Unique Information and Secret Key Agreement

A surprising connection is revealed between third-order connected information, the two-way secret key agreement rate, and synergy; the use of a consistent partial information decomposition (PID) quantified using a secret key agreement rate naturally induces a directional interpretation of the PID.

Developing Non-Stochastic Privacy-Preserving Policies Using Agglomerative Clustering

  • Ni Ding, F. Farokhi
  • Computer Science
    IEEE Transactions on Information Forensics and Security
  • 2020
This work considers a non-stochastic privacy-preserving problem in which an adversary aims to infer sensitive information from publicly accessible data without using statistics, and proposes a corresponding agglomerative clustering algorithm that converges to a locally optimal quantization solution.
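
As a rough illustration of the agglomerative approach (a minimal generic sketch: the merge cost below, the probability mass of the merged cell, is a placeholder and not the paper's privacy objective):

from itertools import combinations

# Agglomerative quantization sketch. Each cell is a frozenset of source
# symbols; p maps symbols to probabilities. Greedily merge the pair of
# cells whose merged cell carries the least probability, until the
# desired number of quantization cells remains.
def agglomerate(p, num_cells):
    cells = [frozenset([x]) for x in p]
    while len(cells) > num_cells:
        a, b = min(combinations(cells, 2),
                   key=lambda ab: sum(p[s] for s in ab[0] | ab[1]))
        cells.remove(a)
        cells.remove(b)
        cells.append(a | b)
    return cells

p = {'w': 0.4, 'x': 0.3, 'y': 0.2, 'z': 0.1}
print(agglomerate(p, 2))  # two cells: {'w'} and {'x', 'y', 'z'}

Like the paper's algorithm, this greedy merging can only converge to a locally optimal quantization; the difference lies in the cost being minimized.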

References

Showing 1–10 of 12 references.

Efficient coding for multi-source networks using Gács-Körner common information

This work introduces a novel notion of separation of source and network coding using Gács-Körner Common Information (CI), where the sufficient condition for this separation to hold depends on the source structure rather than the network topology.

Noiseless coding of correlated information sources

The minimum numbers of bits per character, R_X and R_Y, needed to encode correlated source sequences so that they can be faithfully reproduced are determined under a variety of assumptions regarding the encoders and decoders.
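
The resulting achievable rate region is the classical Slepian-Wolf region, stated here for reference:

$$R_X \ge H(X \mid Y), \qquad R_Y \ge H(Y \mid X), \qquad R_X + R_Y \ge H(X, Y)$$

so the sum rate can be as low as the joint entropy H(X, Y), just as if the two sources were encoded jointly.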

On zero-error source coding with decoder side information

The minimum zero-error asymptotic rate of transmission is shown to be the complementary graph entropy of an associated graph, and upper and lower bounds for this minimum rate are provided.

Assisted Common Information With an Application to Secure Two-Party Sampling

The region of tension developed in this paper measures how well the dependence between a pair of random variables can be resolved by a piece of common information.

An exploration of the role of principal inertia components in information theory

It is shown that, under certain symmetry conditions, the principal inertia components play an important role in estimating one-bit functions f(X) of X given an observation of Y, which naturally leads to the conjecture that the mutual information between f(X) and Y is maximized when all the principal inertia components have equal value.
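
One standard definition of the principal inertia components (assumed here for context): form the matrix Q with entries $Q(x, y) = p_{X,Y}(x, y) / \sqrt{p_X(x)\, p_Y(y)}$, whose largest singular value is 1; the principal inertia components are the squares of the remaining singular values,

$$\lambda_i = \sigma_i^2, \qquad 1 = \sigma_0 \ge \sigma_1 \ge \sigma_2 \ge \cdots,$$

and $\sigma_1$ coincides with the Hirschfeld-Gebelein-Rényi maximal correlation of (X, Y).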

The zero-error side information problem and chromatic numbers (Corresp.)

A discrete random variable X is to be transmitted by means of a discrete signal so that the probability of error is exactly zero; the problem is to minimize the signal's alphabet size.
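
In outline, the standard formulation of this result: build a confusability graph G on the alphabet of X, joining x and x' whenever some side-information value y has $p(x, y) > 0$ and $p(x', y) > 0$. The minimum one-shot signal alphabet size is the chromatic number $\chi(G)$, and for length-n blocks the relevant quantity is the chromatic number of the n-fold AND-power of G, giving the asymptotic zero-error rate

$$R_{\min} \;=\; \lim_{n \to \infty} \frac{1}{n} \log \chi\big(G^{\wedge n}\big).$$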

Separating distributed source coding from network coding

This correspondence considers the problem of distributed source coding of multiple sources over a network with multiple receivers and shows that the problem with two sources and two receivers is always separable.

On Sequences of Pairs of Dependent Random Variables

The generalized random variables $(x, y)$ have a given joint distribution. Pairs $(x_i, y_i)$ are drawn independently. The observer of $(x_1, \cdots, x_n)$ and the observer of $(y_1, \cdots, y_n)$ …

Nonlinear Fractional Programming

In this chapter we deal with the following nonlinear fractional programming problem: $$P:\;\max_{x \in S} \; q(x) = \frac{f(x) + \alpha}{g(x) + \beta}$$ where $f, g: \mathbb{R}^n \to \mathbb{R}$ and $\alpha, \beta \in \mathbb{R}$ …
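
A standard way to attack such problems is Dinkelbach's parametric method: repeatedly solve $\max_{x \in S} f(x) + \alpha - \lambda\,(g(x) + \beta)$ and update $\lambda$ to the best ratio found; at the optimum the parametric value reaches zero. Below is a minimal sketch on a toy one-dimensional instance (the objective, the grid S, and the brute-force inner solver are illustrative assumptions; $g(x) + \beta$ is assumed positive on S):

# Dinkelbach's method for max_{x in S} (f(x) + alpha) / (g(x) + beta).
def dinkelbach(f, g, alpha, beta, S, tol=1e-9, max_iter=100):
    lam = 0.0
    for _ in range(max_iter):
        # Inner parametric problem, solved by brute force over the finite set S.
        x = max(S, key=lambda t: f(t) + alpha - lam * (g(t) + beta))
        value = f(x) + alpha - lam * (g(x) + beta)  # F(lam) at the maximizer
        lam = (f(x) + alpha) / (g(x) + beta)        # update lam to the best ratio
        if abs(value) < tol:                        # F(lam) = 0 at the optimum
            break
    return x, lam

# Toy instance: maximize (x + 1) / (x^2 + 2) over a grid; the optimum is
# at x = sqrt(3) - 1, about 0.732, with ratio about 0.683.
S = [i / 100 for i in range(-300, 301)]
print(dinkelbach(lambda x: x, lambda x: x * x, alpha=1.0, beta=2.0, S=S))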

Spectral Graph Theory and its Applications

  • D. Spielman
  • Mathematics
    48th Annual IEEE Symposium on Foundations of Computer Science (FOCS'07)
  • 2007
This tutorial will try to provide some intuition as to why these eigenvectors and eigenvalues have combinatorial significance, and will survey some of their applications.
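
As a small, concrete instance of the objects the tutorial discusses (an assumed toy example, not taken from the tutorial itself):

import numpy as np

# Combinatorial Laplacian L = D - A of a path on four vertices. The
# second-smallest eigenvalue (the algebraic connectivity) is positive
# exactly when the graph is connected, and its eigenvector (the Fiedler
# vector) varies monotonically along the path, a simple example of the
# combinatorial meaning carried by Laplacian eigenvectors.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
eigvals, eigvecs = np.linalg.eigh(L)
print(eigvals)        # approximately [0, 0.586, 2, 3.414]
print(eigvecs[:, 1])  # Fiedler vector: monotone along the path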