Wyner-Ziv theory for a general function of the correlated sources

@article{Yamamoto1982WynerZivTF,
  title={Wyner-Ziv theory for a general function of the correlated sources},
  author={Hirosuke Yamamoto},
  journal={IEEE Trans. Inf. Theory},
  year={1982},
  volume={28},
  pages={803-807}
}
A source coding problem is considered for a Wyner-Ziv type system in which the decoder is required to estimate the value of some function of the encoder input and the side information. The rate-distortion function is established for this system, and for some binary cases parametric expressions are obtained that enable numerical calculation.
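
Alongside the abstract, it may help to record the single-letter form in which this characterization is usually stated (a sketch in standard notation, not quoted from the paper: U is an auxiliary random variable, g a decoding function, and F the function the decoder must estimate):

R(D) = min I(X;U|Y),

where the minimum is over conditional distributions p(u|x), so that U - X - Y forms a Markov chain, and over functions g(u,y) satisfying E[ d( F(X,Y), g(U,Y) ) ] ≤ D.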

Computing a function of correlated sources: A rate region

A receiver wants to compute a function f of two correlated sources X and Y and side information Z. What is the minimum number of bits that needs to be communicated by each transmitter?

An achievable rate region for distributed source coding with reconstruction of an arbitrary function of the sources

A new rate region is presented for a general framework of distributed source coding in which the decoder is interested in lossy reconstruction of an arbitrary function of the sources.

Lossy Computing with Side Information via Multi-Hypergraphs

TLDR
It is shown that the rate-distortion function can be characterized through a characteristic multi-hypergraph, which simplifies its evaluation.

Distributed Joint Source-Channel Coding for Functions over a Multiple Access Channel

  • R. Rajesh, V. Sharma
  • Computer Science
    GLOBECOM 2009 - 2009 IEEE Global Telecommunications Conference
  • 2009
TLDR
The conditions obtained can be shown to be a generalized version of Yamamoto's result [28], and efficient joint source-channel coding schemes are obtained for transmission of discrete and continuous alphabet sources so as to recover the function values.

Algebraic structures for multi-terminal communication systems

TLDR
A coding scheme involving nested lattice codes that reconstructs the linear function by encoding in such a fashion that the decoder is able to reconstruct the function directly, and presents a new achievable rate-distortion region for this problem based on “good” structured nested random codes built over abelian groups.
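
The nested-code idea can be illustrated with a one-dimensional toy analogue (everything below, the step size, nesting ratio, and source model, is an assumed example rather than the paper's construction): both encoders quantize to the same fine lattice and transmit only coset indices modulo a shared coarse lattice, and the decoder recovers the quantized linear function modulo the coarse lattice.

    import numpy as np

    step, modulus = 0.5, 16   # fine-lattice step and coarse/fine nesting ratio (toy values)
    rng = np.random.default_rng(1)
    x = rng.normal(size=4)
    y = x + 0.1 * rng.normal(size=4)      # strongly correlated second source

    qx = np.round(x / step).astype(int)   # fine-lattice quantization at encoder 1
    qy = np.round(y / step).astype(int)   # fine-lattice quantization at encoder 2
    tx, ty = qx % modulus, qy % modulus   # each encoder sends only a coset index

    # The decoder recovers the quantized difference modulo the coarse lattice and
    # re-centers it; this is exact whenever |qx - qy| < modulus / 2, i.e. whenever
    # the correlation confines the difference to a single coarse cell.
    d = (tx - ty) % modulus
    d = np.where(d >= modulus // 2, d - modulus, d)
    assert np.array_equal(d, qx - qy)
    print(d * step)                       # estimate of the linear function x - y

The rate saving comes from sending log2(modulus) bits per sample rather than a full description of each source, mirroring how the structured codes in the paper exploit the match between the code's algebraic structure and the function to be computed.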

Source Coding with Side Information

TLDR
A setting is studied in which the encoder is potentially uncertain about the delay with which measurements of the side information are acquired at the decoder, and a single-letter characterization of the rate-distortion region is given.

Networked Source Coding with Covariance Distortion Constraints

TLDR
It is shown that one can design the distortion matrices at the nodes so as to maximize the output SNR at the fusion center, thereby bridging enhancement and source coding within this setup.

Source Coding in Networks With Covariance Distortion Constraints

TLDR
A notion of minimum for two positive-definite matrices is defined, based on which an explicit formula for the rate-distortion function is derived, and it is shown that one can design the distortion matrices at the nodes so as to maximize the output SNR at the fusion center.

Functional Source Coding for Networks with Receiver Side Information

TLDR
The functional rate-distortion function describes the optimal trade-off between rate and distortion in the given coding framework and is generalized to networks with noise-corrupted observations of the source and side information.

Lossy Computing of Correlated Sources with Fractional Sampling

TLDR
This paper considers the problem of lossy compression for the computation of a function of two correlated sources, both of which are observed at the encoder, and shows that the optimal measurement overlap fraction depends on the function to be computed by the decoder, on the source statistics (including the correlation), and on the link rate.
...

References

Rate-distortion for correlated sources with partially separated encoders

TLDR
A general coding theorem is proved which establishes that a certain region, defined in terms of "single-letter" information-theoretic quantities, is an inner bound to the region of all attainable vectors of rates and distortions.

How to encode the modulo-two sum of binary sources (Corresp.)

TLDR
If the sequences are the outputs of two correlated memoryless binary sources, then in some cases the rate of this information may be substantially less than the joint entropy of the two sources.
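
The structural fact behind the Körner-Marton scheme in this paper is that both encoders can use the same binary linear code and send only syndromes, because syndrome computation is linear over GF(2). A minimal numerical sketch of that identity, with a toy parity-check matrix H chosen purely for illustration (a real scheme would use a long, good code):

    import numpy as np

    # Toy parity-check matrix of a binary linear code shared by both encoders.
    H = np.array([[1, 0, 1, 1, 0],
                  [0, 1, 1, 0, 1]], dtype=np.uint8)

    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, size=5, dtype=np.uint8)   # output of source X
    z = (rng.random(5) < 0.1).astype(np.uint8)       # sparse mod-two difference
    y = x ^ z                                        # correlated source Y

    s_x = H @ x % 2   # syndrome sent by encoder 1
    s_y = H @ y % 2   # syndrome sent by encoder 2

    # Linearity over GF(2): the XOR of the two syndromes equals the syndrome of
    # Z = X xor Y, from which the decoder recovers Z by channel decoding.
    assert np.array_equal(s_x ^ s_y, H @ z % 2)

With a code that is good for the binary "channel" whose noise is Z, each encoder's rate is about H(X ⊕ Y) bits per symbol, which can be far below the joint entropy, as the summary above notes.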

To get a bit of information may be as hard as to get full information

TLDR
The following coding problem for correlated discrete memoryless sources is considered, and it is proven that the required rates are often as large as those needed for a full reproduction of the outputs of both sources.

A survey of multi-way channels in information theory: 1961-1976

  • E. van der Meulen
  • Computer Science
    IEEE Trans. Inf. Theory
  • 1977
TLDR
The advances in the area of multi-way channels during the period 1961-1976 are described, and Shannon's two-way channel, the multiple-access channel, and the interference channel are treated successively.

Noiseless coding of correlated information sources

TLDR
The minimum numbers of bits per character, R_X and R_Y, needed to encode these sequences so that they can be faithfully reproduced are determined under a variety of assumptions regarding the encoders and decoders.

The rate-distortion function for source coding with side information at the decoder

TLDR
The quantity R*(d) is determined, defined as the infimum of rates R such that communication is possible in the above setting at an average distortion level not exceeding d + ε.
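
For context, the well-known single-letter form of this quantity (stated here in standard notation as a reference point, not quoted from the listing) is

R*(d) = min [ I(X;U) - I(U;Y) ],

with the minimum over auxiliary random variables U such that U - X - Y forms a Markov chain and a decoding function g(u,y) exists with E[ d( X, g(U,Y) ) ] ≤ d; under the Markov constraint this equals min I(X;U|Y), the form specialized by Yamamoto's paper to a general function F(X,Y).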
