Corpus ID: 7150064

Subset Typicality Lemmas and Improved Achievable Regions in Multiterminal Source Coding

Kumar Viswanatha, Emrah Akyol, Kenneth M. Rose
Consider the following information-theoretic setup, wherein independent codebooks for N correlated random variables are generated according to their respective marginals. The problem of determining conditions on the codebook rates that ensure the existence of at least one codeword tuple that is jointly typical with respect to a given joint density (the multivariate covering lemma) has been well studied, and the associated rate regions have found applications in several source… 
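As a minimal sketch of the rate condition in question, the two-variable case of the covering lemma requires the sum rate to exceed the mutual information between the sources. The toy pmf below is hypothetical, chosen only for illustration:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy joint pmf of two correlated binary sources (illustrative numbers only).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X1 and X2.
p1 = [sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)]
p2 = [sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)]

H1, H2 = entropy(p1), entropy(p2)
H12 = entropy(joint.values())

# For N = 2 the covering condition reduces to the mutual covering lemma
# threshold: R1 + R2 > I(X1; X2) = H(X1) + H(X2) - H(X1, X2).
I12 = H1 + H2 - H12
print(f"Sum-rate must exceed I(X1;X2) = {I12:.4f} bits")
```

For N > 2 sources, analogous conditions constrain the sum rate of every subset of codebooks, which is the regime the paper's subset typicality lemmas address.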


A new achievable region for Gaussian multiple descriptions based on subset typicality

This paper addresses the L-channel multiple descriptions problem for a Gaussian source under mean squared error (MSE) distortion metric and derives a new encoding scheme and an associated rate-distortion region wherein joint typicality of codewords only within the prescribed subsets is maintained.

On a Generalised Typicality and Its Applications in Information Theory

This work shows that the conditional typicality lemma can be obtained for a generic notion of typicality, and defines a multivariate typicality for general alphabets and general probability measures on product spaces, based on relative entropy, which can serve as a measure of the relevance between multiple sources.

Broadcast caching networks with two receivers and multiple correlated sources

It is shown that for symmetric sources the two-step strategy achieves the lower bound for large cache capacities, and that for all other cache sizes it is within half of the joint entropy of two of the sources conditioned on the third.

Rate-Memory Trade-Off for Caching and Delivery of Correlated Sources

This paper studies the fundamental limits of content delivery in a cache-aided broadcast network for correlated content generated by a discrete memoryless source with arbitrary joint distribution, and proposes achievable correlation-aware schemes based on a two-step source coding approach.

Combinatorial Message Sharing for a refined multiple descriptions achievable region

An achievable rate-distortion region is derived for the proposed novel encoding technique involving ‘Combinatorial Message Sharing’, where every subset of the descriptions may share a distinct common message.

n-channel symmetric multiple descriptions - part II: An achievable rate-distortion region

In this Part II of a two-part paper, we present a new achievable rate-distortion region for the symmetric n-channel multiple-descriptions coding problem (n > 2), where the rate of every description is the same.

An achievable rate region for distributed source coding and dispersive information routing

This paper considers the optimum encoding scheme when every source can (possibly) communicate with every sink irrespective of what the sinks reconstruct and derives an achievable rate region and an associated achievable cost using principles from distributed source coding and multiple descriptions encoding.

On the Role of Encoder Side-Information in Source Coding for Multiple Decoders

The rate-distortion region of the Gaussian version of a problem previously solved by Kaspi for discrete memoryless sources is obtained and it is quantified how much revealing the side-information to the encoder helps in such a Gaussian setup.

Multiple description coding with many channels

An achievable region for the L-channel multiple description coding problem is presented and a new outer bound on the rate-distortion (RD) region for memoryless Gaussian sources with mean squared error distortion is derived.

Multi-user privacy: The Gray-Wyner system and generalized common information

The problem of preserving privacy when a multivariate source must be partially revealed to multiple users is modeled as a Gray-Wyner source coding problem with K correlated sources at the encoder.

Source coding for a simple network

This work considers the problem of source coding subject to a fidelity criterion for a simple network connecting a single source with two receivers via a common channel and two private channels, and develops several upper and lower bounds that yield a portion of the desired region.

Correlated source coding for fusion storage and selective retrieval

This work defines the problem of shared descriptions (SD) source coding and relates it to the storage and retrieval problem, and presents an achievable rate region for the SD problem and uses it to characterize the storage vs. retrieval tradeoff.

Achievable rates for multiple descriptions

These rates, expressed in terms of auxiliary random variables and Shannon mutual information, are shown to be optimal for deterministic distortion measures.

Elements of Information Theory

The authors examine the roles of entropy, inequality, and randomness in the design and construction of codes in a rapidly changing environment.