Joint Rate Distortion Function of a Tuple of Correlated Multivariate Gaussian Sources with Individual Fidelity Criteria

@article{Stylianou2021JointRD,
  title={Joint Rate Distortion Function of a Tuple of Correlated Multivariate Gaussian Sources with Individual Fidelity Criteria},
  author={Evagoras Stylianou and Charalambos D. Charalambous and Themistoklis Charalambous},
  journal={2021 IEEE International Symposium on Information Theory (ISIT)},
  year={2021},
  pages={2167-2172}
}
In this paper we analyze the joint rate distortion function (RDF) for a tuple of correlated sources taking values in abstract alphabet spaces (i.e., continuous) subject to two individual distortion criteria. First, we derive structural properties of the realizations of the reproduction Random Variables (RVs), which induce the corresponding optimal test channel distributions of the joint RDF. Second, we consider a tuple of correlated multivariate jointly Gaussian RVs, $X_{1}:\Omega\rightarrow…
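As a rough numerical companion to the abstract (not taken from the paper), the sketch below assumes square-error fidelity criteria $\mathbf{E}\|X_i-\hat{X}_i\|^2\le\Delta_i$, $i=1,2$, and uses the standard fact that a jointly Gaussian test channel achieves the Gaussian joint RDF, so the computation reduces to a log-determinant program over the error covariance; the helper `joint_rdf`, the block sizes, and the example covariance are illustrative only, and `cvxpy` is assumed to be available.

```python
# Minimal sketch (not from the paper): joint RDF of a jointly Gaussian tuple
# (X1, X2) under individual square-error fidelity criteria, computed as
#   R(Delta1, Delta2) = min 0.5 * log( det(Sigma_X) / det(Sigma_E) )
#   s.t. 0 <= Sigma_E <= Sigma_X (Loewner order),
#        trace of the X1 block of Sigma_E <= Delta1,
#        trace of the X2 block of Sigma_E <= Delta2,
# which relies on a jointly Gaussian test channel being optimal.
# Requires: pip install numpy cvxpy
import numpy as np
import cvxpy as cp


def joint_rdf(Sigma_X, p1, Delta1, Delta2):
    """Joint RDF (in nats) of X = (X1, X2), dim(X1) = p1, for positive
    definite Sigma_X and Delta1, Delta2 > 0."""
    n = Sigma_X.shape[0]
    Sigma_E = cp.Variable((n, n), symmetric=True)
    constraints = [
        Sigma_E >> 0,                            # valid error covariance
        Sigma_X - Sigma_E >> 0,                  # backward channel X = X_hat + E realizable
        cp.trace(Sigma_E[:p1, :p1]) <= Delta1,   # E||X1 - X1_hat||^2 <= Delta1
        cp.trace(Sigma_E[p1:, p1:]) <= Delta2,   # E||X2 - X2_hat||^2 <= Delta2
    ]
    # Maximizing log det(Sigma_E) minimizes the mutual information I(X; X_hat).
    problem = cp.Problem(cp.Maximize(cp.log_det(Sigma_E)), constraints)
    problem.solve()
    _, logdet_X = np.linalg.slogdet(Sigma_X)
    return 0.5 * (logdet_X - problem.value)


if __name__ == "__main__":
    # Illustrative covariance of two correlated 2-dimensional blocks X1 and X2.
    Sigma_X = np.array([[1.0, 0.5, 0.3, 0.1],
                        [0.5, 1.0, 0.2, 0.3],
                        [0.3, 0.2, 1.0, 0.4],
                        [0.1, 0.3, 0.4, 1.0]])
    print("R(0.4, 0.4) =", joint_rdf(Sigma_X, p1=2, Delta1=0.4, Delta2=0.4), "nats")
```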

Citations

Joint Nonanticipative Rate Distortion Function for a Tuple of Random Processes with Individual Fidelity Criteria
The joint nonanticipative rate distortion function (NRDF) for a tuple of random processes with individual fidelity criteria is considered. Structural properties of optimal test channel distributions
Characterization of the Gray-Wyner Rate Region for Multivariate Gaussian Sources: Optimality of Gaussian Auxiliary RV
The paper includes the characterization of the Pangloss plane of the Gray-Wyner rate region along with the characterizations of the corresponding rate distortion functions, their test-channel distributions, and structural properties of the realizations which induce these distributions.
A Rate Distortion Approach to Goal-Oriented Communication
A variant of a robust-description source coding framework motivated by goal-oriented semantic information transmission is studied, and a general result is proved that provides, in parametric form, the optimal solutions of this problem in its various cases.
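For orientation, the citing works above refer repeatedly to the Gray-Wyner rate region, its Pangloss plane, and Wyner's common information. The display below recalls these standard objects in the notation of the present paper; it is textbook background stated under the usual assumptions (an auxiliary RV $W$ and individual distortions $D_1, D_2$), not a quotation from any of the papers listed.

```latex
% Background definitions (stated for orientation, not quoted from the cited papers).
% Gray--Wyner achievable rate region at distortion pair (D_1, D_2), up to closure:
\begin{align*}
  \mathcal{R}_{\mathrm{GW}}(D_1,D_2) &= \bigcup_{W}
     \Big\{ (R_0,R_1,R_2) : R_0 \ge I(X_1,X_2;W),\;
            R_1 \ge R_{X_1|W}(D_1),\; R_2 \ge R_{X_2|W}(D_2) \Big\}, \\
  % Pangloss plane: rate triples with no sum-rate loss relative to joint coding:
  \text{Pangloss plane:}\quad & R_0 + R_1 + R_2 = R_{X_1,X_2}(D_1,D_2), \\
  % Wyner's common information: minimum common rate under the Markov
  % (conditional-independence) constraint X_1 -- W -- X_2:
  C(X_1;X_2) &= \min_{P_{W|X_1,X_2}:\, X_1 \leftrightarrow W \leftrightarrow X_2} I(X_1,X_2;W).
\end{align*}
```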

References

SHOWING 1-10 OF 12 REFERENCES
Characterization of Conditional Independence and Weak Realizations of Multivariate Gaussian Random Variables: Applications to Networks
The Gray and Wyner source coding problem for joint decoding with mean-square error distortions is considered; the methods are of fundamental importance to other problems of multi-user communication where conditional independence is imposed as a constraint.
A New Approach to Lossy Network Compression of a Tuple of Correlated Multivariate Gaussian RVs
The classical Gray and Wyner source coding for a simple network for sources that generate a tuple of multivariate, correlated Gaussian random variables $(Y_1,Y_2)$ is re-examined using the geometric
The Lossy Common Information of Correlated Sources
This paper derives a single-letter characterization of the lossy generalization of Wyner's CI, defined as the minimum rate on the shared branch of the Gray-Wyner network that maintains the minimum sum transmit rate when the two decoders reconstruct the sources subject to individual distortion constraints, and shows that this tradeoff provides a unified framework to understand the different notions of CI.
The common information of two dependent random variables
  • A. Wyner
  • Mathematics, Computer Science
    IEEE Trans. Inf. Theory
  • 1975
The main result of the paper is contained in two theorems which show that $C(X;Y)$ is (i) the minimum $R_0$ such that a sequence of independent copies of $(X,Y)$ can be efficiently encoded into three binary streams $W_0, W_1, W_2$ with rates $R_0, R_1, R_2$.
Sending a Bivariate Gaussian Over a Gaussian MAC
The power-versus-distortion tradeoff for the distributed transmission of a memoryless bivariate Gaussian source over a two-to-one average-power limited Gaussian multiple-access channel is studied, and an uncoded transmission scheme is introduced that is asymptotically optimal as the SNR tends to infinity.
A new class of lower bounds to information rates of stationary sources via conditional rate-distortion functions
  • R. Gray
  • Computer Science
    IEEE Trans. Inf. Theory
  • 1973
A new class of lower bounds to rate-distortion functions of stationary processes with memory and single-letter vector-valued distortion measures is derived. This class is shown to include or imply
A Lossy Source Coding Interpretation of Wyner’s Common Information
It is established that, under suitable conditions, Wyner's common information equals the smallest common message rate when the total rate is arbitrarily close to the rate distortion function with joint decoding for the Gray-Wyner network.
Structural Properties of Nonanticipatory Epsilon Entropy of Multivariate Gaussian Sources
The complete characterization of the Gorbunov and Pinsker nonanticipatory epsilon entropy of multivariate Gauss-Markov sources with square-error fidelity is derived, and it is shown that the matrices of the stochastic realization of the optimal test channel, or reproduction distribution, admit spectral representations with respect to the same unitary matrices.
Source coding for a simple network
This work considers the problem of source coding subject to a fidelity criterion for a simple network connecting a single source with two receivers via a common channel and two private channels, and develops several upper and lower bounds that actually yield a portion of the desired region.
Generalizations of Nonanticipative Rate Distortion Function to Multivariate Nonstationary Gaussian Autoregressive Processes
It is shown that the optimal reproduction distributions are induced by a reproduction process, which is a linear function of the state of the source, its best mean-square error estimate, and a Gaussian random process.
...