We study error exponents for source coding with side information. Both achievable exponents and converse bounds are obtained for the following two cases: lossless source coding with coded side information (SCCSI) and lossy source coding with full side information (Wyner-Ziv). These results recover and extend several existing results on source-coding error …
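For context, the lossy case referenced above is the classical Wyner-Ziv setting; a standard statement of its rate-distortion function (textbook background, not a contribution of this work) is

\[
R_{\mathrm{WZ}}(D) \;=\; \min_{\substack{P_{U \mid X}\,:\; U - X - Y \\ \exists\, g\,:\; \mathbb{E}\,[\,d(X,\, g(U,Y))\,] \,\le\, D}} I(X; U \mid Y),
\]

where Y is the side information available at the decoder, U is an auxiliary random variable, and g is the decoder's reconstruction function. The error exponents studied here quantify how fast the failure probability decays at rates above such limits.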
Given training sequences generated by two distinct but unknown distributions sharing a common alphabet, we seek a classifier that can correctly decide whether a third test sequence is generated by the first or second distribution using only the training data. To model 'limited learning' we allow the alphabet size to grow, and therefore probability …
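As a concrete illustration only, here is a minimal likelihood-based baseline in Python. It is a sketch under the assumption of a simple add-one-smoothed empirical model; it is not the classifier analyzed in this work, which is designed for the growing-alphabet regime where such plug-in estimates become unreliable.

```python
# Baseline sketch (illustrative assumption, not the paper's classifier):
# score the test sequence under each training sequence's smoothed
# empirical distribution and pick the better fit.
from collections import Counter
import math

def log_likelihood(test, train, alphabet):
    """Log-probability of `test` under the add-one-smoothed
    empirical distribution of `train` over `alphabet`."""
    counts = Counter(train)
    total = len(train) + len(alphabet)  # Laplace smoothing: +1 per symbol
    return sum(math.log((counts[s] + 1) / total) for s in test)

def classify(test, train1, train2):
    """Return 1 if `test` fits train1's empirical law at least as well, else 2."""
    alphabet = set(train1) | set(train2) | set(test)
    ll1 = log_likelihood(test, train1, alphabet)
    ll2 = log_likelihood(test, train2, alphabet)
    return 1 if ll1 >= ll2 else 2

# Toy usage: train1 is biased toward 'a', train2 toward 'b'.
train1 = list("aaabaaabaaab")
train2 = list("bbbabbbabbba")
print(classify(list("aabaaa"), train1, train2))  # expected: 1
print(classify(list("bbabbb"), train1, train2))  # expected: 2
```

The weakness this line of work targets is visible in the sketch: when the alphabet grows with the sequence length, the per-symbol counts stay small and the smoothed estimates no longer concentrate.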
We provide a novel upper bound on Witsenhausen's rate, the rate required in the zero-error analogue of the Slepian-Wolf problem; our bound is given in terms of a new information-theoretic functional defined on a certain graph. We then use the functional to give a single-letter lower bound on the error exponent for the Slepian-Wolf problem under the …
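For background, the standard definitions involved (known results, not this paper's new functional): let G be the confusability graph on the support of X, with x adjacent to x' when some side-information value y has positive joint probability with both. Witsenhausen's rate is

\[
R(G) \;=\; \lim_{n \to \infty} \frac{1}{n} \log_2 \chi\!\left(G^{\wedge n}\right),
\]

where G^{\wedge n} is the n-fold AND product of G and \chi denotes the chromatic number. In the vanishing-error Slepian-Wolf setting the optimal rate drops to H(X \mid Y), which R(G) can strictly exceed; the paper's functional upper-bounds the former quantity and yields a single-letter lower bound on the error exponent of the latter.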
Given training sequences generated by two distinct but unknown distributions sharing a common alphabet, we study the problem of determining whether a third test sequence was generated according to the first or second distribution using only the training data. To better model sources such as natural language, for which the underlying distributions are …
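For comparison with the fixed-alphabet theory, a classical baseline is a Gutman-style generalized likelihood test (stated here as standard background; the growing-alphabet classifier studied in this line of work differs). With the length-n test sequence z of empirical distribution \hat{P}_z and length-N training sequences x_i of empirical distributions \hat{P}_{x_i}, it computes

\[
S_i \;=\; (n+N)\, H\!\left(\frac{n\,\hat{P}_z + N\,\hat{P}_{x_i}}{n+N}\right) \;-\; n\, H(\hat{P}_z) \;-\; N\, H(\hat{P}_{x_i}), \qquad i \in \{1, 2\},
\]

and declares the test sequence to come from the distribution with the smaller S_i. Each S_i is a weighted Jensen-Shannon divergence between the test and training types, so it vanishes exactly when the two empirical distributions coincide.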