An overview of statistical and information-theoretic aspects of hidden Markov processes (HMPs) is presented. An HMP is a discrete-time finite-state homogeneous Markov chain observed through a discrete-time memoryless invariant channel. In recent years, the work of Baum and Petrie on finite-state finite-alphabet HMPs was expanded to HMPs with finite as well …
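As an illustrative aside (not taken from the paper itself), the likelihood of a sequence observed from such an HMP can be computed with the standard forward recursion; all parameter values below are made-up examples:

```python
import numpy as np

def hmp_likelihood(A, B, pi, obs):
    """Likelihood of an observation sequence under a hidden Markov process.

    A[i, j]: transition probability from state i to state j
    B[i, k]: probability of observing symbol k from state i (memoryless channel)
    pi[i]:   initial state distribution
    obs:     sequence of observed symbol indices
    """
    alpha = pi * B[:, obs[0]]            # forward variables at time 0
    for y in obs[1:]:
        alpha = (alpha @ A) * B[:, y]    # predict through the chain, then update
    return alpha.sum()

# Two-state chain observed through a binary symmetric channel (example values)
A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.95, 0.05], [0.05, 0.95]])   # crossover probability 0.05
pi = np.array([0.5, 0.5])
p = hmp_likelihood(A, B, pi, [0, 0, 1, 1])
```

The recursion runs in time linear in the sequence length, which is what makes likelihood computations for HMPs practical.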
Achievable rates are characterized for successive refinement (SR) in the Wyner-Ziv scenario, namely, in the presence of correlated side information (SI) at the receivers. In this setting, the encoder is assumed to operate in two stages, where the first corresponds to relatively low rate and high distortion, and the second, comprising a refinement code on …
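To make the two-stage idea concrete (ignoring the side-information aspect, which this toy does not model), a scalar successive-refinement quantizer can be sketched as follows; the step sizes are arbitrary example values:

```python
def two_stage_quantize(x, coarse_step=1.0, refine_step=0.25):
    """Two-stage (successive refinement) scalar quantization: a coarse
    index is sent first; a refinement index then narrows down the residual."""
    coarse = round(x / coarse_step)            # stage 1: low rate, high distortion
    residual = x - coarse * coarse_step
    refine = round(residual / refine_step)     # stage 2: refine the stage-1 residual
    stage1_rec = coarse * coarse_step
    stage2_rec = stage1_rec + refine * refine_step
    return stage1_rec, stage2_rec

r1, r2 = two_stage_quantize(1.37)
```

A receiver that decodes only the first stage gets the coarse reconstruction; one that also receives the refinement bits gets the finer one.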
Reliable transmission over a discrete-time memoryless channel with a decoding metric that is not necessarily matched to the channel (mismatched decoding) is considered. It is assumed that the encoder knows both the true channel and the decoding metric. The lower bound on the highest achievable rate found by Csiszár and Körner and by Hui for DMCs, hereafter …
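A minimal sketch of what a fixed decoding metric means in practice (not from the paper; codebook, channel law, and observation are made-up examples): the decoder maximizes an additive metric sum over symbols, whether or not that metric equals the true channel's log-likelihood:

```python
import math

def metric_decode(y, codebook, q):
    """Decode y to the codeword maximizing the additive decoding metric
    sum_i log q(y_i | x_i); q need not equal the true channel law (mismatch)."""
    return max(codebook, key=lambda x: sum(math.log(q[a][b]) for a, b in zip(x, y)))

# Metric derived from a BSC with crossover 0.1 (example values)
q = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.1, 1: 0.9}}
codebook = [(0, 0, 0, 0), (1, 1, 1, 1)]
xhat = metric_decode((1, 1, 0, 1), codebook, q)
```

When q matches the true channel this is maximum-likelihood decoding; under mismatch the same rule is used with the "wrong" q, which is exactly the regime the bounds above address.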
The problem of predicting the next outcome of an individual binary sequence using finite memory is considered. The finite-state predictability of an infinite sequence is defined as the minimum fraction of prediction errors that can be made by any finite-state (FS) predictor. It is proved that this FS predictability can be attained by universal …
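For intuition only (this is a naive baseline, not the universal scheme the paper analyzes), a simple order-k Markov predictor already attains zero asymptotic error rate on periodic sequences; the sequence below is a made-up example:

```python
from collections import defaultdict

def fs_predict_errors(bits, k=2):
    """Fraction of prediction errors of a simple order-k Markov predictor.

    For each length-k context, predict the bit that has followed that
    context most often so far (ties and unseen contexts predict 0).
    """
    counts = defaultdict(lambda: [0, 0])   # context -> [count of 0s, count of 1s]
    errors = 0
    for t in range(k, len(bits)):
        ctx = tuple(bits[t - k:t])
        c0, c1 = counts[ctx]
        pred = 1 if c1 > c0 else 0
        if pred != bits[t]:
            errors += 1
        counts[ctx][bits[t]] += 1
    return errors / max(1, len(bits) - k)

seq = [0, 1] * 500                         # perfectly periodic individual sequence
err = fs_predict_errors(seq, k=1)
```

After a single error per context, the predictor locks onto the period; universal predictors achieve the FS predictability without fixing k in advance.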
This paper presents an overview of universal prediction from an information-theoretic perspective. Special attention is given to the notion of probability assignment under the self-information loss function, which is directly related to the theory of universal data compression. Both the probabilistic setting and the deterministic setting of the universal …
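The connection between probability assignment and data compression can be illustrated (as a sketch, not the paper's development) with the classical Krichevsky-Trofimov sequential assignment, whose cumulative self-information loss is the code length an arithmetic coder would spend:

```python
import math

def kt_log_loss(bits):
    """Cumulative self-information (log) loss of the Krichevsky-Trofimov
    assignment P(next = 1) = (n1 + 1/2) / (n0 + n1 + 1), in bits."""
    n0 = n1 = 0
    loss = 0.0
    for b in bits:
        p1 = (n1 + 0.5) / (n0 + n1 + 1.0)
        loss += -math.log2(p1 if b == 1 else 1.0 - p1)
        if b == 1:
            n1 += 1
        else:
            n0 += 1
    return loss

loss = kt_log_loss([1] * 64)   # a highly compressible all-ones sequence
```

On the all-ones sequence of length n the loss stays within roughly (1/2) log2 n + 1 bits of the ideal zero, the redundancy behavior universal schemes are judged by.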
We consider the problem of estimating, in the presence of model uncertainties, a random vector x that is observed through a linear transformation H and corrupted by additive noise. We first assume that both the covariance matrix of x and the transformation H are not completely specified and develop the linear estimator that minimizes the worst-case …
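As a point of reference (the nominal, fully specified case, not the worst-case estimator the paper develops), the classical linear MMSE estimator for y = Hx + n with known covariances is; the matrices below are example values:

```python
import numpy as np

def linear_mmse(Cx, H, Cn, y):
    """Linear MMSE estimate of x from y = H x + n, with known covariance Cx
    of x and Cn of the noise; x and n are zero mean and uncorrelated."""
    G = Cx @ H.T @ np.linalg.inv(H @ Cx @ H.T + Cn)   # Wiener gain matrix
    return G @ y

Cx = np.eye(2)
H = np.array([[1.0, 0.0], [0.0, 2.0]])
Cn = 0.1 * np.eye(2)
xhat = linear_mmse(Cx, H, Cn, np.array([1.0, 2.0]))
```

Robust formulations replace the fixed Cx and H with uncertainty sets and minimize the worst-case mean-squared error over those sets.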
We investigate the problem of guessing a random vector X within distortion level D. Our aim is to characterize the best attainable performance in the sense of minimizing, in some probabilistic sense, the number of required guesses G(X) until the error falls below D. The underlying motivation is that G(X) is the number of candidate codewords to be examined …
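In the distortionless special case (D = 0, a simplification of the paper's setting), moments of G(X) are minimized by guessing values in decreasing order of probability; a small sketch with an example distribution:

```python
def guessing_moment(probs, rho=1.0):
    """rho-th moment of the number of guesses when guessing the values of X
    in decreasing order of probability (the order minimizing E[G^rho])."""
    ordered = sorted(probs, reverse=True)
    return sum((g ** rho) * p for g, p in enumerate(ordered, start=1))

# Uniform X over 4 values: E[G] = (1 + 2 + 3 + 4) / 4 = 2.5
eg = guessing_moment([0.25] * 4)
```

With distortion D > 0, a guess succeeds once it lands within distortion D of X, so fewer guesses are needed; the uniform case above is the hardest for a fixed alphabet size.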
[4] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems. New York: Academic, 1981. [5] R. G. Gallager, Information Theory and Reliable Communication. New York: Wiley, 1968. [6] A. J. Viterbi and J. K. Omura, Principles of Digital Communication and Coding. New York: McGraw-Hill, 1979. [7] W. H. R. Equitz and T. M. …
The Shannon theory of cipher systems is combined with recent work on guessing values of random variables. The security of encryption systems is measured in terms of moments of the number of guesses needed for the wiretapper to uncover the plaintext given the cryptogram. While the encrypter aims at maximizing the guessing effort, the wiretapper strives to …
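A toy model of this guessing measure of security (my own simplification, not the paper's cipher model): if a k-bit one-time pad covers k bits of the plaintext and the rest is sent in the clear, the wiretapper's posterior given the cryptogram is uniform over 2**k candidates, so the expected guessing effort grows exponentially in the key length:

```python
def wiretapper_effort(k):
    """Expected number of guesses for a wiretapper facing a posterior that is
    uniform over 2**k plaintext candidates (optimal order: any fixed order)."""
    m = 2 ** k
    return (m + 1) / 2   # mean of a uniform guess position over m candidates

e0, e4 = wiretapper_effort(0), wiretapper_effort(4)
```

With no key (k = 0) a single guess suffices; each added key bit roughly doubles the effort, which is the sense in which key rate buys guessing security.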