
An overview of statistical and information-theoretic aspects of hidden Markov processes (HMPs) is presented. An HMP is a discrete-time finite-state homogeneous Markov chain observed through a discrete-time memoryless invariant channel. In recent years, the work of Baum and Petrie on finite-state finite-alphabet HMPs was expanded to HMPs with finite as well…
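The definition above (a hidden Markov chain observed through a memoryless channel) can be sketched concretely; the two-state chain, transition probabilities, and binary symmetric channel below are illustrative choices, not taken from the paper:

```python
import random

random.seed(0)

# Illustrative two-state HMP: a homogeneous Markov chain on {0, 1}
# observed through a binary symmetric channel (a memoryless invariant
# channel). All numerical parameters here are made up for the sketch.
P = [[0.9, 0.1],   # P[state][next_state]: transition probabilities
     [0.2, 0.8]]
flip = 0.05        # channel crossover probability

def sample_hmp(n, state=0):
    """Return n noisy observations of the hidden chain."""
    obs = []
    for _ in range(n):
        state = 0 if random.random() < P[state][0] else 1
        noisy = state ^ (random.random() < flip)  # channel may flip the symbol
        obs.append(int(noisy))
    return obs

print(sample_hmp(10))
```

The observer sees only the channel output, never the hidden state, which is what makes inference about HMPs nontrivial.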

The problem of predicting the next outcome of an individual binary sequence using finite memory is considered. The finite-state predictability of an infinite sequence is defined as the minimum fraction of prediction errors that can be made by any finite-state (FS) predictor. It is proved that this FS predictability can be attained by universal…
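The "fraction of prediction errors" made by a finite-state predictor is easy to illustrate with the simplest possible machine; the one-state predictor below (always guess the previous bit) is a toy example, not one of the paper's universal schemes:

```python
def error_fraction(seq):
    """Fraction of prediction errors made by a one-state FS predictor
    that always predicts the next bit equals the previous bit.
    Universal schemes aim to match the best FS machine of any size."""
    errors = sum(a != b for a, b in zip(seq, seq[1:]))
    return errors / (len(seq) - 1)

print(error_fraction([0, 0, 0, 1, 1, 1, 0, 0]))  # 2 errors in 7 predictions
```

On this sequence the predictor errs only at the two transitions, giving 2/7; the FS predictability of a sequence is the infimum of such fractions over all finite-state machines.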

This paper presents an overview of universal prediction from an information-theoretic perspective. Special attention is given to the notion of probability assignment under the self-information loss function, which is directly related to the theory of universal data compression. Both the probabilistic setting and the deterministic setting of the universal…
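Probability assignment under self-information loss can be made concrete with the Krichevsky–Trofimov sequential estimator, one classic universal scheme of the kind such overviews survey (the choice of this particular estimator is mine, not the abstract's):

```python
import math

def kt_log_loss(seq):
    """Cumulative self-information (log) loss, in bits, of the
    Krichevsky-Trofimov sequential probability assignment
    p(bit) = (count(bit) + 1/2) / (n + 1) on a binary sequence."""
    counts = [0, 0]
    loss = 0.0
    for bit in seq:
        p = (counts[bit] + 0.5) / (counts[0] + counts[1] + 1.0)
        loss += -math.log2(p)
        counts[bit] += 1
    return loss

print(kt_log_loss([0, 0, 0, 0]))  # well below the 4 bits of a uniform assignment
```

The cumulative log loss is exactly the codelength of the corresponding arithmetic code, which is the link to universal data compression mentioned in the abstract.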

Reliable transmission over a discrete-time memoryless channel with a decoding metric that is not necessarily matched to the channel (mismatched decoding) is considered. It is assumed that the encoder knows both the true channel and the decoding metric. The lower bound on the highest achievable rate found by Csiszár and Körner and by Hui for DMCs, hereafter…

The problem of optimal sequential decision for individual sequences, relative to a class of competing off-line reference strategies, is studied for general loss functions with memory. This problem is motivated by applications in which actions may have "long term" effects, or there is a cost for switching from one action to another. As a first step, we consider…

We consider the problem of estimating, in the presence of model uncertainties, a random vector x that is observed through a linear transformation H and corrupted by additive noise. We first assume that both the covariance matrix of x and the transformation H are not completely specified and develop the linear estimator that minimizes the worst-case…
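The nominal model the paper starts from, y = Hx + noise with known covariances, admits the standard linear MMSE estimator; the sketch below shows only that baseline (with made-up H and covariances), not the paper's minimax estimator for the uncertain case:

```python
import numpy as np

rng = np.random.default_rng(1)

# Nominal linear observation model y = H x + w. The matrices below are
# assumed for illustration; the paper's contribution is the worst-case
# (minimax) estimator when Cx and H are only partially specified.
H = np.array([[1.0, 0.5],
              [0.0, 1.0],
              [1.0, 1.0]])
Cx = np.eye(2)           # prior covariance of x (assumed)
Cw = 0.1 * np.eye(3)     # noise covariance (assumed)

# Classical LMMSE gain: W = Cx H^T (H Cx H^T + Cw)^{-1}
W = Cx @ H.T @ np.linalg.inv(H @ Cx @ H.T + Cw)

x = rng.standard_normal(2)
y = H @ x + 0.3 * rng.standard_normal(3)
print(W @ y)  # linear estimate of x from the noisy observation y
```

When Cx or H is uncertain, this gain is no longer optimal, which motivates minimizing the worst-case error over the uncertainty set instead.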

We investigate the problem of guessing a random vector X within distortion level D. Our aim is to characterize the best attainable performance in the sense of minimizing, in some probabilistic sense, the number of required guesses G(X) until the error falls below D. The underlying motivation is that G(X) is the number of candidate codewords to…
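The quantity G(X) has a direct operational reading: try candidate reproductions in a fixed order and count the guesses until one lands within distortion D. The scalar codebook, ordering, and squared-error distortion below are illustrative assumptions, not the paper's setup:

```python
def guesses_until_D(x, codewords, D):
    """Number of sequential guesses G(x): test candidate reproductions
    in a fixed order and stop once squared error falls within D.
    Codebook and guessing order here are illustrative only."""
    for k, c in enumerate(codewords, start=1):
        if (x - c) ** 2 <= D:
            return k
    return None  # no codeword achieves distortion D

print(guesses_until_D(0.7, [0.0, 0.5, 1.0], D=0.1))  # second guess succeeds
```

A good guessing order makes G(X) small on average, which is exactly the codebook-search cost the abstract alludes to.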