Publications
Hidden Markov processes
TLDR
An overview of statistical and information-theoretic aspects of hidden Markov processes (HMPs) is presented, and consistency and asymptotic normality of the maximum-likelihood parameter estimator are proved under mild conditions.
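As a concrete anchor for the likelihood whose maximizer the paper studies, here is a minimal sketch of evaluating an HMP likelihood with the scaled forward recursion; the toy parameter values are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: the forward recursion for an HMP likelihood, the basic
# ingredient of maximum-likelihood parameter estimation. Toy parameters
# below are assumed for illustration only.
import numpy as np

def hmp_log_likelihood(obs, pi, A, B):
    """log p(obs) for an HMP with initial distribution pi, state
    transition matrix A and emission matrix B."""
    alpha = pi * B[:, obs[0]]            # forward variables at time 0
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()                 # rescale to avoid underflow
    for x in obs[1:]:
        alpha = (alpha @ A) * B[:, x]    # one forward step
        log_lik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_lik

# toy two-state, binary-output HMP
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.7, 0.3], [0.1, 0.9]])
print(hmp_log_likelihood([0, 0, 1, 1, 1, 0, 1], pi, A, B))
```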
Universal Prediction
TLDR
Both the probabilistic setting and the deterministic setting of the universal prediction problem are described with emphasis on the analogy and the differences between results in the two settings.
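Schematically (notation mine, not necessarily the survey's): in the probabilistic setting one measures redundancy against the best predictor tuned to the true source, while in the deterministic setting one measures regret against a comparison class of predictors, e.g. finite-state machines:

$$ R_n = \min_{\hat p}\,\max_{\theta}\,\frac{1}{n}\,\mathbb{E}_\theta\!\left[\log\frac{p_\theta(X^n)}{\hat p(X^n)}\right], \qquad r_n = \max_{x^n}\,\frac{1}{n}\Big[L_{\hat p}(x^n) - \min_{F\in\mathcal{F}} L_F(x^n)\Big], $$

where $L$ denotes cumulative prediction loss and $\mathcal{F}$ is the comparison class.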
On information rates for mismatched decoders
TLDR
In contrast to the classical matched-decoding case, under the mismatched decoding regime the highest achievable rate depends on whether the performance criterion is the bit error rate or the message error probability, and on whether the coding strategy is deterministic or randomized.
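For orientation (a schematic statement, not the paper's exact characterization): with i.i.d. random codes of input distribution $P$ and decoding metric $q$, one well-known achievable rate under the message-error criterion is the generalized mutual information

$$ I_{\mathrm{GMI}}(P) = \sup_{s>0}\;\mathbb{E}\!\left[\log\frac{q(X,Y)^s}{\sum_{x'} P(x')\,q(x',Y)^s}\right]. $$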
Guessing Subject to Distortion
TLDR
This work investigates the problem of guessing a random vector X within distortion level D and proposes an asymptotically optimal guessing scheme that is universal both with respect to the information source and the value of ρ.
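To make the setting concrete, here is a minimal sketch of guessing a binary vector within Hamming distortion; the candidate ordering (by decreasing probability under an assumed Bernoulli prior) is illustrative and is not the paper's universal scheme. The figure of merit in this line of work is the ρ-th moment of the number of guesses.

```python
# Hedged sketch: submit candidate reconstructions in a fixed order until
# one is within Hamming distortion D of the realized vector X.
import itertools, random

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def guesses_needed(x, candidates, D):
    """Number of guesses until some candidate is within distortion D of x."""
    for g, c in enumerate(candidates, start=1):
        if hamming(x, c) <= D:
            return g

n, D, p = 8, 1, 0.3
# order candidates by decreasing Bernoulli(p) probability (p < 0.5:
# lower-weight vectors are more likely)
candidates = sorted(itertools.product([0, 1], repeat=n),
                    key=lambda c: sum(c))
x = tuple(int(random.random() < p) for _ in range(n))
print(guesses_needed(x, candidates, D))
```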
On successive refinement for the Wyner-Ziv problem
TLDR
It is demonstrated that a source that is not successively refinable in the ordinary sense may become successively refinable in the presence of side information (SI) at the decoders, in the Wyner-Ziv setting.
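For reference, the standard Wyner-Ziv rate-distortion function with side information Y at the decoder, around which successive refinability is defined here, is

$$ R_{\mathrm{WZ}}(D) = \min\big[I(X;U) - I(U;Y)\big], $$

where the minimum is over auxiliary variables U with U - X - Y a Markov chain and decoders f satisfying $\mathbb{E}\,d(X, f(U,Y)) \le D$.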
Universal prediction of individual sequences
TLDR
An efficient prediction procedure based on the incremental parsing procedure of the Lempel-Ziv data compression algorithm is shown to asymptotically achieve the finite-state predictability of any individual infinite sequence.
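A minimal sketch in the spirit of that predictor, with details such as tie-breaking and count initialization simplified relative to the paper: walk a tree of parsed phrases, keep symbol counts at each node, predict the majority continuation, and grow the tree when a phrase ends.

```python
# Hedged sketch: sequential binary prediction via an incremental-parsing
# phrase tree (simplified relative to the paper's procedure).
class Node:
    def __init__(self):
        self.child = {}            # children indexed by bit
        self.count = {0: 0, 1: 0}  # how often each bit followed this context

def lz_predict(bits):
    root = Node()
    node, hits = root, 0
    for b in bits:
        pred = max(node.count, key=node.count.get)  # majority vote
        hits += (pred == b)
        node.count[b] += 1
        if b in node.child:
            node = node.child[b]     # continue the current phrase
        else:
            node.child[b] = Node()   # phrase ends: grow the tree,
            node = root              # restart at the root
    return hits / len(bits)         # fraction of correct predictions

print(lz_predict([0, 1, 0, 1, 0, 1, 0, 1, 0, 1] * 10))
```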
A competitive minimax approach to robust estimation of random parameters
TLDR
The minimax regret approach can improve performance over both the minimax MSE approach and a "plug-in" approach, in which the estimator is taken to be the MMSE estimator with an estimated covariance matrix in place of the true, unknown covariance.
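A minimal sketch of the "plug-in" baseline mentioned above, for a simple model y = x + noise; the model and sample sizes are illustrative assumptions, and the paper's competitive minimax-regret estimator itself is not reproduced here.

```python
# Hedged sketch: linear MMSE estimation of x from y = x + n, comparing the
# true covariance of x with a sample ("plug-in") covariance estimate.
import numpy as np

rng = np.random.default_rng(0)
d, sigma2 = 5, 0.5
L = np.tril(rng.normal(size=(d, d)))
C = L @ L.T + np.eye(d)              # true covariance of x

def lmmse(y, Cx):
    """Linear MMSE estimate of x from y = x + n, n ~ N(0, sigma2*I)."""
    return Cx @ np.linalg.solve(Cx + sigma2 * np.eye(d), y)

# covariance estimated from a handful of training samples of x
train = rng.multivariate_normal(np.zeros(d), C, size=20)
C_hat = np.cov(train, rowvar=False)

x = rng.multivariate_normal(np.zeros(d), C)
y = x + rng.normal(scale=np.sqrt(sigma2), size=d)
for name, Cx in [("true C", C), ("plug-in C_hat", C_hat)]:
    print(name, np.linalg.norm(x - lmmse(y, Cx)))
```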
A strong version of the redundancy-capacity theorem of universal coding
TLDR
This work shows that the capacity of the channel induced by a given class of sources is essentially a lower bound in a stronger sense as well, namely for "most" sources in the class; this extends Rissanen's (1984, 1986) lower bound for parametric families.
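For orientation, the classical form of the redundancy-capacity theorem (the strengthening described above concerns the maximum over sources):

$$ \min_{q}\,\max_{\theta\in\Lambda} D\big(p_\theta \,\|\, q\big) = C_n, $$

where $C_n$ is the capacity of the "channel" from the source index θ to the data $X^n$, i.e. the maximal mutual information $I(\Theta; X^n)$ over priors on Λ.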
Relations between entropy and error probability
TLDR
The authors present a channel coding theorem for the equivocation which, unlike the channel coding theorem for error probability, is meaningful at all rates.
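The classical single-shot link between equivocation and error probability, given here only for orientation, is Fano's inequality:

$$ H(X \mid Y) \le h_b(P_e) + P_e \log\big(|\mathcal{X}| - 1\big), \qquad P_e = \Pr\{\hat X(Y) \neq X\}, $$

where $h_b$ is the binary entropy function.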
The Shannon cipher system with a guessing wiretapper
TLDR
Asymptotically optimal strategies for both encryption and guessing are demonstrated, which are universal in the sense of being independent of the statistics of the source.
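A minimal sketch of the setting, not of the paper's optimal strategies: under perfect secrecy (key rate at least the message entropy, as with a one-time pad), the ciphertext is independent of the plaintext, so the wiretapper can do no better than guessing messages in decreasing order of prior probability. The alphabet and prior below are illustrative assumptions.

```python
# Hedged sketch: guessing under perfect secrecy reduces to guessing from
# the prior; the figure of merit is the rho-th moment E[G^rho] of the
# number of guesses G.
msgs = ["aa", "ab", "ba", "bb"]
prior = {"aa": 0.5, "ab": 0.25, "ba": 0.15, "bb": 0.10}

order = sorted(msgs, key=prior.get, reverse=True)  # best guessing order

def guesses(x):
    """Number of guesses until x is hit, querying in decreasing prior."""
    return order.index(x) + 1

rho = 1.0
print(sum(prior[m] * guesses(m) ** rho for m in msgs))
```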