Asymptotically minimax regret by Bayes mixtures (Proceedings of the 1998 IEEE International Symposium on Information Theory)

Abstract

We study the problems of data compression, gambling, and prediction of a sequence x^n = x_1 x_2 ... x_n from a certain alphabet X, in terms of regret [4] and redundancy with respect to a general exponential family, a general smooth family, and also Markov sources. In particular, we show that variants of the Jeffreys mixture asymptotically achieve their minimax values. These results generalize the work of Xie and Barron [5, 6] to general smooth families. In particular, for one-dimensional exponential families, they also extend the work of Clarke and Barron [1] to deal with the full natural parameter space rather than compact sets interior to it. The worst-case regret of a probability density q with respect to a d-dimensional family of probability densities S = {p(·|θ) : θ ∈ Θ} and a set of sequences W_n ⊆ X^n is defined as
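The formula following "is defined as" appears to have been lost in extraction (it was likely rendered as an image). A standard form of the worst-case (Shtarkov) regret, written here as a reconstruction consistent with the abstract's notation rather than a verbatim quote of the paper, is:

```latex
r(q; W_n) \;=\; \max_{x^n \in W_n}
\left[ \log \frac{1}{q(x^n)} \;-\; \min_{\theta \in \Theta} \log \frac{1}{p(x^n \mid \theta)} \right]
\;=\; \max_{x^n \in W_n} \log \frac{\sup_{\theta \in \Theta} p(x^n \mid \theta)}{q(x^n)} .
```

That is, the regret of q on a sequence is its code-length excess over the best distribution in S chosen with hindsight, and the worst case is taken over all sequences in W_n.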

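To make the abstract's central object concrete, the sketch below (my own illustration, not code from the paper) computes the worst-case regret of the Jeffreys mixture for the Bernoulli family, where the Beta(1/2, 1/2) prior gives the Krichevsky-Trofimov sequential probabilities in closed form. The worst case occurs at the boundary sequences (all zeros or all ones) and grows like (1/2) log n, which illustrates why boundary behavior, and hence a modified Jeffreys mixture, matters in the paper's setting.

```python
import math

def kt_mixture_logprob(k, n):
    """Log-probability of a binary sequence of length n containing k ones
    under the Jeffreys (Krichevsky-Trofimov, Beta(1/2,1/2)) mixture.
    The mixture is exchangeable, so only the count k matters."""
    lp = 0.0
    ones = zeros = 0
    for t in range(n):
        if ones < k:
            lp += math.log((ones + 0.5) / (t + 1))
            ones += 1
        else:
            lp += math.log((zeros + 0.5) / (t + 1))
            zeros += 1
    return lp

def ml_logprob(k, n):
    """Maximized log-likelihood sup_theta p(x^n | theta) for the Bernoulli
    family: theta-hat = k/n, giving (k/n)^k ((n-k)/n)^(n-k)."""
    lp = 0.0
    if k > 0:
        lp += k * math.log(k / n)
    if k < n:
        lp += (n - k) * math.log((n - k) / n)
    return lp

def worst_case_regret(n):
    """Worst-case regret (in nats) of the KT mixture over all binary
    sequences of length n: max_k [ log sup_theta p - log q ]."""
    return max(ml_logprob(k, n) - kt_mixture_logprob(k, n)
               for k in range(n + 1))

if __name__ == "__main__":
    for n in (10, 100, 1000):
        print(n, worst_case_regret(n))
```

Numerically, the worst-case regret tracks (1/2) log(pi n) nats, attained at k = 0 or k = n; a step from n to 10n raises it by about (1/2) log 10.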
Cite this paper

@inproceedings{Takeuchi2004AsymptoticallyMR, title={Asymptotically minimax regret by Bayes mixtures}, booktitle={Proceedings of the 1998 IEEE International Symposium on Information Theory}, author={J. Takeuchi}, year={2004} }