
Edited by András Antos. [We handle only discrete random variables and joint distributions of finitely many of them.] Appendix: In this appendix we summarize some basic definitions and results from the theory of probability. Most proofs are omitted, as they may be found in standard textbooks on probability, such as Feller [1], Ash [2], Shiryayev [3], Chow and…

- László Györfi, Michael Kohler, Adam Krzyzak, Harro Walk
- Springer series in statistics
- 2002

- András Antos, László Györfi, András György
- IEEE Transactions on Information Theory
- 2005

We consider the rate of convergence of the expected distortion redundancy of empirically optimal vector quantizers. Earlier results show that the mean-squared distortion of an empirically optimal quantizer designed from n independent and identically distributed (i.i.d.) source samples converges uniformly to the optimum at a rate of O(1/√n), and…
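The Lloyd iteration below is a minimal sketch of designing an empirically optimal quantizer from i.i.d. samples and measuring its distortion on fresh data. The 1-D standard normal source and k = 2 codepoints are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def lloyd(samples, k, iters=50, seed=0):
    """Design an empirically optimal k-point quantizer by Lloyd iterations."""
    rng = np.random.default_rng(seed)
    codebook = rng.choice(samples, size=k, replace=False)
    for _ in range(iters):
        # assign each sample to its nearest codepoint
        idx = np.argmin(np.abs(samples[:, None] - codebook[None, :]), axis=1)
        # move each codepoint to the centroid of its cell (skip empty cells)
        for j in range(k):
            if np.any(idx == j):
                codebook[j] = samples[idx == j].mean()
    return np.sort(codebook)

def distortion(samples, codebook):
    """Mean-squared distortion of nearest-codepoint quantization."""
    return np.min((samples[:, None] - codebook[None, :]) ** 2, axis=1).mean()

rng = np.random.default_rng(1)
train = rng.normal(size=1000)          # n i.i.d. training samples
codebook = lloyd(train, k=2)
test = rng.normal(size=100_000)        # fresh samples from the same source
print(distortion(test, codebook))      # close to the optimal 2-point distortion
```

For this source the optimal 2-point codebook is near ±0.80 with distortion about 0.36; the empirically designed quantizer lands close to it, and the gap (the distortion redundancy) is what shrinks with n.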

- Andrew R. Barron, László Györfi, Edward C. van der Meulen
- IEEE Trans. Information Theory
- 1992

The problem of the nonparametric estimation of a probability distribution is considered from three viewpoints: consistency in total variation, consistency in information divergence, and consistency in reversed-order information divergence. These types of consistency are relatively strong criteria of convergence, and a probability distribution…

- Luc Devroye, László Györfi
- 2007

The classical nonparametric example is the problem of estimating a distribution function F(x) from i.i.d. samples X_1, X_2, …, X_n taking values in R^d (d ≥ 1). Here, on the one hand, the construction of the empirical distribution function F_n(x) is distribution-free; on the other hand, its uniform convergence, the Glivenko–Cantelli theorem, holds for all F(x): lim…
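The empirical distribution function and its uniform convergence can be sketched in a few lines; the standard normal source and the evaluation grid are illustrative assumptions.

```python
import numpy as np
from math import erf, sqrt

def ecdf(sample, x):
    """Empirical distribution function F_n evaluated at the points x."""
    return np.searchsorted(np.sort(sample), x, side="right") / len(sample)

# Sup-norm distance sup_x |F_n(x) - F(x)| shrinks as n grows (Glivenko-Cantelli).
grid = np.linspace(-4.0, 4.0, 801)
true_F = np.array([0.5 * (1 + erf(t / sqrt(2))) for t in grid])  # standard normal F

rng = np.random.default_rng(0)
d = {}
for n in (100, 10_000):
    sample = rng.normal(size=n)
    d[n] = np.max(np.abs(ecdf(sample, grid) - true_F))
    print(n, d[n])
```

Note that the construction of F_n itself never uses the form of F: it is distribution-free, exactly as the passage states.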

- Nguyen Q. A., László Györfi, James L. Massey
- IEEE Trans. Information Theory
- 1992

A general theorem is proved showing how to obtain a constant-weight binary cyclic code from a p-ary linear cyclic code, where p is a prime, by using a representation of GF(p) as cyclic shifts of a binary p-tuple. Based on this theorem, constructions are given for four classes of binary constant-weight codes. The first two classes are shown to achieve the…
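The key device — representing the p symbols by cyclic shifts of one fixed binary p-tuple — can be sketched as follows. The base tuple and the two length-4 words over GF(5) are illustrative stand-ins, not codewords from the paper; the point is only that mapping every symbol to a shift of the same tuple makes the binary image's Hamming weight constant across words.

```python
import numpy as np

def shift_map(symbol, base):
    """Map a GF(p) symbol in 0..p-1 to that cyclic shift of the base p-tuple."""
    return np.roll(base, symbol)

def to_constant_weight(word, base):
    """Concatenate the shifts: a length-n p-ary word -> a binary np-tuple."""
    return np.concatenate([shift_map(s, base) for s in word])

p = 5
base = np.array([1, 1, 0, 0, 0])   # illustrative base tuple of weight 2
weights = []
for word in ([0, 1, 2, 3], [4, 4, 0, 1]):   # illustrative words over GF(5)
    b = to_constant_weight(np.array(word), base)
    weights.append(int(b.sum()))
    print(b.sum())  # -> 8 for both: 4 symbols x weight-2 tuple, independent of the word
```

Cyclic shifting never changes the weight of the base tuple, so every length-n word maps to weight n times the base weight — the constant-weight property the theorem exploits.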

- László Györfi, Gábor Lugosi, Gusztáv Morvai
- IEEE Trans. Information Theory
- 1999

We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments of the theory of the prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process then the average number of mistakes converges, almost surely, to that of the…
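The paper's forecaster aggregates a family of Markov experts; the sketch below shows only the underlying randomized exponential-weighting idea, over two toy constant experts (always-0 and always-1). The expert class, the learning rate, and the biased i.i.d. test sequence are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def predict_sequence(bits, eta=0.3, seed=0):
    """Randomized exponentially weighted forecaster over two constant experts;
    each bit is predicted before it is revealed."""
    rng = np.random.default_rng(seed)
    losses = np.zeros(2)                 # cumulative mistakes of experts 0 and 1
    mistakes = 0
    for b in bits:
        w = np.exp(-eta * losses)
        p1 = w[1] / w.sum()              # probability of predicting 1
        guess = int(rng.random() < p1)   # randomized prediction
        mistakes += int(guess != b)
        losses += np.array([b, 1 - b])   # expert 0 errs when b=1, expert 1 when b=0
    return mistakes / len(bits)

# On a biased i.i.d. sequence, the average mistake rate approaches that of the
# better expert (here always-0, with mistake rate 0.3).
rng = np.random.default_rng(1)
bits = (rng.random(20_000) < 0.3).astype(int)   # P(bit = 1) = 0.3
rate = predict_sequence(bits)
print(rate)
```

With richer expert classes (e.g. Markov experts of all orders, as in the paper), the same aggregation drives the mistake rate to the best achievable one for stationary ergodic sources.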

In this paper the sequential prediction problem with expert advice is considered when the loss is unbounded, under partial monitoring scenarios. We deal with a wide class of partial monitoring problems: the combination of the label-efficient and multi-armed bandit problems, that is, where the algorithm is only informed about the performance of the chosen…

- László Györfi, Frederic Udina, Harro Walk, Ramon Trias Fargas, Michael Greenacre
- 2003

In recent years, optimal portfolio selection strategies for sequential investment have been shown to exist. Although their asymptotic optimality is well established, their finite-sample properties require the adjustment of parameters that depend on dimensionality and scale. In this paper we introduce some nearest-neighbor-based portfolio selectors that solve…
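A minimal sketch of the nearest-neighbor idea for two assets: match the most recent return vector against its k nearest historical neighbors, then pick the fixed-mix weight maximizing empirical log-wealth over the returns that followed those neighbors. The one-day context, k, the weight grid, and the synthetic gross returns are all illustrative assumptions; the paper's selectors are more general.

```python
import numpy as np

def nn_weight(gross, k=10, grid=np.linspace(0.0, 1.0, 21)):
    """Weight for asset 0 (remainder in asset 1), chosen by nearest-neighbor
    matching on yesterday's gross-return vector."""
    t = len(gross)
    query = gross[t - 1]          # most recent one-day context
    contexts = gross[: t - 1]     # past one-day contexts
    nexts = gross[1:t]            # the return that followed each context
    nearest = np.argsort(np.linalg.norm(contexts - query, axis=1))[:k]
    matched = nexts[nearest]
    # maximize empirical log-wealth of the fixed mix over the matched days
    return max(grid,
               key=lambda w: np.log(w * matched[:, 0]
                                    + (1 - w) * matched[:, 1]).sum())

# Synthetic gross returns (price ratios) for two assets over 500 days.
rng = np.random.default_rng(2)
gross = 1.0 + 0.01 * rng.standard_normal((500, 2))
w = nn_weight(gross)
print(w)   # a weight in [0, 1] for asset 0
```

The grid search stands in for the paper's log-optimal portfolio optimization; with more assets one would solve the concave maximization directly instead of scanning a grid.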

- András Antos, Luc Devroye, László Györfi
- IEEE Trans. Pattern Anal. Mach. Intell.
- 1999