
- László Györfi, Michael Kohler, Adam Krzyzak, Harro Walk
- Springer series in statistics
- 2002

A Distribution-Free Theory of Nonparametric Regression.

1 Basics of Measure Theory. Definition 1. Let S be a set, and let F be the family of all subsets of S. Then (S, F) is called a measurable space. The subsets of S are called measurable sets. Definition 2…

We assume that H(f) is well-defined and finite. The concept of differential entropy was introduced in Shannon’s original paper ([55]). Since then, entropy has been of great theoretical and applied…
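For a density f on R^d, Shannon's differential entropy referred to above is

```latex
H(f) = -\int_{\mathbb{R}^d} f(x) \log f(x)\, dx .
```

Unlike discrete entropy, this quantity can be negative or infinite, which is why the abstract assumes H(f) is well-defined and finite.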

- Q. A. Nguyen, László Györfi, James L. Massey
- IEEE Trans. Information Theory
- 1992

A general theorem is proved showing how to obtain a constant-weight binary cyclic code from a p-ary linear cyclic code, where p is a prime, by using a representation of GF(p) as cyclic shifts of a…

- Andrew R. Barron, László Györfi, Edward C. van der Meulen
- IEEE Trans. Information Theory
- 1992

The problem of the nonparametric estimation of a probability distribution is considered from three viewpoints: the consistency in total variation, the consistency in information divergence, and…

m_n(x) = Σ_{i=1}^n W_{ni}(x; X_1, …, X_n) Y_i, where W_{ni}(x; X_1, …, X_n) is 1/k if X_i is one of the k nearest neighbors of x among X_1, …, X_n, and W_{ni} is zero otherwise. Note in particular that…
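With the weights above, the estimate is simply the average of the Y-values of the k nearest sample points. A minimal one-dimensional sketch (the function name and toy data are our own, for illustration only):

```python
def knn_regress(x, X, Y, k):
    """k-nearest-neighbor regression estimate m_n(x):
    weights W_ni are 1/k on the k sample points X_i closest to x
    and 0 elsewhere, so m_n(x) is the neighbor average of the Y_i."""
    # sort sample indices by distance to the query point x
    order = sorted(range(len(X)), key=lambda i: abs(X[i] - x))
    return sum(Y[i] for i in order[:k]) / k

# toy sample from the noiseless curve y = x^2
X = [0.0, 1.0, 2.0, 3.0, 4.0]
Y = [0.0, 1.0, 4.0, 9.0, 16.0]
estimate = knn_regress(2.1, X, Y, k=3)  # neighbors are X = 2, 3, 1
```

Here the three nearest neighbors of x = 2.1 are the points at 2, 3, and 1, so the estimate is (4 + 9 + 1)/3.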

In this chapter we consider the prediction of stationary time series for various loss functions: squared loss (as it arises in the regression problem), 0–1 loss (pattern recognition), and log utility…

- András Antos, László Györfi, András György
- IEEE Transactions on Information Theory
- 2005

We consider the rate of convergence of the expected distortion redundancy of empirically optimal vector quantizers. Earlier results show that the mean-squared distortion of an empirically optimal…
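The empirical distortion being minimized can be sketched concretely. The one-dimensional toy code below computes the mean-squared distortion of a codebook on a sample and performs Lloyd-style improvement steps; note that Lloyd iterations only find a locally optimal codebook, whereas the paper analyzes the empirically (globally) optimal quantizer, so this is an illustrative sketch only:

```python
def empirical_distortion(codebook, sample):
    """Mean-squared distortion of a quantizer on a sample:
    each point is mapped to its nearest codepoint."""
    return sum(min((x - c) ** 2 for c in codebook) for x in sample) / len(sample)

def lloyd_step(codebook, sample):
    """One Lloyd iteration: assign each point to its nearest codepoint,
    then move each codepoint to the mean of its cell."""
    cells = {i: [] for i in range(len(codebook))}
    for x in sample:
        i = min(range(len(codebook)), key=lambda j: (x - codebook[j]) ** 2)
        cells[i].append(x)
    # keep a codepoint unchanged if its cell is empty
    return [sum(c) / len(c) if c else codebook[i] for i, c in cells.items()]

sample = [0.1, 0.2, 0.9, 1.0, 1.1, 2.0]
codebook = [0.0, 1.0]
for _ in range(10):
    codebook = lloyd_step(codebook, sample)  # converges to [0.15, 1.25]
```

On this sample the iteration converges after one step, and the resulting empirical distortion is 0.775/6.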

- Gérard Biau, László Györfi
- IEEE Transactions on Information Theory
- 2005

We present two simple and explicit procedures for testing homogeneity of two independent multivariate samples of size n. The nonparametric tests are based on the statistic T_n, which is the…
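A partition-based L1 statistic in this spirit compares the empirical measures of the two samples cell by cell. The sketch below is an illustrative one-dimensional toy version (the exact form of T_n is given in the paper):

```python
def l1_homogeneity_stat(sample1, sample2, edges):
    """L1-type homogeneity statistic over a fixed partition:
    sum over cells A of |mu_n(A) - nu_n(A)|, where mu_n and nu_n
    are the empirical measures of the two samples."""
    def empirical(sample):
        counts = [0] * (len(edges) - 1)
        for x in sample:
            for j in range(len(edges) - 1):
                if edges[j] <= x < edges[j + 1]:
                    counts[j] += 1
                    break
        return [c / len(sample) for c in counts]

    mu, nu = empirical(sample1), empirical(sample2)
    return sum(abs(a - b) for a, b in zip(mu, nu))

# two small samples over the partition [0, 0.5) and [0.5, 1)
stat = l1_homogeneity_stat([0.1, 0.4, 0.6], [0.2, 0.7, 0.8], edges=[0.0, 0.5, 1.0])
```

A large value of the statistic indicates that the two samples place mass on the partition cells differently, i.e. evidence against homogeneity.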

- László Györfi, Gábor Lugosi, Gusztáv Morvai
- IEEE Trans. Information Theory
- 1999

We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments of the theory of the prediction of individual sequences. We show…
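One classical ingredient of individual-sequence prediction is randomized prediction with exponentially weighted experts. The sketch below uses only the two constant experts ("always 0" and "always 1") and is not the specific procedure of the paper:

```python
import math
import random

def exp_weights_predict(bits, eta=0.5, seed=0):
    """Randomized binary prediction with exponentially weighted experts.
    Experts: the constant predictors 0 and 1. At each step we predict 1
    with probability proportional to the weight exp(-eta * loss) of the
    'always 1' expert, then update the experts' cumulative 0-1 losses.
    Returns the number of prediction mistakes."""
    rng = random.Random(seed)            # fixed seed: deterministic run
    losses = [0.0, 0.0]                  # cumulative 0-1 loss of experts 0 and 1
    mistakes = 0
    for b in bits:
        w = [math.exp(-eta * l) for l in losses]
        p1 = w[1] / (w[0] + w[1])        # probability of predicting 1
        guess = 1 if rng.random() < p1 else 0
        mistakes += (guess != b)
        losses[0] += (b != 0)
        losses[1] += (b != 1)
    return mistakes

mistakes = exp_weights_predict([1] * 50)
```

On a constant sequence the weight of the correct expert quickly dominates, so the randomized predictor makes only a few mistakes near the start.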