
- Peter Hall
- 1987

AMS 1980 subject classification: primary 62G05, secondary 62G20. Abstract: Kernel density estimators are used for the estimation of integrals of various squared derivatives of a probability density. Rates of convergence in mean squared error are calculated, which show that appropriate values of the smoothing parameter are much smaller than those for…
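The simplest functional in this family is θ = ∫ f², the zero-derivative case, which equals E f(X) and can therefore be estimated by averaging a leave-one-out kernel density estimate over the sample. A minimal numpy sketch under that reading (the function name, bandwidth, and simulated sample are illustrative assumptions, not from the paper):

```python
import numpy as np

def squared_density_integral(x, h):
    """Leave-one-out kernel estimate of theta = int f(x)^2 dx = E[f(X)]:
    theta_hat = (1/n) * sum_i fhat_{-i}(X_i), Gaussian kernel, bandwidth h."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = (x[:, None] - x[None, :]) / h               # pairwise scaled differences
    k = np.exp(-0.5 * d**2) / np.sqrt(2 * np.pi)    # Gaussian kernel matrix
    np.fill_diagonal(k, 0.0)                        # leave-one-out: drop the i = j terms
    return k.sum() / (n * (n - 1) * h)

rng = np.random.default_rng(0)
sample = rng.standard_normal(1000)
theta_hat = squared_density_integral(sample, h=0.3)
# For the N(0,1) density, int f^2 = 1 / (2 * sqrt(pi)) ~ 0.2821.
```

The abstract's point about smoothing parameters shows up here: the bandwidth minimizing the error of `theta_hat` is smaller than the one you would pick for estimating f itself.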

High dimension, low sample size data are emerging in various areas of science. We find a common structure underlying many such data sets by using a non-standard type of asymptotics: the dimension tends to infinity while the sample size is fixed. Our analysis shows a tendency for the data to lie deterministically at the vertices of a regular simplex. Essentially…
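The geometric picture described here is easy to check numerically: for i.i.d. N(0, I_d) data with d large and n fixed, the pairwise distances scaled by √d all concentrate at √2, so the n points approximate the vertices of a regular simplex. A small sketch (the Gaussian model and the particular n and d are illustrative assumptions):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n, d = 5, 200_000                     # few samples, very high dimension
X = rng.standard_normal((n, d))

# Scaled pairwise distances: each should be close to sqrt(2) ~ 1.414,
# i.e. the points sit near the vertices of a regular simplex.
dists = [np.linalg.norm(X[i] - X[j]) / np.sqrt(d)
         for i, j in combinations(range(n), 2)]
```

With d this large the fluctuations around √2 are of order d^(-1/2), which is what the "deterministic" limit in the abstract refers to.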

- Peter Hall, Hans-Georg Müller, Jane-Ling Wang
- 2006

The use of principal component methods to analyze functional data is appropriate in a wide range of different settings. In studies of "functional data analysis," it has often been assumed that a sample of random functions is observed precisely, in the continuum and without noise. While this has been the traditional setting for functional data analysis, in…
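In the fully observed, noise-free setting the abstract contrasts with, functional PCA reduces (after discretization) to an eigendecomposition of the sample covariance matrix of the curves. A sketch on a simulated two-component model (the model, grid, and all names below are my own illustration, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(6)
m, n = 100, 300                           # grid size, number of curves
t = np.linspace(0.0, 1.0, m)

# Curves X_i(t) = xi_1 * phi1(t) + xi_2 * phi2(t), observed on a grid
# with a small amount of additive measurement noise.
phi1 = np.sqrt(2.0) * np.sin(2.0 * np.pi * t)
phi2 = np.sqrt(2.0) * np.cos(2.0 * np.pi * t)
scores = rng.standard_normal((n, 2)) * np.array([2.0, 1.0])
X = scores[:, :1] * phi1 + scores[:, 1:] * phi2
X += 0.2 * rng.standard_normal((n, m))

# Empirical covariance and its leading eigenfunction (discretized).
Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / n
vals, vecs = np.linalg.eigh(C)            # eigenvalues in ascending order
psi1 = vecs[:, -1] * np.sqrt(m)           # normalise so mean(psi1**2) = 1
```

Up to sign, `psi1` recovers the first true component `phi1`; the paper's concern is what replaces this simple recipe when the curves are observed sparsely and with noise.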

- Peter Hall
- 2003

Martingale theory is used to obtain a central limit theorem for degenerate U-statistics with variable kernels, which is applied to derive central limit theorems for the integrated square error of multivariate nonparametric density estimators. Previous approaches to this problem have employed Komlos-Major-Tusnady type approximations to the empiric…

- Peter Hall
- 2006

There has been substantial recent work on methods for estimating the slope function in linear regression for functional data analysis. However, as in the case of more conventional finite-dimensional regression, much of the practical interest in the slope centers on its application for the purpose of prediction, rather than on its significance in its own…

- Peter Hall, Li-Shan Huang
- 1999

We suggest a biased-bootstrap method for monotonising general linear, kernel-type estimators, for example local linear estimators and Nadaraya-Watson estimators. Attributes of our approach include the fact that it produces smooth estimates, that it is applicable to a particularly wide range of estimator types, and that it can be employed after the smoothing…
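For reference, the Nadaraya-Watson estimator named in the abstract is the kernel-weighted local average m̂(x) = Σᵢ K_h(x − Xᵢ)Yᵢ / Σᵢ K_h(x − Xᵢ); it is one of the "general linear, kernel-type" estimators the monotonising method applies to. A minimal sketch with a Gaussian kernel (the data and bandwidth are illustrative):

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Nadaraya-Watson kernel regression estimate at the points x0."""
    w = np.exp(-0.5 * ((np.asarray(x0)[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)   # kernel-weighted local average

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0.0, 1.0, 400))
y = x**2 + 0.05 * rng.standard_normal(400)       # noisy monotone regression function
grid = np.linspace(0.2, 0.8, 7)
fit = nadaraya_watson(grid, x, y, h=0.05)
```

Nothing in this raw estimator forces monotonicity; with noisier data `fit` can dip even when the true function is increasing, which is the defect the biased-bootstrap method corrects.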

Kernel methods for deconvolution have attractive features, and prevail in the literature. However, they have disadvantages, which include the fact that they are usually suitable only for cases where the error distribution is infinitely supported and its characteristic function does not ever vanish. Even in these settings, optimal convergence rates are…

We develop bootstrap methods for constructing confidence regions, including intervals and simultaneous bands, in the context of estimating the intensity function of a non-stationary Poisson process. Several different resampling algorithms are suggested, ranging from resampling a Poisson process with intensity equal to that estimated nonparametrically from the…
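The simplest resampling algorithm of the kind described can be sketched as: kernel-smooth the observed points to estimate the intensity, draw bootstrap processes from a Poisson process with that estimated intensity, re-estimate, and take pointwise percentiles. All function names and tuning choices below are illustrative assumptions, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_poisson(intensity, t_max, lam_max, rng):
    """Simulate an inhomogeneous Poisson process on [0, t_max] by thinning."""
    n = rng.poisson(lam_max * t_max)
    t = rng.uniform(0.0, t_max, n)
    return np.sort(t[rng.uniform(0.0, lam_max, n) < intensity(t)])

def kernel_intensity(points, grid, h):
    """Gaussian-kernel estimate of the intensity function on a grid."""
    d = (grid[:, None] - np.asarray(points)[None, :]) / h
    return np.exp(-0.5 * d**2).sum(axis=1) / (h * np.sqrt(2.0 * np.pi))

t_max = 2.0 * np.pi
true_intensity = lambda t: 50.0 + 30.0 * np.sin(t)
events = simulate_poisson(true_intensity, t_max, 80.0, rng)

grid = np.linspace(0.0, t_max, 200)
lam_hat = kernel_intensity(events, grid, h=0.3)

# Bootstrap: resample Poisson processes whose intensity equals the estimate,
# then take pointwise percentiles of the re-estimated intensities.
lam_fn = lambda t: np.interp(t, grid, lam_hat)
boot = np.array([
    kernel_intensity(simulate_poisson(lam_fn, t_max, lam_hat.max() * 1.2, rng),
                     grid, 0.3)
    for _ in range(200)
])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
```

`lo` and `hi` give a pointwise 95% band; the simultaneous bands mentioned in the abstract require widening the band to control coverage over the whole interval at once.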

- Peter Hall, Jiashun Jin
- 2009

Higher Criticism is a method for detecting signals that are both sparse and weak. Although first proposed in cases where the noise variables are independent, Higher Criticism also has reasonable performance in settings where those variables are correlated. In this paper we show that, by exploiting the nature of the correlation, performance can be improved…
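The statistic itself, in the independent-noise form this line of work builds on, scans the normalized empirical process of the sorted p-values and takes the maximum. A sketch using the common "HC+" variant that discards p-values below 1/n to stabilise the maximum (the simulation settings are illustrative):

```python
import math

import numpy as np

def higher_criticism(pvalues, alpha0=0.5):
    """HC+ statistic: max over the smallest alpha0 fraction of sorted p-values
    of sqrt(n) * (i/n - p_(i)) / sqrt(p_(i) * (1 - p_(i))), with p_(i) >= 1/n."""
    p = np.sort(np.asarray(pvalues, dtype=float))
    n = len(p)
    i = np.arange(1, n + 1)
    z = np.sqrt(n) * (i / n - p) / np.sqrt(p * (1.0 - p))
    keep = (i <= alpha0 * n) & (p >= 1.0 / n)   # "plus" variant: drop tiny p-values
    return z[keep].max()

rng = np.random.default_rng(4)
n = 10_000

# Null: uniform p-values, HC stays moderate.
hc_null = higher_criticism(rng.uniform(size=n))

# Sparse, weak alternative: 1% of z-scores shifted by 3.
z_scores = rng.standard_normal(n)
z_scores[:100] += 3.0
p_alt = np.array([0.5 * math.erfc(v / math.sqrt(2.0)) for v in z_scores])
hc_alt = higher_criticism(p_alt)
```

The contrast between `hc_alt` and `hc_null` is the detection signal; the paper's contribution is modifying this scan when the noise variables are correlated rather than independent.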

Motivated by recent work of Joe (1989, Ann. Inst. Statist. Math., 41, 683-697), we introduce estimators of entropy and describe their properties. We study the effects of tail behaviour, distribution smoothness and dimensionality on convergence properties. In particular, we argue that root-n consistency of entropy estimation requires appropriate assumptions…
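A natural kernel-based estimator of the kind studied in this literature is Ĥ = −n⁻¹ Σᵢ log f̂₋ᵢ(Xᵢ), where f̂₋ᵢ is a leave-one-out kernel density estimate. A one-dimensional sketch (the bandwidth and simulated sample are illustrative assumptions):

```python
import numpy as np

def entropy_estimate(x, h):
    """Leave-one-out kernel estimate of H(f) = -E[log f(X)]."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = (x[:, None] - x[None, :]) / h
    k = np.exp(-0.5 * d**2) / np.sqrt(2 * np.pi)   # Gaussian kernel matrix
    np.fill_diagonal(k, 0.0)                       # leave-one-out densities
    f_loo = k.sum(axis=1) / ((n - 1) * h)
    return -np.mean(np.log(f_loo))

rng = np.random.default_rng(5)
h_hat = entropy_estimate(rng.standard_normal(1500), h=0.3)
# N(0,1) entropy = 0.5 * log(2 * pi * e) ~ 1.4189
```

The tail-behaviour issue the abstract raises is visible here: the log of a small leave-one-out density at extreme sample points is exactly where heavy tails degrade the estimator's rate.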