RESAMPLING FEWER THAN n OBSERVATIONS: GAINS, LOSSES, AND REMEDIES FOR LOSSES

@inproceedings{Bickel2012RESAMPLINGFT,
  title={RESAMPLING FEWER THAN n OBSERVATIONS: GAINS, LOSSES, AND REMEDIES FOR LOSSES},
  author={Peter J. Bickel and Friedrich G{\"o}tze and Willem R. van Zwet},
  year={2012}
}
We discuss a number of resampling schemes in which m = o(n) observations are resampled. We review nonparametric bootstrap failure and give results, old and new, on how the m out of n with-replacement and without-replacement bootstraps work. We extend work of Bickel and Yahav (1988) to show that m out of n bootstraps can be made second order correct if the usual nonparametric bootstrap is correct, and we study how these extrapolation techniques work when the nonparametric bootstrap does not.
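As a rough illustration of the scheme the abstract describes, the following Python sketch draws B resamples of size m = o(n), with or without replacement, and records a statistic on each. The function name, the choice of statistic, and the choice m = sqrt(n) are assumptions made for the example, not prescriptions from the paper.

import numpy as np

def m_out_of_n_bootstrap(x, statistic, m, B=1000, replace=True, seed=None):
    """Draw B resamples of size m from x (with or without replacement)
    and return the statistic computed on each resample."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    out = np.empty(B)
    for b in range(B):
        idx = rng.choice(len(x), size=m, replace=replace)
        out[b] = statistic(x[idx])
    return out

# Illustrative use: approximate the law of sqrt(n)*(median - theta) for a
# heavy-tailed sample by sqrt(m)*(median* - median), recentring at the
# full-sample estimate and rescaling by sqrt(m) rather than sqrt(n).
x = np.random.default_rng(0).standard_cauchy(1000)
m = int(len(x) ** 0.5)
draws = m_out_of_n_bootstrap(x, np.median, m=m, B=2000, replace=False)
approx = np.sqrt(m) * (draws - np.median(x))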
Sufficient m-out-of-n (m/n) bootstrap
Traditional resampling methods for estimating sampling distributions sometimes fail, and alternative approaches are then needed, for example when the classical central limit theorem does not apply.
The Numerical Bootstrap
This paper proposes a numerical bootstrap method that is consistent in many cases where the standard bootstrap is known to fail and where the m-out-of-n bootstrap and subsampling have been the most common remedies.
ON THE BOOTSTRAP ACCURACY FOR THE TRIMMED MEAN: A SURVEY OF SOME RECENT DEVELOPMENTS
A survey of some new results on empirical Edgeworth expansions and the m out of n bootstrap accuracy for trimmed means and studentized trimmed means will be presented.
A Cheap Bootstrap Method for Fast Inference
This work presents a bootstrap methodology that uses minimal computation, namely with a resample effort as low as one Monte Carlo replication, while maintaining desirable statistical guarantees.
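The construction behind this "cheap bootstrap" appears to be a t-interval built from a handful of resamples; the sketch below is an interpretation under that assumption, with illustrative names, and should not be read as the paper's exact procedure.

import numpy as np
from scipy import stats

def cheap_bootstrap_ci(x, statistic, B=1, alpha=0.05, seed=None):
    """Sketch of a cheap-bootstrap-style interval: with as few as B = 1
    resamples, form psi_hat +/- t_{B,1-alpha/2} * S, where S^2 is the mean
    squared deviation of the resampled statistics from psi_hat."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    psi_hat = statistic(x)
    devs = np.empty(B)
    for b in range(B):
        xb = x[rng.integers(0, len(x), size=len(x))]  # ordinary n-out-of-n resample
        devs[b] = statistic(xb) - psi_hat
    s = np.sqrt(np.mean(devs ** 2))
    q = stats.t.ppf(1 - alpha / 2, df=B)
    return psi_hat - q * s, psi_hat + q * s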
The Big Data Bootstrap
The Bag of Little Bootstraps (BLB), a new procedure which incorporates features of both the bootstrap and subsampling to obtain a robust, computationally efficient means of assessing estimator quality, is presented.
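A compact sketch of the Bag of Little Bootstraps idea in the sentence above: work on s small subsets of size b, and within each subset mimic size-n resamples through multinomial counts. The parameter names and the use of a standard error as the quality measure are assumptions made for the example.

import numpy as np

def blb_stderr(x, statistic, b, s=10, r=50, seed=None):
    """Bag-of-Little-Bootstraps-style sketch: for each of s subsets of size
    b (b << n), draw r multinomial resamples of notional size n, compute the
    statistic, estimate its standard error within the subset, and average
    the s per-subset estimates."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    n = len(x)
    per_subset = []
    for _ in range(s):
        subset = x[rng.choice(n, size=b, replace=False)]
        vals = []
        for _ in range(r):
            counts = rng.multinomial(n, np.full(b, 1.0 / b))  # resample weights summing to n
            vals.append(statistic(np.repeat(subset, counts)))
        per_subset.append(np.std(vals, ddof=1))
    # np.repeat is the simplest way to materialise the weighted sample; for
    # large n a statistic that accepts weights directly is preferable.
    return float(np.mean(per_subset))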
On the asymptotic theory of new bootstrap confidence bounds
We propose a new method, based on sample splitting, for constructing bootstrap confidence bounds for a parameter appearing in the regular smooth function model.
A PARAMETRIC BOOTSTRAP FOR HEAVY-TAILED DISTRIBUTIONS
It is known that Efron's bootstrap of the mean of a distribution in the domain of attraction of the stable laws with infinite variance is not consistent, in the sense that the limiting distribution of the bootstrap statistic is random rather than fixed.
Bootstrap—An exploration
Trusting the Black Box: Confidence with Bag of Little Bootstraps
The trade-off between interval accuracy and computational time is discussed, and it is shown that BLB with the proper hyperparameter values can return reliable intervals quickly.

References

Showing 1-10 of 76 references
The m out of n Bootstrap and Goodness of Fit Tests with Double Censored Data
This paper considers the use of the m out of n bootstrap (Bickel, Götze, and van Zwet, 1994) in setting critical values for Cramér-von Mises goodness of fit tests with doubly censored data.
Exchangeably Weighted Bootstraps of the General Empirical Process
We consider an exchangeably weighted bootstrap of the general function-indexed empirical process. We find sufficient conditions on the bootstrap weights for the central limit theorem to hold for the weighted bootstrap empirical process.
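One familiar member of the exchangeably weighted family is the Bayesian bootstrap, in which the weights are Dirichlet(1, ..., 1). The short sketch below uses it purely as an illustration and is not the paper's general framework.

import numpy as np

def bayesian_bootstrap_means(x, B=1000, seed=None):
    """Exchangeably weighted bootstrap with Dirichlet(1, ..., 1) weights
    (the Bayesian bootstrap): each replicate reweights the observations
    rather than resampling them."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    W = rng.dirichlet(np.ones(len(x)), size=B)  # each row: exchangeable weights summing to 1
    return W @ x                                # B reweighted sample means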
On the Asymptotic Properties of the Jackknife Histogram
We study the asymptotic normality of the jackknife histogram. For the one-sample mean, it holds if and only if r, the number of observations retained, and d (= n - r), the number of observations deleted, both tend to infinity.
When Does Bootstrap Work?: Asymptotic Results and Simulations
Bootstrap methods are procedures for estimating or approximating the distribution of a statistic based on ideas from resampling and simulation methods. This volume is concerned with the asymptotic theory of such methods.
Second-order properties of an extrapolated bootstrap without replacement under weak assumptions
This paper shows that a straightforward extrapolation of the bootstrap distribution obtained by resampling without replacement, as considered by Politis and Romano, leads to second-order correctness.
Richardson Extrapolation and the Bootstrap
Simulation methods [particularly Efron's (1979) bootstrap] are being applied more and more frequently in statistical inference. Given data (X_1, ..., X_n) distributed according to P, ...
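The extrapolation idea behind this reference and the Bickel-Yahav proposal mentioned in the abstract can be sketched as a Richardson-type step. The expansion below is illustrative, assuming a leading error term of order m^{-1/2}, and is not the papers' exact statement.

T(m) = T + c\,m^{-1/2} + o(m^{-1/2}), \qquad
\widehat{T}_{\mathrm{extr}} = \frac{\sqrt{m_2}\,T(m_2) - \sqrt{m_1}\,T(m_1)}{\sqrt{m_2} - \sqrt{m_1}},

so that the c\,m^{-1/2} terms cancel and \widehat{T}_{\mathrm{extr}} estimates T with an error of smaller order than either T(m_1) or T(m_2) alone.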
Some properties of incomplete U-statistics
Let g be a symmetric function with k arguments. A U-statistic is the arithmetic mean of g's based on the N = n!/(k!(n-k)!) subsamples of size k taken from a sample of size n. When N is large, an incomplete U-statistic, which averages g over only a subset of these subsamples, is computationally attractive.
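A small sketch of the contrast between a complete and an incomplete U-statistic, with the kernel, the random subsample design, and the function names chosen only for illustration.

import numpy as np
from itertools import combinations

def u_statistic(x, g, k):
    """Complete U-statistic: average the kernel g over all C(n, k) size-k subsamples."""
    return float(np.mean([g(*c) for c in combinations(x, k)]))

def incomplete_u_statistic(x, g, k, n_terms=1000, seed=None):
    """Incomplete U-statistic: average g over n_terms randomly chosen size-k
    subsamples instead of all C(n, k) of them."""
    rng = np.random.default_rng(seed)
    n = len(x)
    vals = [g(*(x[i] for i in rng.choice(n, size=k, replace=False)))
            for _ in range(n_terms)]
    return float(np.mean(vals))

# Example kernel: g(a, b) = (a - b)**2 / 2, whose complete U-statistic is the sample variance.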
Some results on the influence of extremes on the bootstrap
We study the influence of the extremes in the construction of consistent bootstraps in three illustrative situations. These are bootstrapping maxima, bootstrapping intermediate trimmed means, and ...
Bootstrapping Regression Models
Bootstrapping is a general approach to statistical inference based on building a sampling distribution for a statistic by resampling from the data at hand. The term 'bootstrapping' is due to Efron (1979).
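As a concrete instance of the approach described in this entry, the sketch below resamples cases (rows) of a regression data set and refits ordinary least squares each time. Case resampling is only one of the standard regression bootstrap schemes, and the names here are illustrative.

import numpy as np

def bootstrap_regression_cases(X, y, B=1000, seed=None):
    """Case-resampling (pairs) bootstrap for linear regression: resample
    (x_i, y_i) rows with replacement and refit OLS on each resample,
    building a sampling distribution for the coefficient vector."""
    rng = np.random.default_rng(seed)
    n = len(y)
    betas = np.empty((B, X.shape[1]))
    for b in range(B):
        idx = rng.integers(0, n, size=n)
        betas[b] = np.linalg.lstsq(X[idx], y[idx], rcond=None)[0]
    return betas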