Large deviations for bootstrapped empirical measures

José Trashorras and Olivier Wintenberger (arXiv: Probability)
We investigate the Large Deviations properties of bootstrapped empirical measures with exchangeable weights. Our main result shows in great generality how the resulting rate function combines the LD properties of both the sample weights and the observations. As an application we recover known conditional and unconditional LDPs and obtain some new ones.

A weighted bootstrap procedure for divergence minimization problems

This work proposes a weighted bootstrap procedure with an explicit construction of the bootstrap weights, thereby replacing a variational problem in the space of measures by a simple Monte Carlo procedure.
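
The Monte Carlo replacement of the variational problem can be sketched concretely; the following is a minimal illustration, not the paper's actual scheme, using exchangeable Dirichlet(1, …, 1) weights (the function name, the sample, and the statistic are all hypothetical choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def weighted_bootstrap(sample, statistic, n_rep=1000):
    """Exchangeably weighted (Bayesian-style) bootstrap replicates.

    Instead of resampling observations, each replicate reweights the
    sample with Dirichlet(1, ..., 1) weights, which are exchangeable
    and sum to one -- a simple Monte Carlo stand-in for an optimization
    over the space of measures.
    """
    n = len(sample)
    reps = []
    for _ in range(n_rep):
        w = rng.dirichlet(np.ones(n))   # exchangeable random weights
        reps.append(statistic(sample, w))
    return np.asarray(reps)

sample = rng.normal(loc=2.0, scale=1.0, size=200)
weighted_mean = lambda x, w: float(np.dot(w, x))
reps = weighted_bootstrap(sample, weighted_mean)
# reps approximates the sampling distribution of the mean
```

Because each weight vector sums to one, every replicate is itself a probability measure supported on the sample points, so no explicit optimization over measures is needed.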

Weighted Sampling, Maximum Likelihood and Minimum Divergence Estimators

This paper explores Maximum Likelihood estimation in parametric models in the framework of Sanov-type Large Deviation Probabilities.

Contributions à la statistique des processus : estimation, prédiction et extrêmes (Contributions to the statistics of processes: estimation, prediction and extremes)

This habilitation manuscript presents my research work on statistics for weakly dependent processes. Asymptotic results for the Quasi-Maximum Likelihood Estimator in general affine models are given.

Sup-Sums Principles for F-Divergence and a New Definition for t-Entropy

New sup-sums principles for the integral F-divergence are presented, for arbitrary convex functions F on the whole real axis and arbitrary measures, together with a new 'integral' definition of t-entropy that explicitly establishes its relation to the Kullback–Leibler divergence.
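
For concreteness, the classical density-ratio form of the F-divergence can be computed directly for discrete measures; this is a minimal sketch of that standard integral form, not the paper's sup-sums representation, and the function name and example distributions are illustrative:

```python
import numpy as np

def f_divergence(p, q, f):
    """Integral f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i)
    for discrete distributions with q_i > 0 and convex f with f(1) = 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

# f(t) = t log t gives the Kullback-Leibler divergence; note this f
# needs p_i > 0 (extend by t*log t -> 0 as t -> 0 if needed)
kl = lambda t: t * np.log(t)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
d = f_divergence(p, q, kl)   # equals sum_i p_i * log(p_i / q_i)
```

With f(t) = t log t the generic formula reduces term by term to the familiar Kullback–Leibler sum.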

A Unifying Framework for Some Directed Distances in Statistics

This paper provides a general framework which covers in particular both the above-mentioned density-based and distribution-function-based divergence approaches; the dissimilarity of quantiles, and of other statistical functionals, is included as well.

Minimum Divergence Estimators, Maximum Likelihood and the Generalized Bootstrap

This paper states that most commonly used minimum divergence estimators are MLEs for suitable generalized bootstrapped sampling schemes. Optimality in the sense of Bahadur for the associated tests of fit…



Exchangeably Weighted Bootstraps of the General Empirical Process

We consider an exchangeably weighted bootstrap of the general function-indexed empirical process. We find sufficient conditions on the bootstrap weights for the central limit theorem to hold for the…
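
A familiar special case of exchangeable weights is Efron's multinomial bootstrap; the following minimal sketch (names and parameters are illustrative, not the paper's construction) bootstraps the sup-norm of the empirical process over the class of half-line indicators:

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_ks(sample, n_rep=500):
    """Bootstrap sqrt(n) * sup_x |F*_n(x) - F_n(x)| with multinomial
    bootstrap weights (Efron's bootstrap as an exchangeable scheme).

    Both CDFs jump only at the data points, so the supremum over x is
    attained among the cumulative differences at the order statistics.
    """
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    F_n = np.arange(1, n + 1) / n                     # empirical CDF at order stats
    stats = []
    for _ in range(n_rep):
        w = rng.multinomial(n, np.ones(n) / n) / n    # exchangeable weights, sum to 1
        F_star = np.cumsum(w)                         # weighted empirical CDF
        stats.append(np.max(np.abs(F_star - F_n)))
    return np.sqrt(n) * np.asarray(stats)

sample = rng.normal(size=200)
ks_reps = bootstrap_ks(sample)
```

Under the CLT for the bootstrapped empirical process, these scaled statistics approximate a Kolmogorov-type limit law.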

Large deviations for subsampling from individual sequences


In this paper, we study the problem of large deviations for measures with random weights. We are motivated by previous work dealing with the special case occurring in the statistical mechanics of the…

Large and moderate deviations for matching problems and empirical discrepancies

We study the two-sample matching problem and its connections with the Monge-Kantorovich problem of optimal transportation of mass. We exploit this connection to obtain moderate and large deviation…
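
In one dimension the connection is especially transparent: for convex costs the optimal transport plan matches order statistics. A small illustrative sketch under that well-known fact (function name and samples are hypothetical, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(2)

def matching_cost(x, y):
    """Optimal two-sample matching cost on the real line.

    For equal-size samples and cost |x - y|, the Monge-Kantorovich
    optimal matching pairs the i-th smallest of x with the i-th
    smallest of y, so the minimal average cost is the mean distance
    between the sorted samples.
    """
    return float(np.mean(np.abs(np.sort(x) - np.sort(y))))

x = rng.uniform(size=1000)
y = rng.uniform(size=1000)
cost = matching_cost(x, y)   # shrinks toward 0 as the sample size grows
```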

Some Limit Theorems in Statistics

Contents: Moment-Generating Functions; Chernoff's Theorem; The Kullback–Leibler Information Number; Some Examples of Large Deviation Probabilities; Stein's Lemma; Asymptotic Effective Variances; Exact Slopes of…
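
The Chernoff's-theorem entry can be illustrated numerically; a minimal sketch (not taken from the book) bounding an upper binomial tail with the Kullback–Leibler rate function:

```python
import numpy as np

def chernoff_bound(a, p=0.5, n=100):
    """Chernoff bound P(S_n / n >= a) <= exp(-n * I(a)) for S_n a sum of
    n i.i.d. Bernoulli(p) variables, where

        I(a) = a log(a/p) + (1-a) log((1-a)/(1-p))

    is the Kullback-Leibler rate function appearing in Chernoff's theorem.
    Valid for p < a < 1."""
    I = a * np.log(a / p) + (1 - a) * np.log((1 - a) / (1 - p))
    return float(np.exp(-n * I))

# P(at least 70 heads in 100 fair flips) is bounded by exp(-100 * I(0.7))
bound = chernoff_bound(0.7)
```

The true tail probability is smaller still; the bound is tight only on the exponential scale, which is exactly the large-deviations statement.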

Large Deviations Techniques and Applications

The LDP for abstract empirical measures and its applications, the finite-dimensional case, and applications of the empirical-measure LDP are presented.

Large Deviations for Processes with Independent Increments

Abstract: The establishment of the large deviation principle (LDP) has had important implications in various areas of statistics. It has been used to obtain the asymptotic efficiencies of tests and…


In this paper we obtain some useful properties of the Kullback–Leibler (K-L) number. For example, we show that the K-L number is jointly lower semi-continuous in both arguments, on the class…

Stein’s method for the bootstrap

This paper gives new proofs for many known results about the convergence in law of the bootstrap distribution to the true distribution of smooth statistics, whether the samples studied come from…

Bootstrap relative errors and sub-exponential distributions

For the purposes of this paper, a distribution is sub-exponential if it has finite variance but its moment generating function is infinite on at least one side of the origin. The principal aim here…
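
The definition can be checked numerically for the lognormal law, a standard example with all moments finite but an infinite MGF for every t > 0. A small sketch under these assumptions (grid integration; function name and parameters are illustrative):

```python
import numpy as np

def truncated_mgf(t, M, sigma=1.0, n_grid=200_000):
    """Riemann-sum approximation of E[exp(t X) 1{X <= M}] for X
    lognormal(0, sigma).

    The lognormal has finite variance, yet for any t > 0 the truncated
    integral keeps growing without bound as the cutoff M increases:
    its MGF is infinite on the positive side of the origin, so it is
    sub-exponential in the sense of this paper.
    """
    x = np.linspace(1e-9, M, n_grid)
    dx = x[1] - x[0]
    pdf = np.exp(-np.log(x) ** 2 / (2 * sigma ** 2)) / (x * sigma * np.sqrt(2 * np.pi))
    return float(np.sum(np.exp(t * x) * pdf) * dx)

# the truncated MGF explodes instead of converging as M grows
vals = [truncated_mgf(1.0, M) for M in (10, 50, 100, 200)]
```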