A principled stopping rule for importance sampling

@article{Agarwal2021APS,
  title={A principled stopping rule for importance sampling},
  author={Medha Agarwal and Dootika Vats and V{\'i}ctor Elvira},
  journal={Electronic Journal of Statistics},
  year={2021}
}
Importance sampling (IS) is a Monte Carlo technique that relies on weighted samples, simulated from a proposal distribution, to estimate intractable integrals. The quality of the estimators improves with the number of samples. However, for achieving a desired quality of estimation, the required number of samples is unknown and depends on the quantity of interest, the estimator, and the chosen proposal. We present a sequential stopping rule that terminates simulation when the overall variability… 
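The idea in the abstract can be sketched with self-normalized IS and a naive relative fixed-width criterion. This is not the paper's rule: the target, proposal, tolerance, batch size, and the 1.96 critical value below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: target N(0, 1) known up to a constant, proposal N(0, 2).
log_target = lambda x: -0.5 * x**2
prop_sd = np.sqrt(2.0)
log_prop = lambda x: -0.5 * (x / prop_sd) ** 2 - np.log(prop_sd)
h = lambda x: x**2          # estimate E[X^2] = 1 under the target

def snis_with_stopping(batch=10_000, eps=0.02, max_n=2_000_000):
    """Grow the sample in batches; stop when the (delta-method) confidence
    half-width is small relative to the current estimate."""
    xs, logws = [], []
    while True:
        x = rng.normal(0.0, prop_sd, size=batch)
        xs.append(x)
        logws.append(log_target(x) - log_prop(x))
        lw = np.concatenate(logws)
        w = np.exp(lw - lw.max())            # stabilize before normalizing
        w /= w.sum()
        hx = h(np.concatenate(xs))
        mu = np.sum(w * hx)                  # self-normalized IS estimate
        # delta-method standard error of the self-normalized estimator
        se = np.sqrt(np.sum(w**2 * (hx - mu) ** 2))
        n = lw.size
        if 1.96 * se <= eps * abs(mu) or n >= max_n:
            return mu, se, n

mu, se, n = snis_with_stopping()
print(mu, se, n)   # mu should be close to 1
```

The sequential character is the point: the loop keeps drawing batches until the estimated variability, not a pre-fixed sample size, says to stop.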


References

Showing 1-10 of 45 references

Pareto Smoothed Importance Sampling

Importance weighting is a general way to adjust Monte Carlo integration to account for draws from the wrong distribution, but the resulting estimate can be noisy when the importance ratios have a…

Advances in Importance Sampling

The basic IS algorithm is described and recent advances in the methodology are revisited, focusing on multiple IS (MIS), the setting where more than one proposal is available.

A Comparison Of Clipping Strategies For Importance Sampling

This paper considers the clipping transformation and tests its robustness with respect to the choice of the clipping value, and proposes a novel NIS methodology where not only a subset of weights is modified a posteriori, but the corresponding samples are also moved.

Layered adaptive importance sampling

This work introduces a layered procedure to generate samples employed within a Monte Carlo scheme, which ensures that an appropriate equivalent proposal density is always obtained automatically (thus eliminating the risk of catastrophic performance), at the expense of a moderate increase in complexity.

Rethinking the Effective Sample Size

This paper revisits the approximation of the ESS in the specific context of importance sampling, and shows that the multiple assumptions and approximations in the derivation of the standard ESS estimator make it difficult to justify even as a reasonable approximation.
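The estimator this snippet critiques is the classical one built from normalized weights, ESS = (Σ w)² / Σ w². A minimal sketch (the log-weight inputs are illustrative):

```python
import numpy as np

def ess_hat(logw):
    """Classical ESS approximation (sum w)^2 / (sum w^2),
    computed from unnormalized log-weights."""
    w = np.exp(logw - np.max(logw))   # subtract max for numerical stability
    return w.sum() ** 2 / np.sum(w ** 2)

# Equal weights -> ESS equals the sample size.
e_equal = ess_hat(np.zeros(100))
# One dominant weight -> ESS collapses toward 1.
e_degenerate = ess_hat(np.array([0.0, -50.0, -50.0]))
print(e_equal, e_degenerate)
```

The two extremes behave as expected, but the snippet's point is that between them this single number can be a poor proxy for estimator quality.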

Relative fixed-width stopping rules for Markov chain Monte Carlo simulations

Markov chain Monte Carlo (MCMC) simulations are commonly employed for estimating features of a target distribution, particularly for Bayesian inference. A fundamental challenge is determining when…

Population Monte Carlo

The population Monte Carlo principle consists of iterated generations of importance samples, with importance functions depending on the previously generated sample. The scheme can be iterated like MCMC algorithms while being more robust to dependence and starting values, as shown in this paper.
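A bare-bones sketch of the iterated-importance-sampling idea, under assumed choices (Gaussian target and proposals, adaptation by multinomial resampling); the real PMC literature covers much richer variants:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed target: N(3, 1), known up to a constant.
log_target = lambda x: -0.5 * (x - 3.0) ** 2

def pmc(n_iter=20, pop=500, prop_sd=1.0):
    """Each iteration: draw one sample per proposal location, weight it by
    target/proposal, then resample locations for the next generation."""
    mu = rng.normal(0.0, 5.0, size=pop)          # initial proposal locations
    for _ in range(n_iter):
        x = rng.normal(mu, prop_sd)              # one draw per proposal
        # log importance weight (constants cancel after normalization)
        logw = log_target(x) + 0.5 * ((x - mu) / prop_sd) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(pop, size=pop, p=w)     # multinomial resampling
        mu = x[idx]                              # adapt next generation
    return x, w

x, w = pmc()
est = np.sum(w * x)
print(est)   # should be near the target mean 3
```

The final weighted population is a valid IS sample regardless of how the locations were adapted, which is the robustness the snippet alludes to.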

Improving population Monte Carlo: Alternative weighting and resampling schemes

Generalized Multiple Importance Sampling

This paper establishes a general framework for sampling and weighting procedures when more than one proposal is available; the most relevant MIS schemes in the literature are encompassed within the new framework, and, moreover, novel valid schemes arise naturally.

Fixed-Width Sequential Stopping Rules for a Class of Stochastic Programs

This paper develops stopping rules for sequential sampling procedures that depend on the width of an optimality-gap confidence interval estimator, and presents a method that takes the schedule of sample sizes as an input and provides guidelines on growing the sample sizes.