Adaptive approximate Bayesian computation

@article{Beaumont2009AdaptiveAB,
  title={Adaptive approximate Bayesian computation},
  author={Mark A. Beaumont and Jean-Marie Cornuet and Jean-Michel Marin and Christian P. Robert},
  journal={Biometrika},
  year={2009},
  volume={96},
  pages={983-990}
}
Sequential techniques can enhance the efficiency of the approximate Bayesian computation algorithm, as in Sisson et al.'s (2007) partial rejection control version. While this method is based upon the theoretical works of Del Moral et al. (2006), the application to approximate Bayesian computation results in a bias in the approximation to the posterior. An alternative version based on genuine importance sampling arguments bypasses this difficulty, in connection with the population Monte Carlo…
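The importance-sampling correction described in the abstract amounts to an ABC version of population Monte Carlo: at each generation, particles are resampled from the previous weighted sample, perturbed with a Gaussian kernel whose variance is twice the weighted empirical variance of the previous particles, accepted only if the simulated summary falls within the current tolerance, and reweighted by the ratio of the prior density to the kernel mixture density. The sketch below illustrates this scheme on a toy Gaussian-mean model; the model, prior, summary statistic and tolerance schedule are illustrative assumptions, not the paper's numerical study.

```python
# Minimal ABC-PMC sketch (toy example, not the paper's experiments):
# infer the mean theta of N(theta, 1) data from its sample mean.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

y_obs = rng.normal(2.0, 1.0, size=50)   # synthetic "observed" data (assumption)
s_obs = y_obs.mean()                     # summary statistic: sample mean

def simulate(theta):
    """Simulate from the toy model N(theta, 1) and return its summary."""
    return rng.normal(theta, 1.0, size=50).mean()

def distance(s):
    return abs(s - s_obs)

prior = norm(0.0, 10.0)            # wide Gaussian prior on theta (assumption)
epsilons = [1.0, 0.5, 0.25, 0.1]   # decreasing tolerance schedule (assumption)
N = 1000

# Generation 0: plain rejection ABC from the prior, equal weights.
particles = np.empty(N)
for i in range(N):
    while True:
        th = prior.rvs(random_state=rng)
        if distance(simulate(th)) <= epsilons[0]:
            particles[i] = th
            break
weights = np.full(N, 1.0 / N)

# Later generations: resample from the previous weighted sample, perturb with
# a Gaussian kernel of twice the weighted empirical variance, accept within
# the current tolerance, and reweight by prior / kernel mixture.
for eps in epsilons[1:]:
    tau2 = 2.0 * np.cov(particles, aweights=weights)
    new_particles = np.empty(N)
    for i in range(N):
        while True:
            th_star = rng.choice(particles, p=weights)
            th = rng.normal(th_star, np.sqrt(tau2))
            if distance(simulate(th)) <= eps:
                new_particles[i] = th
                break
    # Importance weight: prior density over the mixture of Gaussian kernels
    # centred at the previous particles (the bias-correcting step).
    kernel = norm.pdf(new_particles[:, None], loc=particles[None, :],
                      scale=np.sqrt(tau2))
    new_weights = prior.pdf(new_particles) / (kernel @ weights)
    particles = new_particles
    weights = new_weights / new_weights.sum()

print("ABC-PMC posterior mean estimate:", np.sum(weights * particles))
```

The kernel-mixture denominator in the weight update is what restores a proper importance-sampling identity, which is the correction the abstract contrasts with the partial rejection control version.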

Adaptive Approximate Bayesian Computation Tolerance Selection
TLDR
A method for adaptively selecting a sequence of tolerances that improves the computational efficiency of the algorithm over other common techniques is proposed, and a stopping rule is defined as a by-product of the adaptation procedure, which assists in automating the termination of sampling.
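The cited paper defines its own adaptation and stopping rules; as a rough illustration of the general idea only, a common quantile-based variant (an assumption here, not necessarily that paper's rule) sets the next tolerance to a quantile of the distances accepted in the current generation and stops once the tolerance barely decreases:

```python
# Hypothetical quantile-based tolerance update and stopping rule, shown only
# to illustrate adaptive tolerance selection in ABC; not the cited paper's rule.
import numpy as np

def next_tolerance(accepted_distances, current_eps, q=0.5, rel_tol=0.01):
    """Return the next ABC tolerance and whether to stop sampling."""
    eps_new = float(np.quantile(accepted_distances, q))
    stop = (current_eps - eps_new) / current_eps < rel_tol
    return eps_new, stop
```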
A consistent statistical model selection for abrupt glacial climate changes
The most pronounced mode of climate variability during the last glacial period are the so-called Dansgaard–Oeschger events. There is no consensus of the underlying dynamical mechanism of these abrupt …
Optimal proposals for Approximate Bayesian Computation.
We derive the optimal proposal density for Approximate Bayesian Computation (ABC) using Sequential Monte Carlo (SMC) (or Population Monte Carlo, PMC). The criterion for optimality is that the …
Adaptive approximate Bayesian computation for complex models
TLDR
A new approximate Bayesian computation (ABC) algorithm is proposed that aims at minimizing the number of model runs needed to reach a given quality of the posterior approximation, and that makes use of an easily interpretable stopping criterion.
Maximum likelihood parameter estimation in time series models using sequential Monte Carlo
Time series models are used to characterise uncertainty in many real-world dynamical phenomena. A time series model typically contains a static variable, called a parameter, which parametrizes the …
Initialize and Calibrate a Dynamic Stochastic Microsimulation Model: application to the SimVillages Model (Initialiser et calibrer un modèle de microsimulation dynamique stochastique : application au modèle SimVillages)
TLDR
Developing statistical tools to initialize and to calibrate dynamic stochastic microsimulation models, starting from their application to the SimVillages model, which includes demographic and economic dynamics applied to the population of a set of rural municipalities.
Likelihood-Free Bayesian Modeling
Bayesian modeling has been influential in cognitive science. However, many psychological models of behavior have difficult or intractable likelihood functions. This poses a major problem for Bayesian …
Statistical inference on evolutionary processes in Alpine ibex (Capra ibex): mutation, migration and selection
TLDR
The fundamental processes of evolution – mutation, recombination, selection, gene flow and genetic drift – are reviewed, an overview of Bayesian inference in statistical population genetics is given, and a particular focus is devoted to approximate Bayesian computation (ABC) in Chapter 2.
Using Approximate Bayesian Computation to Estimate Transmission Rates of Nosocomial Pathogens
In this paper, we apply a simulation-based approach for estimating transmission rates of nosocomial pathogens. In particular, the objective is to infer the transmission rate between colonised …

References

Showing 1–10 of 19 references
Sequential Monte Carlo without likelihoods
TLDR
This work proposes a sequential Monte Carlo sampler that convincingly overcomes inefficiencies of existing methods and demonstrates its implementation through an epidemiological study of the transmission rate of tuberculosis.
Approximate Bayesian computation in population genetics.
TLDR
A key advantage of the method is that the nuisance parameters are automatically integrated out in the simulation step, so that the large numbers of nuisance parameters that arise in population genetics problems can be handled without difficulty.
Adaptive importance sampling in general mixture classes
TLDR
An adaptive algorithm is proposed that iteratively updates both the weights and component parameters of a mixture importance sampling density so as to optimise the performance of importance sampling, as measured by an entropy criterion.
Inferring population history with DIY ABC: a user-friendly approach to approximate Bayesian computation
TLDR
Key methods used in DIY ABC, a computer program for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples, are described.
Convergence of adaptive mixtures of importance sampling schemes
TLDR
This work derives sufficient convergence conditions for adaptive mixtures of population Monte Carlo algorithms and shows that Rao–Blackwellized versions asymptotically achieve an optimum in terms of a Kullback divergence criterion, while more rudimentary versions do not benefit from repeated updating.
Inference for Stereological Extremes
In the production of clean steels, the occurrence of imperfections—so-called “inclusions”—is unavoidable. The strength of a clean steel block is largely dependent on the size of the largest …
Sequential Monte Carlo samplers for rare events
We present novel sequential Monte Carlo (SMC) algorithms for the simulation of two broad classes of rare events, which are suitable for the estimation of tail probabilities and probability …
Convergence of Adaptive Sampling Schemes
TLDR
This work derives sufficient convergence conditions for a wide class of population Monte Carlo algorithms and shows that Rao–Blackwellized versions asymptotically achieve an optimum in terms of a Kullback divergence criterion, while more rudimentary versions simply do not benefit from repeated updating.
Population Monte Carlo
Importance sampling methods can be iterated like MCMC algorithms, while being more robust against dependence and starting values. The population Monte Carlo principle consists of iterated generations …