A Repelling–Attracting Metropolis Algorithm for Multimodality

@article{tak2018repelling,
  title={A Repelling--Attracting Metropolis Algorithm for Multimodality},
  author={Hyungsuk Tak and Xiao-Li Meng and David A. van Dyk},
  journal={Journal of Computational and Graphical Statistics},
  year={2018},
  pages={479--490}
}
Abstract: Although the Metropolis algorithm is simple to implement, it often has difficulties exploring multimodal distributions. We propose the repelling–attracting Metropolis (RAM) algorithm, which maintains the simple-to-implement nature of the Metropolis algorithm but is more likely to jump between modes. The RAM algorithm is a Metropolis-Hastings algorithm with a proposal that consists of a downhill move in density that aims to make local modes repelling, followed by an uphill move in…
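The downhill-then-uphill proposal described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's full algorithm: the forced-acceptance loops and the (π + ε) regularization follow the abstract's description, but the auxiliary-variable step that RAM uses to obtain an exact Metropolis-Hastings acceptance probability is omitted, and all function names and tuning constants here are illustrative.

```python
import numpy as np

def ram_proposal(x, log_pi, rng, scale=1.0, eps=1e-10, max_tries=1000):
    """One repelling-attracting proposal: a forced downhill move in density
    (away from the local mode), then a forced uphill move toward a
    possibly different mode. Sketch only; the exactness correction of the
    full RAM algorithm is omitted."""
    log_eps = np.log(eps)
    # Downhill: repeat until a move is accepted with probability
    # min(1, (pi(x)+eps)/(pi(z')+eps)), which favors *lower*-density points.
    z = None
    for _ in range(max_tries):
        z_prop = x + scale * rng.standard_normal(x.shape)
        log_ratio = np.logaddexp(log_pi(x), log_eps) - np.logaddexp(log_pi(z_prop), log_eps)
        if np.log(rng.uniform()) < min(0.0, log_ratio):
            z = z_prop
            break
    if z is None:
        return x
    # Uphill: repeat until a move is accepted with probability
    # min(1, (pi(y)+eps)/(pi(z)+eps)), which favors *higher*-density points.
    for _ in range(max_tries):
        y = z + scale * rng.standard_normal(z.shape)
        log_ratio = np.logaddexp(log_pi(y), log_eps) - np.logaddexp(log_pi(z), log_eps)
        if np.log(rng.uniform()) < min(0.0, log_ratio):
            return y
    return x

# Illustrative use on a standard bivariate normal target.
rng = np.random.default_rng(0)
log_pi = lambda v: -0.5 * float(v @ v)
y = ram_proposal(np.zeros(2), log_pi, rng)
```

Working in log space with `logaddexp` keeps the (π + ε) ratios numerically stable when the density underflows between distant modes.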
Adaptive Multiple-Try MCMC (MCMC adaptatifs à essais multiples)
MCMC methods, together with their adaptive and multiple-try extensions, are reviewed in order to ground the study of the proposed adaptive Multiple-Try Metropolis (aMTM) algorithm.
Sampling from multimodal distributions using tempered Hamiltonian transitions
This paper develops a Hamiltonian Monte Carlo method whose constructed paths can travel across high potential-energy barriers, yielding globally mixing Markov chains that target high-dimensional, multimodal distributions; the method is demonstrated on mixtures of normals and a sensor-network localization problem.
A framework for adaptive MCMC targeting multimodal distributions
A new Monte Carlo method for sampling from multimodal distributions is proposed, based on splitting the task in two: finding the modes of the target distribution, and then sampling given knowledge of the mode locations.
A Metropolis-class sampler for targets with non-convex support
Theoretical and numerical evidence of improved performance relative to random-walk Metropolis is provided, and numerical examples, including applications to global optimisation and rare-event sampling, are presented.
Irreversible samplers from jump and continuous Markov processes
This paper shows how the previously proposed MALA method can be extended to exploit irreversible stochastic dynamics as proposal distributions in the I-Jump sampler, and explores how irreversibility can increase sampler efficiency in different situations.
Stochastic approximation Hamiltonian Monte Carlo
A Stochastic Approximation Hamiltonian Monte Carlo (SAHMC) algorithm is proposed for generating samples from multimodal densities under the Hamiltonian Monte Carlo framework; it adaptively lowers the energy barrier so that the Hamiltonian trajectory moves between modes more frequently and more easily.
The Skipping Sampler: A new approach to sample from complex conditional densities
We introduce the Skipping Sampler, a novel algorithm to efficiently sample from the restriction of an arbitrary probability density to an arbitrary measurable set.
Multimodal information gain in Bayesian design of experiments
It is shown that the novel global-local multimodal approach can be significantly more accurate and more efficient than the other existing approaches, especially when the number of modes is large.
Pseudo-Extended Markov chain Monte Carlo
The pseudo-extended MCMC method is introduced as a simple approach for improving the mixing of MCMC samplers on multimodal posterior distributions by augmenting the state space of the posterior with pseudo-samples as auxiliary variables.
Globally-centered autocovariances in MCMC
Autocovariances are a fundamental quantity of interest in Markov chain Monte Carlo (MCMC) simulations, with autocorrelation function (ACF) plots serving as an integral visualization tool for assessing sampler performance.
Wormhole Hamiltonian Monte Carlo
This work proposes a novel Bayesian inference approach based on Markov Chain Monte Carlo that can effectively sample from multimodal distributions, especially when the dimension is high and the modes are isolated.
Delayed rejection schemes for efficient Markov-Chain Monte-Carlo sampling of multimodal distributions
A number of problems in a variety of fields are characterised by target distributions with a multimodal structure, in which the presence of several isolated local maxima dramatically reduces the efficiency of standard sampling methods.
Sampling from multimodal distributions using tempered transitions
A new Markov chain sampling method appropriate for distributions with isolated modes that uses a series of distributions that interpolate between the distribution of interest and a distribution for which sampling is easier, with the advantage that it does not require approximate values for the normalizing constants of these distributions.
Tuning tempered transitions
This work considers how the tempered transitions algorithm may be tuned to increase the acceptance rates for a given number of temperatures and finds that the commonly assumed geometric spacing of temperatures is reasonable in many but not all applications.
Understanding the Metropolis-Hastings Algorithm
Abstract: We provide a detailed, introductory exposition of the Metropolis-Hastings algorithm, a powerful Markov chain method to simulate multivariate distributions, together with a simple, intuitive derivation of the algorithm.
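As a companion to this expository reference, the algorithm it describes can be sketched in a few lines. This is a minimal random-walk variant assuming a symmetric Gaussian proposal, so the Hastings correction cancels and the acceptance ratio reduces to π(x′)/π(x); the function name and defaults are illustrative.

```python
import numpy as np

def metropolis_hastings(log_pi, x0, n_steps, scale=1.0, seed=0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal.
    The acceptance test is done in log space for numerical stability."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for t in range(n_steps):
        x_prop = x + scale * rng.standard_normal(x.shape)
        # Accept with probability min(1, pi(x') / pi(x)).
        if np.log(rng.uniform()) < log_pi(x_prop) - log_pi(x):
            x = x_prop
        samples[t] = x
    return samples

# Illustrative use: sample a standard 1-D normal; the empirical mean
# should settle near 0.
draws = metropolis_hastings(lambda x: -0.5 * float(x @ x), np.zeros(1), 5000)
```

With a symmetric proposal the simulated chain leaves the target invariant, which is the core property the exposition derives.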
Annealing Markov chain Monte Carlo with applications to ancestral inference
This work proposes MCMC methods distantly related to simulated annealing, which simulate realizations from a sequence of distributions, allowing the distribution being simulated to vary randomly over time.
Mode Jumping Proposals in MCMC
Markov chain Monte Carlo algorithms generate samples from a target distribution by simulating a Markov chain, and large flexibility exists in the specification of the transition matrix of the chain.
Monte Carlo Sampling Methods Using Markov Chains and Their Applications
Summary: A generalization of the sampling method introduced by Metropolis et al. (1953) is presented, along with an exposition of the relevant theory and techniques of application.
Equi-energy sampler with applications in statistical inference and statistical mechanics
Kou, Zhou and Wong have introduced a novel sampling method, the equi-energy sampler, which could contribute significantly to the field of structural prediction; a very closely related method is multicanonical sampling (MCS).
Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
  • S. Geman, D. Geman
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 1984
The analogy between images and statistical-mechanics systems is made, and the analogous operation under the posterior distribution is shown to yield the maximum a posteriori (MAP) estimate of the image given the degraded observations, creating a highly parallel "relaxation" algorithm for MAP estimation.