Enhanced gradient-based MCMC in discrete spaces

Benjamin Rhodes, Michael U. Gutmann
The recent introduction of gradient-based Markov chain Monte Carlo (MCMC) for discrete spaces holds great promise, and comes with the tantalising possibility of new discrete counterparts to celebrated continuous methods such as the Metropolis-adjusted Langevin algorithm (MALA). Towards this goal, we introduce several discrete Metropolis-Hastings samplers that are conceptually inspired by MALA, and demonstrate their strong empirical performance across a range of challenging sampling problems in…
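The core idea sketched above, using gradient information at the current binary state to bias which coordinate to flip, then applying a Metropolis-Hastings correction, can be illustrated with a minimal toy sampler. The target below (a quadratic log-density over `{0,1}^d` with invented `W` and `b`) and all function names are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: log p(x) ∝ b·x + x·W x over x ∈ {0,1}^d (W, b invented for illustration).
d = 8
W = rng.normal(scale=0.3, size=(d, d)); W = (W + W.T) / 2
b = rng.normal(size=d)

def log_p(x):
    return b @ x + x @ W @ x

def grad_log_p(x):
    # Gradient of the continuous extension of log p, evaluated at the binary point x.
    return b + 2 * W @ x

def mala_style_step(x):
    # First-order estimate of how log p changes if bit i is flipped.
    delta = (1 - 2 * x) * grad_log_p(x)
    q = np.exp(0.5 * delta - np.logaddexp.reduce(0.5 * delta))  # softmax over bits
    i = rng.choice(d, p=q)
    y = x.copy(); y[i] = 1 - y[i]
    # Reverse-proposal probability for the Metropolis-Hastings correction.
    delta_y = (1 - 2 * y) * grad_log_p(y)
    q_back = np.exp(0.5 * delta_y - np.logaddexp.reduce(0.5 * delta_y))
    log_alpha = log_p(y) - log_p(x) + np.log(q_back[i]) - np.log(q[i])
    return y if np.log(rng.uniform()) < log_alpha else x

x = rng.integers(0, 2, size=d).astype(float)
for _ in range(200):
    x = mala_style_step(x)
```

The single gradient evaluation scores all `d` candidate flips at once, which is what makes this family of proposals cheap relative to exhaustively evaluating each neighbour.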


Discrete Langevin Sampler via Wasserstein Gradient Flow

This work shows how the Wasserstein gradient flow can be generalized naturally to discrete spaces and reveals how recent gradient-based samplers in discrete spaces can be obtained as special cases by choosing particular discretizations.

An approximate sampler for energy-based models with divergence diagnostics

Energy-based models (EBMs) allow flexible specifications of probability distributions. However, sampling from EBMs is non-trivial, usually requiring approximate techniques such as Markov chain Monte Carlo (MCMC)…

Illuminating protein space with a programmable generative model

Chroma is introduced, a generative model for proteins and protein complexes that can directly sample novel protein structures and sequences and can be conditioned to steer the generative process towards desired properties and functions; it is suggested that Chroma can effectively realize protein design as Bayesian inference under external constraints.

Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC

This work introduces a new approach to sampling from discrete distributions based on augmenting Monte Carlo methods with SurVAE Flows, combining neural transport methods such as normalizing flows and variational dequantization with the Metropolis-Hastings rule.

Informed Proposals for Local MCMC in Discrete Spaces

  • Giacomo Zanella
  • Journal of the American Statistical Association
  • 2019
This work proposes a simple framework for the design of informed MCMC proposals (i.e., Metropolis–Hastings proposal distributions that appropriately incorporate local information about the target) which is naturally applicable to discrete spaces and provides orders of magnitude improvements in efficiency compared to alternative MCMC schemes.
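The "informed proposal" idea described above can be sketched concretely: score each single-flip neighbour `y` of the current state `x` with a balancing function `g(π(y)/π(x))`, here `g(t) = √t`, and apply a Metropolis-Hastings correction. The toy target and parameter `theta` below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target over x ∈ {0,1}^d; theta is an arbitrary illustrative parameter.
d = 6
theta = rng.normal(size=d)

def log_pi(x):
    return theta @ x

def balanced_proposal(x):
    # Enumerate all single-bit-flip neighbours of x.
    flips = np.tile(x, (d, 1))
    flips[np.arange(d), np.arange(d)] = 1 - flips[np.arange(d), np.arange(d)]
    # Balancing function g(t) = sqrt(t): log g = 0.5 * (log pi(y) - log pi(x)).
    log_w = 0.5 * (np.array([log_pi(y) for y in flips]) - log_pi(x))
    w = np.exp(log_w - log_w.max())
    return flips, w / w.sum()

def mh_step(x):
    flips, q = balanced_proposal(x)
    i = rng.choice(d, p=q)
    y = flips[i]
    _, q_back = balanced_proposal(y)
    log_alpha = log_pi(y) - log_pi(x) + np.log(q_back[i]) - np.log(q[i])
    return y if np.log(rng.uniform()) < log_alpha else x

x = rng.integers(0, 2, size=d)
for _ in range(100):
    x = mh_step(x)
```

Enumerating all neighbours costs `d` target evaluations per step; the gradient-based samplers elsewhere on this page can be read as cheap approximations of exactly this kind of locally informed proposal.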

A Complete Recipe for Stochastic Gradient MCMC

This paper provides a general recipe for constructing MCMC samplers, including stochastic gradient versions, based on continuous Markov processes specified via two matrices, and uses the recipe to straightforwardly propose a new state-adaptive sampler: stochastic gradient Riemannian Hamiltonian Monte Carlo (SGRHMC).

A Langevin-like Sampler for Discrete Distributions

DLP outperforms many popular alternatives on a wide variety of tasks, including Ising models, restricted Boltzmann machines, deep energy-based models, binary neural networks and language generation, and it is hypothesized that DULA needs to run for more steps per iteration to get better results.
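A Langevin-like proposal for binary variables can be sketched as follows: flip every coordinate independently, with per-coordinate flip probabilities driven by one gradient evaluation, then (in the Metropolis-adjusted variant) accept or reject the whole joint move. The quadratic toy target, step size `alpha`, and all names below are assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy differentiable target f(x) = log p(x) up to a constant, x ∈ {0,1}^d (W, b invented).
d = 10
W = rng.normal(scale=0.2, size=(d, d)); W = (W + W.T) / 2
b = rng.normal(size=d)

def f(x):
    return b @ x + x @ W @ x

def grad_f(x):
    return b + 2 * W @ x

alpha = 0.2  # step size

def flip_log_probs(x):
    # Factorized Langevin-like proposal: flip bit i with probability
    # sigmoid(0.5 * grad_i f(x) * (1 - 2 x_i) - 1 / (2 alpha)).
    logits = 0.5 * grad_f(x) * (1 - 2 * x) - 1 / (2 * alpha)
    log_p_flip = -np.logaddexp(0, -logits)
    log_p_stay = -np.logaddexp(0, logits)
    return log_p_flip, log_p_stay

def adjusted_step(x):
    log_p_flip, log_p_stay = flip_log_probs(x)
    flip = np.log(rng.uniform(size=d)) < log_p_flip
    y = np.where(flip, 1 - x, x)
    # Forward and reverse proposal probabilities of this exact set of flips.
    log_q_fwd = np.where(flip, log_p_flip, log_p_stay).sum()
    log_p_flip_y, log_p_stay_y = flip_log_probs(y)
    log_q_bwd = np.where(flip, log_p_flip_y, log_p_stay_y).sum()
    log_acc = f(y) - f(x) + log_q_bwd - log_q_fwd
    return y if np.log(rng.uniform()) < log_acc else x

x = rng.integers(0, 2, size=d).astype(float)
for _ in range(100):
    x = adjusted_step(x)
```

Dropping the accept/reject step gives the unadjusted variant, which is faster per iteration but biased; that trade-off is what the hypothesis about running more steps per iteration refers to.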

Parallelizable Sampling of Markov Random Fields

A new Markov chain transition operator is introduced that updates all the variables of a pairwise MRF in parallel by using auxiliary Gaussian variables, and it is shown that the latter model can be learned in place of the former without any loss of modeling power.

A tutorial on adaptive MCMC

This work proposes a series of novel adaptive algorithms that prove robust and reliable in practice, and reviews adaptation criteria together with the useful framework of stochastic approximation, which allows one to systematically optimise commonly used criteria.

Auxiliary-variable Exact Hamiltonian Monte Carlo Samplers for Binary Distributions

We present a new approach to sampling from generic binary distributions, based on an exact Hamiltonian Monte Carlo algorithm applied to a piecewise continuous augmentation of the binary distribution of…

Continuous Relaxations for Discrete Hamiltonian Monte Carlo

It is shown that a general form of the Gaussian Integral Trick makes it possible to transform a wide class of discrete variable undirected models into fully continuous systems, which opens up a number of new avenues for inference in difficult discrete systems.
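The Gaussian Integral Trick mentioned above can be sketched on a toy pairwise binary model: an auxiliary Gaussian variable is introduced whose covariance absorbs the pairwise couplings, after which the binary variables become conditionally independent and can be Gibbs-updated in parallel. The model, the diagonal shift, and all names below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy pairwise binary model p(s) ∝ exp(0.5 s·W s + b·s), s ∈ {0,1}^d (W, b invented).
d = 5
W = rng.normal(scale=0.2, size=(d, d)); W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
b = rng.normal(size=d)

# Shift the coupling matrix so A = W + dshift*I is positive definite.
dshift = np.abs(np.linalg.eigvalsh(W)).max() + 0.1
A = W + dshift * np.eye(d)

def gibbs_sweep(s):
    # x | s is Gaussian: the auxiliary variable that decouples the spins.
    x = rng.multivariate_normal(A @ s, A)
    # s | x factorizes: each spin is an independent Bernoulli
    # (the diagonal shift contributes the constant -dshift/2 to each logit).
    logits = x + b - dshift / 2
    s_new = (rng.uniform(size=d) < 1 / (1 + np.exp(-logits))).astype(float)
    return s_new, x

s = rng.integers(0, 2, size=d).astype(float)
for _ in range(100):
    s, x = gibbs_sweep(s)
```

Because all spins are updated simultaneously given `x`, this is also the mechanism behind the parallel MRF sampler listed above.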

Probabilistic Path Hamiltonian Monte Carlo

Probabilistic Path HMC (PPHMC) is developed as a first step towards sampling from distributions on spaces with intricate combinatorial structure, and a surrogate function that eases transitions across boundaries where the log-posterior has discontinuous derivatives is shown to greatly improve efficiency.