Corpus ID: 231801965

Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC

@article{Jaini2021SamplingIC,
  title={Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC},
  author={Priyank Jaini and Didrik Nielsen and Max Welling},
  journal={ArXiv},
  year={2021},
  volume={abs/2102.02374}
}
Hybrid Monte Carlo (HMC) is a powerful Markov chain Monte Carlo method for sampling from complex continuous distributions. However, a major limitation of HMC is that it cannot be applied to discrete domains, where no gradient signal is available. In this work, we introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions using a combination of neural transport methods like normalizing flows and variational dequantization, and the… 
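The dequantization idea the abstract alludes to can be illustrated with a toy example. The sketch below (the four-state target, all names, and the use of random-walk Metropolis rather than the paper's method are illustrative) embeds a discrete state k into the continuous interval [k, k+1), samples in the continuous space, and rounds back:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unnormalized log-probabilities of a discrete target over {0, 1, 2, 3}.
logits = np.array([0.5, 2.0, 1.0, 0.2])

def log_p(y):
    """Continuous surrogate: a point y in [k, k+1) carries the mass of state k."""
    if y < 0.0 or y >= 4.0:
        return -np.inf
    return logits[int(y)]

# Random-walk Metropolis in the dequantized (continuous) space.
y = 0.5
samples = []
for _ in range(20000):
    y_prop = y + rng.normal(scale=0.8)
    if np.log(rng.uniform()) < log_p(y_prop) - log_p(y):
        y = y_prop
    samples.append(int(np.floor(y)))  # round back to the discrete state

counts = np.bincount(samples, minlength=4) / len(samples)
target = np.exp(logits) / np.exp(logits).sum()
```

A gradient-based sampler such as HMC would additionally need a smooth surrogate density, since the piecewise-constant one above has zero gradient inside each cell; that is where learned flows and variational dequantization come in.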

Figures and Tables from this paper

Discrete Langevin Sampler via Wasserstein Gradient Flow

TLDR
This work shows how locally balanced (LB) functions give rise to LB dynamics corresponding to Wasserstein gradient flow in a discrete space, and proposes a new algorithm, the Locally Balanced Jump (LBJ), obtained by discretizing the LB dynamics with respect to simulation time.

A Langevin-like Sampler for Discrete Distributions

TLDR
The efficiency of DLP is proved by showing that the asymptotic bias of its stationary distribution is zero for log-quadratic distributions, and is small for distributions that are close to being log-quadratic.
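The idea of a Langevin-like proposal on a discrete space can be sketched for binary vectors with a factorized, gradient-informed flip probability (the quadratic toy target, the step size alpha, and all names are illustrative, and a full sampler would add a Metropolis–Hastings correction):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 6
W = rng.normal(scale=0.3, size=(d, d)); W = (W + W.T) / 2
b = rng.normal(size=d)

def grad_log_p(x):
    # Gradient of the log-quadratic target log p(x) = x^T W x + b^T x,
    # evaluated at the binary point x via its continuous relaxation.
    return 2 * W @ x + b

def langevin_like_proposal(x, alpha=0.5):
    """Factorized Langevin-like proposal over {0,1}^d (a sketch)."""
    g = grad_log_p(x)
    diff = 1 - 2 * x                       # x'_i - x_i if coordinate i flips
    # Unnormalized log-weight of flipping; staying has log-weight 0.
    log_flip = g * diff / 2 - diff**2 / (2 * alpha)
    p_flip = 1 / (1 + np.exp(-log_flip))   # sigmoid of the log-odds
    flips = rng.uniform(size=d) < p_flip
    return np.where(flips, 1 - x, x)

x = rng.integers(0, 2, size=d).astype(float)
x_new = langevin_like_proposal(x)
```

Each coordinate is more likely to flip when the gradient points toward the flipped value, mimicking a Langevin step while staying on the hypercube.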

Path Auxiliary Proposal for MCMC in Discrete Space

TLDR
A path auxiliary algorithm that uses a composition of local moves to explore large neighborhoods and considerably outperforms other generic samplers on various discrete models for sampling, inference, and learning.

LSB: Local Self-Balancing MCMC in Discrete Spaces

We present the Local Self-Balancing sampler (LSB), a local Markov Chain Monte Carlo (MCMC) method for sampling in purely discrete domains, which is able to autonomously adapt to the target

Learning Equivariant Energy Based Models with Equivariant Stein Variational Gradient Descent

TLDR
Equivariant energy-based models are defined to model invariant densities that are learned using contrastive divergence, and are applied to modelling joint densities in regression and classification tasks for image datasets, many-body particle systems and molecular structure generation.

References

Showing 1–10 of 33 references

Hybrid Monte Carlo

PyTorch: An Imperative Style, High-Performance Deep Learning Library

TLDR
This paper details the principles that drove the implementation of PyTorch and how they are reflected in its architecture, and explains how the careful and pragmatic implementation of the key components of its runtime enables them to work together to achieve compelling performance.

Flow-based generative models for Markov chain Monte Carlo in lattice field theory

TLDR
A Markov chain update scheme using a machine-learned flow-based generative model is proposed for Monte Carlo sampling in lattice field theories and is compared with HMC and local Metropolis sampling for ϕ⁴ theory in two dimensions.

Transport Map Accelerated Markov Chain Monte Carlo

We introduce a new framework for efficient sampling from complex probability distributions, using a combination of transport maps and the Metropolis–Hastings rule. The core idea is to use determin...

Discontinuous Hamiltonian Monte Carlo for discrete parameters and discontinuous likelihoods

TLDR
An extension of Hamiltonian Monte Carlo is presented that can efficiently explore target distributions with discontinuous densities and enables efficient sampling from ordinal parameters through embedding of probability mass functions into continuous spaces.

Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design

TLDR
Flow++ is proposed, a new flow-based model that is now the state-of-the-art non-autoregressive model for unconditional density estimation on standard image benchmarks, and has begun to close the significant performance gap that has so far existed between autoregressive models and flow-based models.

Reflection, Refraction, and Hamiltonian Monte Carlo

TLDR
A modification of the Leapfrog discretization of Hamiltonian dynamics on piecewise continuous energies, where intersections of the trajectory with discontinuities are detected, and the momentum is reflected or refracted to compensate for the change in energy.
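The reflect/refract rule can be sketched for the simplest case of a single potential step of height delta at a known location (the setup and names are illustrative, not the paper's implementation):

```python
import numpy as np

# Piecewise-constant potential with one step: U(q) = 0 for q < 1, else delta.
delta = 2.0

def cross_discontinuity(p):
    """Update momentum p when the leapfrog trajectory hits the step.
    If the kinetic energy exceeds the potential jump in the direction of
    motion, refract (shrink |p|); otherwise reflect (flip the sign)."""
    jump = delta if p > 0 else -delta   # energy cost of crossing
    if p**2 / 2 > jump:
        return np.sign(p) * np.sqrt(p**2 - 2 * jump)   # refract
    return -p                                          # reflect
```

Refraction conserves the total energy p²/2 + U(q): a particle crossing rightward with p = 3 emerges with momentum sqrt(9 − 2·delta), while one with p = 1 lacks the kinetic energy to climb the step and bounces back.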

Auxiliary-variable Exact Hamiltonian Monte Carlo Samplers for Binary Distributions

We present a new approach to sample from generic binary distributions, based on an exact Hamiltonian Monte Carlo algorithm applied to a piecewise continuous augmentation of the binary distribution of

SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows

TLDR
SurVAE Flows bridge the gap between normalizing flows and VAEs with surjective transformations, wherein the transformations are deterministic in one direction, thereby allowing exact likelihood computation, and stochastic in the reverse direction, hence providing a lower bound on the corresponding likelihood.
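The deterministic-forward / stochastic-inverse pattern can be sketched with a rounding surjection between continuous and integer values (a minimal sketch; the helper names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(z):
    """Deterministic direction: many continuous z map to one discrete x."""
    return np.floor(z).astype(int)

def inverse(x):
    """Stochastic direction: sample z uniformly from the cell [x, x + 1).
    log q(z|x) = 0 on a unit cell; this term enters the likelihood
    lower bound."""
    z = x + rng.uniform(size=np.shape(x))
    return z, 0.0

z = rng.normal(size=5) * 3.0
x = forward(z)
z_new, log_q = inverse(x)
```

The round trip forward(inverse(x)) recovers x exactly, while inverse(forward(z)) only recovers z's cell: the surjective direction discards information, which is why only a lower bound on the likelihood is available.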

Calculation of Partition Functions