• Corpus ID: 14760225

# Fast Mixing Markov Chains for Strongly Rayleigh Measures, DPPs, and Constrained Sampling

```bibtex
@inproceedings{Li2016FastMM,
  title     = {Fast Mixing Markov Chains for Strongly Rayleigh Measures, DPPs, and Constrained Sampling},
  author    = {Chengtao Li and Suvrit Sra and Stefanie Jegelka},
  booktitle = {NIPS},
  year      = {2016}
}
```
• Published in NIPS, 2 August 2016
• Mathematics, Computer Science
We study probability measures induced by set functions with constraints. Such measures arise in a variety of real-world settings, where prior knowledge, resource limitations, or other pragmatic considerations impose constraints. We consider the task of rapidly sampling from such constrained measures, and develop fast Markov chain samplers for them. Our first main result is for MCMC sampling from Strongly Rayleigh (SR) measures, for which we present sharp polynomial bounds on the mixing time. As…
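The simplest of the chains studied for SR measures is an add/delete Metropolis chain on subsets: propose flipping one element's membership and accept with the usual Metropolis ratio of unnormalized weights. The sketch below is an illustrative toy implementation, not the authors' code: the 3×3 kernel `L` is made up for the example, and a DPP, whose unnormalized weight of a set S is det(L_S), serves as the SR measure.

```python
import random

def det(m):
    """Determinant via Gaussian elimination with partial pivoting (small matrices)."""
    m = [row[:] for row in m]
    n = len(m)
    d = 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))  # pivot row
        if abs(m[p][i]) < 1e-12:
            return 0.0
        if p != i:
            m[i], m[p] = m[p], m[i]
            d = -d  # a row swap flips the sign of the determinant
        d *= m[i][i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n):
                m[r][c] -= f * m[i][c]
    return d

def dpp_weight(L, S):
    """Unnormalized DPP probability of S: det of the principal submatrix L_S."""
    idx = sorted(S)
    if not idx:
        return 1.0  # det of the empty matrix
    return det([[L[i][j] for j in idx] for i in idx])

def add_delete_chain(L, steps, rng):
    """Lazy add/delete Metropolis chain targeting pi(S) proportional to det(L_S)."""
    n = len(L)
    S = set()
    for _ in range(steps):
        if rng.random() < 0.5:  # laziness: hold with probability 1/2
            continue
        i = rng.randrange(n)
        T = S - {i} if i in S else S | {i}
        ratio = dpp_weight(L, T) / dpp_weight(L, S)
        if rng.random() < min(1.0, ratio):
            S = T
    return frozenset(S)

# Toy positive definite kernel (hypothetical, for illustration only).
L = [[2.0, 0.5, 0.0],
     [0.5, 1.0, 0.3],
     [0.0, 0.3, 1.5]]
rng = random.Random(0)
samples = [add_delete_chain(L, 200, rng) for _ in range(50)]
```

For a k-homogeneous constraint (e.g. a k-DPP), the analogous move is an exchange step that swaps one element in the set for one outside it, keeping |S| = k; the paper's sharp mixing bounds concern exactly such add/delete and exchange chains.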

## Citations of this paper

• Computer Science
UAI
• 2018
A novel sampling strategy is proposed that uses a specific mixture of product distributions to propose global moves and, thus, accelerate convergence in discrete probabilistic models, that is, distributions over subsets of a finite ground set.
• Mathematics
NeurIPS
• 2019
This work develops two fundamental tools needed to apply SLC distributions to learning and inference: sampling and mode finding, and establishes a weak log-submodularity property for SLC functions and derive optimization guarantees for a distorted greedy algorithm.
• Computer Science
ICML
• 2017
A novel MCMC sampler is built that combines ideas from combinatorial geometry, linear programming, and Monte Carlo methods to sample from DPPs with a fixed sample cardinality, also called projection DPPs.
• Mathematics
• 2018
Strongly Rayleigh (SR) measures are discrete probability distributions over the subsets of a ground set. They enjoy strong negative dependence properties, as a result of which they assign higher
• Computer Science, Mathematics
NIPS
• 2017
This work develops an exact (randomized) polynomial-time sampling algorithm as well as its derandomization, and proves that its distribution satisfies the "Strong Rayleigh" property; this yields a provably fast-mixing Markov chain sampler that makes dual volume sampling much more attractive to practitioners.
• Computer Science, Mathematics
ArXiv
• 2018
We study the Gibbs sampling algorithm for continuous determinantal point processes. We show that, given a warm start, the Gibbs sampler generates a random sample from a continuous $k$-DPP defined on
• Computer Science, Mathematics
COLT
• 2022
Kulesza's conjecture is proved: even computing a (1 − 1/polylog N)-approximation to the maximum log-likelihood of a DPP on a ground set of N elements is NP-complete.
• Computer Science
NeurIPS
• 2018
This work defines a rich class of probabilistic models associated with constrained submodular maximization problems and proves the first constant-factor guarantee in this setting: an efficiently certifiable e/(e-1)-approximation of the log-partition function.
• Computer Science, Mathematics
ArXiv
• 2020
An overview of this exciting new line of research is provided, including brief introductions to RandNLA and DPPs, as well as applications of DPPs to classical linear algebra tasks such as least squares regression, low-rank approximation, and the Nyström method.
• Computer Science
ICML
• 2021
This paper exhibits a simple class of PGCs that are not trivially subsumed by simple combinations of PCs and DPPs, and obtain competitive performance on a suite of density estimation benchmarks.

## References

Showing 1-10 of 39 references

• Computer Science
NIPS
• 2015
This paper investigates the use of Markov chain Monte Carlo sampling to perform approximate inference in general log-submodular and log-supermodular models and investigates the efficiency of the Gibbs sampler on three examples of such models.
• Mathematics, Computer Science
COLT
• 2015
It is shown that if the set function displays a natural notion of decay of correlation, then, for $\beta$ small enough, it is possible to design fast mixing Markov chain Monte Carlo methods that yield error bounds on marginal approximations that do not depend on the size of the set $V$.
• A. Sinclair
• Computer Science
Combinatorics, Probability and Computing
• 1992
A new upper bound on the mixing rate is presented, based on the solution to a multicommodity flow problem in the Markov chain viewed as a graph, and improved bounds are obtained for the runtimes of randomised approximation algorithms for various problems, including computing the permanent of a 0–1 matrix, counting matchings in graphs, and computing the partition function of a ferromagnetic Ising system.
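Sinclair's multicommodity flow bound, in one common formulation (stated here as a hedged sketch from standard references, not a quotation from the paper), reads:

```latex
% Reversible chain P with stationary distribution \pi; a flow f routes
% \pi(x)\pi(y) units between every pair of states (x, y).
\mathrm{Gap}(P) \;\ge\; \frac{1}{\rho(f)\,\ell(f)},
\qquad
\tau_x(\varepsilon) \;\le\; \rho(f)\,\ell(f)
\bigl(\ln \pi(x)^{-1} + \ln \varepsilon^{-1}\bigr),
```

where the congestion ρ(f) is the maximum, over transitions e = (u, v), of the total flow through e divided by the capacity Q(e) = π(u)P(u, v), and ℓ(f) is the length of a longest flow-carrying path.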
• Mathematics
Proceedings 38th Annual Symposium on Foundations of Computer Science
• 1997
A new approach to the coupling technique for bounding mixing rates, called path coupling, is illustrated; it may allow coupling proofs that were previously unknown, or provide significantly better bounds than those obtained using the standard method.
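The path coupling theorem of Bubley and Dyer (sketched here from memory, with the usual notation) reduces the coupling argument to pairs of adjacent states:

```latex
% If d is a path metric on the state space and, for every adjacent pair
% (X, Y) with d(X, Y) = 1, there is a one-step coupling satisfying
\mathbb{E}\bigl[d(X', Y')\bigr] \;\le\; \beta \, d(X, Y),
\qquad \beta < 1,
% then the mixing time is bounded by
\tau(\varepsilon) \;\le\; \frac{\ln\!\bigl(D\,\varepsilon^{-1}\bigr)}{1 - \beta},
% where D is the diameter of the state space under d.
```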
• Computer Science
• 1992
The focus is on applied inference for Bayesian posterior distributions in real problems, which often tend toward normality after transformations and marginalization, and the results are derived as normal-theory approximations to exact Bayesian inference, conditional on the observed simulations.
• Computer Science, Mathematics
NIPS
• 2010
This work proposes a new framework that extends variational inference to a wide range of combinatorial spaces, based on a simple assumption: the existence of a tractable measure factorization, which it is shown holds in many examples.
• Computer Science
Found. Trends Mach. Learn.
• 2012
Determinantal Point Processes for Machine Learning provides a comprehensible introduction to DPPs, focusing on the intuitions, algorithms, and extensions that are most relevant to the machine learning community, and shows how they can be applied to real-world applications.
• Computer Science
NIPS
• 2013
This work proposes a sampling algorithm, called PAWS, based on embedding the set into a higher-dimensional space which is then randomly projected using universal hash functions to a lower-dimensional subspace and explored using combinatorial search methods.

A rapidly mixing Markov chain is constructed, from which a sample can be acquired from the given DPP in sub-cubic time, and it is shown that this framework can be extended to sampling from cardinality-constrained DPPs, resulting in better clustering.
• Mathematics
• 2007
We introduce the class of strongly Rayleigh probability measures by means of geometric properties of their generating polynomials that amount to the stability of the latter. This class covers