Corpus ID: 57373870

Discrete Neural Processes

@article{Pakman2019DiscreteNP,
  title={Discrete Neural Processes},
  author={Ari Pakman and L. Paninski},
  journal={ArXiv},
  year={2019},
  volume={abs/1901.00409}
}
Many data generating processes involve latent random variables over discrete combinatorial spaces whose size grows factorially with the dataset. In these settings, existing posterior inference methods can be inaccurate and/or very slow. In this work we develop methods for efficient amortized approximate Bayesian inference over discrete combinatorial spaces, with applications to random permutations, probabilistic clustering (such as Dirichlet process mixture models) and random communities (such…
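To make the combinatorial blow-up in the abstract concrete (an illustrative sketch, not material from the paper): the number of orderings of N items is N!, and the number of ways to partition N items into clusters is the Bell number B_N; both outgrow any budget for exact posterior enumeration almost immediately.

```python
from math import factorial

def bell_number(n):
    """Bell number B_n: the number of ways to partition n items into
    non-empty clusters, computed with the Bell triangle recurrence."""
    row = [1]
    for _ in range(n - 1):
        new_row = [row[-1]]
        for x in row:
            new_row.append(new_row[-1] + x)
        row = new_row
    return row[-1]

# The latent spaces for permutations (N!) and clusterings (B_N) quickly
# become far too large to enumerate during posterior inference.
for n in (5, 10, 20, 50):
    print(n, factorial(n), bell_number(n))
```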
Citations

Deep Amortized Clustering
It is empirically shown, on both synthetic and image data, that DAC can efficiently and accurately cluster new datasets coming from the same distribution used to generate training datasets.
Self-Supervised Prototype Representation Learning for Event-Based Corporate Profiling
A Self-Supervised Prototype Representation Learning (SePaL) framework for dynamic corporate profiling that can obtain unified corporate representations which are robust to event noise and can be easily fine-tuned to benefit various downstream applications with only a small amount of annotated data.
Disentangled sticky hierarchical Dirichlet process hidden Markov model
The disentangled sticky HDP-HMM is proposed, which outperforms the sticky HDP-HMM and the HDP-HMM on both synthetic and real data, and is applied to analyze neural data and to segment behavioral video data.

References

Inference Networks for Sequential Monte Carlo in Graphical Models
A procedure for constructing and learning a structured neural network which represents an inverse factorization of the graphical model, resulting in a conditional density estimator that takes as input particular values of the observed random variables, and returns an approximation to the distribution of the latent variables.
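As a toy illustration of the amortization idea, the sketch below fits an inverse model q(z | x) on samples from a one-dimensional model; the paper itself builds structured neural networks over general graphical models, so everything here (the linear-Gaussian model and the linear-Gaussian q) is an assumption for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (assumed for illustration): z ~ N(0, 1), x | z ~ N(z, 0.5^2).
# An inference network inverts this factorization, predicting q(z | x); we
# fit it on joint samples (z, x) drawn from the model, the same
# "learn the inverse, then use it as a proposal" idea as in the paper.
z = rng.normal(0.0, 1.0, size=10_000)
x = rng.normal(z, 0.5)

# Maximum-likelihood fit of q(z | x) = N(a*x + b, s^2) is linear regression.
a, b = np.polyfit(x, z, 1)
s = np.std(z - (a * x + b))

# The learned inverse can now serve as an SMC/importance proposal for new x.
x_new = 1.3
print(f"q(z | x={x_new}) ~= N({a * x_new + b:.3f}, {s:.3f}^2)")
# For this conjugate toy model the exact posterior mean is 0.8 * x,
# which the fitted slope recovers.
```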
A Survey of Non-Exchangeable Priors for Bayesian Nonparametric Models
  • N. Foti and S. Williamson
  • IEEE Transactions on Pattern Analysis and Machine Intelligence, 2015
A survey of non-exchangeable priors for Bayesian nonparametric models, an understanding of which, it is hoped, will help in selecting an appropriate prior, developing new models, and leveraging inference techniques.
Collapsed Variational Dirichlet Process Mixture Models
A number of variational Bayesian approximations to the Dirichlet process (DP) mixture model are studied and a novel collapsed VB approximation where mixture weights are marginalized out is considered.
Learning Stochastic Inverses
The Inverse MCMC algorithm is described, which uses stochastic inverses to make block proposals for a Metropolis-Hastings sampler, and the efficiency of this sampler is explored for a variety of parameter regimes and Bayes nets.
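A minimal sketch of the block-proposal idea under an assumed linear-Gaussian toy model: a stochastic inverse q(z | x) serves as an independence Metropolis-Hastings proposal. The inverse here is hand-coded rather than learned, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model (assumed for illustration): z ~ N(0, 1), x | z ~ N(z, 0.5^2).
def log_joint(z, x):
    return -0.5 * z**2 - 0.5 * ((x - z) / 0.5) ** 2

# A "stochastic inverse" q(z | x); here the known conditional mean 0.8 * x
# stands in for a learned inverse network.
def q_mean(x):
    return 0.8 * x

Q_STD = 0.5

def log_q(z, x):
    return -0.5 * ((z - q_mean(x)) / Q_STD) ** 2

# Independence Metropolis-Hastings with the inverse as the block proposal.
x_obs, z = 1.3, 0.0
samples = []
for _ in range(5_000):
    z_prop = rng.normal(q_mean(x_obs), Q_STD)
    log_alpha = (log_joint(z_prop, x_obs) + log_q(z, x_obs)
                 - log_joint(z, x_obs) - log_q(z_prop, x_obs))
    if np.log(rng.uniform()) < log_alpha:
        z = z_prop
    samples.append(z)

print("posterior mean ~", np.mean(samples))  # exact value is 0.8 * 1.3 = 1.04
```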
Reliable and Scalable Variational Inference for the Hierarchical Dirichlet Process
This work defines full variational posteriors for all latent variables, optimizes parameters via a novel surrogate likelihood bound for hierarchical Dirichlet process admixture models, and shows that this approach has crucial advantages for data-driven learning of the number of topics.
Memoized Online Variational Inference for Dirichlet Process Mixture Models
A new algorithm, memoized online variational inference, is presented; it scales to very large (yet finite) datasets while avoiding the complexities of stochastic gradient methods, requiring some additional memory but still scaling to millions of examples.
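A sketch of the memoization idea in isolation (assumed toy data and statistics, not the full Dirichlet process algorithm): per-batch sufficient statistics are cached, so revisiting a batch replaces its stale contribution and the global summary stays exact, with no stochastic-gradient step size to tune.

```python
import numpy as np

class MemoizedStats:
    """Cache each batch's sufficient statistics so the full-dataset summary
    can be updated incrementally and exactly."""
    def __init__(self, n_batches, dim):
        self.per_batch = np.zeros((n_batches, dim))  # cached summaries
        self.total = np.zeros(dim)                   # whole-dataset summary

    def update_batch(self, b, new_stats):
        # Swap out the stale cached statistics for batch b.
        self.total += new_stats - self.per_batch[b]
        self.per_batch[b] = new_stats

rng = np.random.default_rng(2)
data = rng.normal(size=(6, 100, 3))   # 6 batches, 100 points, 3 dimensions
memo = MemoizedStats(n_batches=6, dim=3)
for epoch in range(2):
    for b in range(6):
        # In the real algorithm these would be expected sufficient statistics
        # under the current local variational posteriors for batch b.
        memo.update_batch(b, data[b].sum(axis=0))

# Unlike a noisy stochastic estimate, the memoized total is exact.
assert np.allclose(memo.total, data.sum(axis=(0, 1)))
```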
Variational methods for the Dirichlet process
A mean-field variational approach to approximate inference for the Dirichlet process, where the approximate posterior is based on the truncated stick-breaking construction (Ishwaran & James, 2001).
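A sketch of the truncated stick-breaking construction this variational approximation builds on; the truncation level T and concentration alpha below are arbitrary illustrative choices (the variational posterior would place a Beta factor on each stick-break v_t).

```python
import numpy as np

rng = np.random.default_rng(3)

def truncated_stick_breaking(alpha, T):
    """Draw mixture weights from a Dirichlet process prior truncated at T
    components: v_t ~ Beta(1, alpha), pi_t = v_t * prod_{s<t} (1 - v_s),
    with v_T = 1 so the weights sum to one."""
    v = rng.beta(1.0, alpha, size=T)
    v[-1] = 1.0  # truncation: the last break takes the remaining stick
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
    return v * remaining

pi = truncated_stick_breaking(alpha=2.0, T=20)
print(pi.sum())           # 1.0 by construction
print((pi > 0.01).sum())  # only a few components carry appreciable weight
```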
Mixture Models With a Prior on the Number of Components
It turns out that many of the essential properties of Dirichlet process mixtures (DPMs) are also exhibited by mixtures of finite mixtures (MFMs), and the MFM analogues are simple enough that they can be used much like the corresponding DPM properties; this simplifies the implementation of MFMs and can substantially improve mixing.
Markov Chain Sampling Methods for Dirichlet Process Mixture Models
This article reviews Markov chain methods for sampling from the posterior distribution of a Dirichlet process mixture model and presents two new classes of methods. One new approach is to…
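For context, a sketch of sampling from the Chinese restaurant process, the prior over partitions whose posterior the article's Markov chain methods target once a likelihood is attached; this is the standard textbook construction, not a specific algorithm from the article.

```python
import numpy as np

rng = np.random.default_rng(4)

def chinese_restaurant_process(n, alpha):
    """Sample a partition of n items: item i joins an existing cluster k
    with probability proportional to its size, or starts a new cluster
    with probability proportional to alpha."""
    assignments, counts = [0], [1]
    for i in range(1, n):
        probs = np.array(counts + [alpha], dtype=float)
        k = rng.choice(len(probs), p=probs / probs.sum())
        if k == len(counts):
            counts.append(1)   # open a new cluster
        else:
            counts[k] += 1     # join an existing cluster
        assignments.append(k)
    return assignments

print(chinese_restaurant_process(20, alpha=1.0))
```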
Mixed Membership Stochastic Blockmodels
This paper describes a latent variable model of relational data, the mixed membership stochastic blockmodel, which extends blockmodels for relational data to ones which capture mixed membership latent relational structure, thus providing an object-specific low-dimensional representation.
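A compact sketch of the MMSB generative process with assumed toy dimensions (N nodes, K communities, a hand-picked block matrix B): each node draws a mixed-membership vector, and every directed pair draws sender and receiver roles before flipping an edge coin.

```python
import numpy as np

rng = np.random.default_rng(5)

N, K = 30, 3
alpha = np.full(K, 0.3)       # Dirichlet prior on node memberships
B = 0.05 + 0.9 * np.eye(K)    # block matrix: dense within communities

theta = rng.dirichlet(alpha, size=N)   # mixed memberships, one row per node
adj = np.zeros((N, N), dtype=int)
for p in range(N):
    for q in range(N):
        if p == q:
            continue
        z_send = rng.choice(K, p=theta[p])   # role p takes toward q
        z_recv = rng.choice(K, p=theta[q])   # role q takes toward p
        adj[p, q] = rng.binomial(1, B[z_send, z_recv])

print("edge density:", adj.mean())
```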