GumBolt: Extending Gumbel trick to Boltzmann priors
```bibtex
@inproceedings{Khoshaman2018GumBoltEG,
  title     = {GumBolt: Extending Gumbel trick to Boltzmann priors},
  author    = {Amir Khoshaman and Mohammad H. Amin},
  booktitle = {NeurIPS},
  year      = {2018}
}
```
Boltzmann machines (BMs) are appealing candidates for powerful priors in variational autoencoders (VAEs), as they are capable of capturing nontrivial and multi-modal distributions over discrete variables. […] GumBolt is significantly simpler than recently proposed methods with BM priors and outperforms them by a considerable margin, achieving state-of-the-art performance on the permutation-invariant MNIST and OMNIGLOT datasets among models with only discrete latent variables. Moreover…
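A rough sketch of the key method in PyTorch (the function names, shapes, and the RBM-style energy parameterization are illustrative assumptions, not the authors' code): GumBolt keeps the Boltzmann machine's energy function but evaluates it on Gumbel-relaxed binary latents during training, so the objective stays differentiable.

```python
import torch

def relaxed_bernoulli(logits, tau=1.0):
    """Relaxed binary sample via the Gumbel (logistic-noise) trick."""
    u = torch.rand_like(logits)
    noise = torch.log(u) - torch.log1p(-u)  # Logistic(0, 1) = difference of two Gumbels
    return torch.sigmoid((logits + noise) / tau)

def rbm_energy(zv, zh, a, b, W):
    """E(v, h) = -a^T v - b^T h - v^T W h for an RBM-style Boltzmann prior.
    Evaluated on relaxed (continuous) latents, -E serves as the
    unnormalized log-prior term in the training objective."""
    return -(zv @ a + zh @ b + ((zv @ W) * zh).sum(-1))
```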
7 Citations
Direct Evolutionary Optimization of Variational Autoencoders With Binary Latents
- Computer Science · ArXiv
- 2020
The studied approach shows that training of VAEs is indeed possible without sampling-based approximation and reparameterization, and makes VAEs competitive where they have previously been outperformed by non-generative approaches.
Learning Undirected Posteriors by Backpropagation through MCMC Updates
- Computer Science · ArXiv
- 2019
An efficient method to train undirected posteriors is developed by showing that the gradient of the training objective with respect to the parameters of the undirected posterior can be computed by backpropagation through Markov chain Monte Carlo updates.
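A minimal sketch of what backpropagation through MCMC updates can look like for an RBM-shaped posterior (a guess at the mechanics under a Gumbel-style relaxation; variable names, the relaxation choice, and the number of steps are illustrative, not taken from the paper): unroll a few block-Gibbs updates and let autograd differentiate the chain with respect to the posterior parameters.

```python
import torch

def gibbs_unroll(v0, W, b, c, steps=5, tau=0.5):
    """Unrolled, differentiable block-Gibbs updates for an RBM posterior.
    Relaxed Bernoulli samples keep every update differentiable in (W, b, c)."""
    def relaxed(logits):
        u = torch.rand_like(logits)
        return torch.sigmoid((logits + torch.log(u) - torch.log1p(-u)) / tau)

    v = v0
    for _ in range(steps):
        h = relaxed(v @ W + c)      # update hidden units given visibles
        v = relaxed(h @ W.t() + b)  # update visible units given hiddens
    return v, h                      # gradients flow back through all steps
```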
PixelVAE++: Improved PixelVAE with Discrete Prior
- Computer Science · ArXiv
- 2019
Constructing powerful generative models for natural images is a challenging task. PixelCNN models capture details and local information in images very well but have a limited receptive field…
Undirected Graphical Models as Approximate Posteriors
- Computer Science · ICML
- 2020
An efficient method to train undirected approximate posteriors is developed by showing that the gradient of the training objective with respect to the parameters of the undirected posterior can be computed by backpropagation through Markov chain Monte Carlo updates.
High-Dimensional Similarity Search with Quantum-Assisted Variational Autoencoder
- Computer Science · KDD
- 2020
This work extends previous work and studies the real-world applicability of a QVAE by presenting a proof of concept for similarity search in large-scale high-dimensional datasets and showing how to construct a space-efficient search index based on the latent-space representation of a QVAE.
CaloDVAE : Discrete Variational Autoencoders for Fast Calorimeter Shower Simulation
- Physics
- 2021
Calorimeter simulation is the most computationally expensive part of Monte Carlo generation of samples necessary for analysis of experimental data at the Large Hadron Collider (LHC). The…
From Ansätze to Z-gates: a NASA View of Quantum Computing
- Computer Science, Physics
- 2019
Early application thrusts related to robustness of communication networks and the simulation of many-body systems for material science and chemistry are added to the QuAIL research agenda.
References
Showing 1-10 of 43 references
Importance Weighted Autoencoders
- Computer Science · ICLR
- 2016
The importance weighted autoencoder (IWAE), a generative model with the same architecture as the VAE but which uses a strictly tighter log-likelihood lower bound derived from importance weighting, is introduced; empirically, IWAEs learn richer latent-space representations than VAEs, leading to improved test log-likelihood on density estimation benchmarks.
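For reference, the strictly tighter bound the summary refers to is the standard k-sample importance-weighted bound, with samples drawn from the encoder q:

```latex
\mathcal{L}_k(x) \;=\; \mathbb{E}_{h_1,\dots,h_k \sim q(h \mid x)}
\left[ \log \frac{1}{k} \sum_{i=1}^{k} \frac{p(x, h_i)}{q(h_i \mid x)} \right],
\qquad
\mathcal{L}_1 \le \mathcal{L}_k \le \log p(x).
```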
Variational Inference for Monte Carlo Objectives
- Computer Science · ICML
- 2016
The first unbiased gradient estimator designed for importance-sampled objectives is developed; it is both simpler and more effective than the NVIL estimator proposed for the single-sample variational objective, and is competitive with currently used biased estimators.
Ladder Variational Autoencoders
- Computer Science · NIPS
- 2016
A new inference model is proposed, the Ladder Variational Autoencoder, that recursively corrects the generative distribution by a data-dependent approximate likelihood in a process resembling the recently proposed Ladder Network.
Categorical Reparameterization with Gumbel-Softmax
- Computer Science, Mathematics · ICLR
- 2017
It is shown that the Gumbel-Softmax estimator outperforms state-of-the-art gradient estimators on structured output prediction and unsupervised generative modeling tasks with categorical latent variables, and enables large speedups on semi-supervised classification.
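The estimator itself is compact; a sketch in PyTorch (the straight-through variant, shown here with `hard=True`, is the variant used for tasks needing discrete samples; recent PyTorch versions also ship `torch.nn.functional.gumbel_softmax`):

```python
import torch
import torch.nn.functional as F

def gumbel_softmax(logits, tau=1.0, hard=False):
    """Relaxed one-hot sample from a categorical distribution."""
    g = -torch.log(-torch.log(torch.rand_like(logits)))  # Gumbel(0, 1) noise
    y = F.softmax((logits + g) / tau, dim=-1)
    if hard:
        # Straight-through: discrete one-hot forward pass, relaxed gradients backward
        y_hard = F.one_hot(y.argmax(dim=-1), logits.size(-1)).to(y.dtype)
        y = (y_hard - y).detach() + y
    return y
```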
Auto-Encoding Variational Bayes
- Computer Science · ICLR
- 2014
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
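The differentiability condition is what the well-known reparameterization trick exploits; a minimal sketch for a Gaussian latent:

```python
import torch

def reparameterize(mu, log_var):
    """z = mu + sigma * eps with eps ~ N(0, I), so gradients of the ELBO
    flow through the sample to the encoder outputs mu and log_var."""
    eps = torch.randn_like(mu)
    return mu + torch.exp(0.5 * log_var) * eps
```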
Quantum variational autoencoder
- Computer Science · Quantum Science and Technology
- 2018
A quantum variational autoencoder (QVAE) is introduced: a VAE whose latent generative process is implemented as a quantum Boltzmann machine (QBM), which can be trained end-to-end by maximizing a well-defined loss-function: a ‘quantum’ lower-bound to a variational approximation of the log-likelihood.
MADE: Masked Autoencoder for Distribution Estimation
- Computer Science · ICML
- 2015
This work introduces a simple modification for autoencoder neural networks that yields powerful generative models and proves that this approach is competitive with state-of-the-art tractable distribution estimators.
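The "simple modification" is a set of binary masks on the weight matrices that enforce an autoregressive ordering; a sketch of the mask construction for a single hidden layer (the degree assignments follow the paper's scheme, but the names and NumPy framing are illustrative):

```python
import numpy as np

def made_masks(d_in, d_hidden, seed=0):
    """Masks enforcing the autoregressive property in a one-hidden-layer MADE."""
    rng = np.random.default_rng(seed)
    m_in = np.arange(1, d_in + 1)                 # input degrees 1..D
    m_hid = rng.integers(1, d_in, size=d_hidden)  # hidden degrees in 1..D-1
    # Hidden unit k may see input d only if m_hid[k] >= d
    mask_hidden = (m_hid[:, None] >= m_in[None, :]).astype(np.float32)
    # Output d may see hidden unit k only if d > m_hid[k]
    mask_output = (m_in[:, None] > m_hid[None, :]).astype(np.float32)
    return mask_hidden, mask_output
```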
Variational Lossy Autoencoder
- Computer Science · ICLR
- 2017
This paper presents a simple but principled method to learn global representations by combining the variational autoencoder (VAE) with neural autoregressive models such as RNN, MADE, and PixelRNN/CNN, greatly improving the generative modeling performance of VAEs.
Tackling Over-pruning in Variational Autoencoders
- Computer Science · ArXiv
- 2017
The epitomic variational autoencoder (eVAE) is proposed, which makes efficient use of model capacity, generalizes better than the VAE, and helps prevent inactive units, since each group is pressured to explain the data.
DVAE++: Discrete Variational Autoencoders with Overlapping Transformations
- Computer Science · ICML
- 2018
DVAE++ is developed, a generative model with a global discrete prior and a hierarchy of convolutional continuous variables, and a new variational bound to efficiently train with Boltzmann machine priors is derived.