MaxEntropy Pursuit Variational Inference

@inproceedings{Egorov2019MaxEntropyPV,
  title={MaxEntropy Pursuit Variational Inference},
  author={Evgenii Egorov and Kirill Neklyudov and Ruslan Kostoev and Evgeny Burnaev},
  booktitle={ISNN},
  year={2019}
}
One of the core problems in variational inference is the choice of the approximate posterior distribution. It is crucial to trade off efficient inference with simple families, such as mean-field models, against the accuracy of inference. We propose a variant of a greedy approximation of the posterior distribution with tractable base learners. Using the Max-Entropy approach, we obtain a well-defined optimization problem. We demonstrate the ability of the method to capture complex multimodal posteriors via…
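A minimal sketch of the general greedy-mixture idea the abstract alludes to, on a 1-D toy problem: at each step a new tractable base learner (here a Gaussian) is selected and mixed into the current approximation with a weight chosen by a crude line search. This is an illustration under my own assumptions, not the paper's MaxEntropy pursuit procedure; all names and the grid-search fitting are illustrative simplifications.

```python
# Toy sketch of greedy (boosting-style) variational inference with Gaussian
# base learners; NOT the paper's exact algorithm, names are illustrative.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def log_target(z):
    # Unnormalized bimodal "posterior": two Gaussian bumps.
    return np.logaddexp(norm.logpdf(z, -2.0, 0.5), norm.logpdf(z, 2.0, 0.7))

def mix_logpdf(z, comps, weights):
    # Log-density of the current mixture approximation.
    parts = [np.log(w) + norm.logpdf(z, m, s) for (m, s), w in zip(comps, weights)]
    return np.logaddexp.reduce(np.stack(parts), axis=0)

def objective(comps, weights, n=1000):
    # Monte Carlo estimate of KL(q || p) up to the normalizing constant.
    idx = rng.choice(len(comps), size=n, p=weights)
    z = np.array([rng.normal(*comps[i]) for i in idx])
    return np.mean(mix_logpdf(z, comps, weights) - log_target(z))

comps, weights = [(0.0, 1.0)], np.array([1.0])
grid = [(m, s) for m in np.linspace(-3, 3, 13) for s in (0.3, 0.5, 1.0)]
for t in range(4):
    best = None
    for cand in grid:                      # fit a new base learner greedily
        for gamma in (0.5, 0.25, 0.1):     # crude line search on its weight
            w = np.append((1 - gamma) * weights, gamma)
            val = objective(comps + [cand], w)
            if best is None or val < best[0]:
                best = (val, cand, w)
    _, cand, weights = best
    comps.append(cand)
    print(f"step {t}: added N{cand}, objective ~ {best[0]:.3f}")
```

With only a handful of components the mixture already places mass on both modes of the toy target, which is the behaviour the abstract claims for multimodal posteriors.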
BooVAE: Boosting Approach for Continual Learning of VAE
TLDR
This work introduces an end-to-end approach for continual learning of VAEs that avoids catastrophic forgetting in a fully automatic way, and provides empirical studies on commonly used benchmarks.
BooVAE: A scalable framework for continual VAE learning under boosting approach
TLDR
This work introduces a conceptually simple and scalable end-to-end approach to incorporating past knowledge by learning the prior directly from the data, and considers a scalable boosting-like approximation to the intractable, theoretically optimal prior.
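As a hedged illustration (notation mine, not quoted from either BooVAE paper): under the standard ELBO the theoretically optimal prior is the aggregated posterior over the training data, and the boosting-like scheme approximates this intractable average with a small mixture grown one component at a time.

```latex
% Illustrative sketch: ELBO-optimal ("aggregated posterior") prior and its
% greedy finite-mixture approximation; notation is mine.
p^{*}(z) \;=\; \frac{1}{N} \sum_{n=1}^{N} q_{\phi}(z \mid x_n)
\;\approx\; \sum_{k=1}^{K} \alpha_k \, h_k(z),
\qquad \alpha_k \ge 0,\ \ \textstyle\sum_{k} \alpha_k = 1 .
```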

References

Showing 1-10 of 31 references
Inference Suboptimality in Variational Autoencoders
TLDR
It is found that divergence from the true posterior is often due to imperfect recognition networks rather than to the limited complexity of the approximating distribution, and that the parameters used to increase the expressiveness of the approximation play a role in generalizing inference.
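The finding can be read against the standard decomposition of the inference gap into an approximation part and an amortization part; this identity is written from the usual definitions rather than quoted from the paper (q* is the best member of the variational family, q_phi the recognition network's output).

```latex
% Inference gap = approximation gap + amortization gap.
\log p(x) - \mathcal{L}[q_{\phi}]
  \;=\; \underbrace{\big(\log p(x) - \mathcal{L}[q^{*}]\big)}_{\text{approximation gap}}
  \;+\; \underbrace{\big(\mathcal{L}[q^{*}] - \mathcal{L}[q_{\phi}]\big)}_{\text{amortization gap}}
```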
Auto-Encoding Variational Bayes
TLDR
A stochastic variational inference and learning algorithm is introduced that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case.
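A minimal sketch of the reparameterization trick that makes this stochastic objective differentiable, for a diagonal-Gaussian posterior and standard-normal prior; the decoder log-likelihood here is a placeholder, not the paper's model.

```python
# Reparameterized single-sample ELBO estimate for q(z|x) = N(mu, sigma^2)
# with a standard-normal prior; log_lik stands in for any decoder likelihood.
import numpy as np

rng = np.random.default_rng(0)

def elbo_sample(mu, log_sigma, log_lik):
    eps = rng.standard_normal(mu.shape)
    z = mu + np.exp(log_sigma) * eps            # z = mu + sigma * eps
    kl = 0.5 * np.sum(np.exp(2 * log_sigma) + mu**2 - 1.0 - 2 * log_sigma)
    return log_lik(z) - kl                      # E_q[log p(x|z)] - KL(q || p(z))

# Toy usage: a "decoder" whose likelihood peaks at z = 1.
value = elbo_sample(np.zeros(2), np.zeros(2), lambda z: -0.5 * np.sum((z - 1.0) ** 2))
print(value)
```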
Boosting Variational Inference: an Optimization Perspective
TLDR
This work studies the convergence properties of boosting variational inference from a modern optimization viewpoint by establishing connections to the classic Frank-Wolfe algorithm, yielding novel theoretical insights into sufficient conditions for convergence, explicit rates, and algorithmic simplifications.
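The Frank-Wolfe connection amounts to a linear-minimization step over the base family followed by a convex-combination update of the mixture; this is a standard way to write the boosting-VI step, paraphrased rather than quoted from the paper.

```latex
% One boosting / Frank-Wolfe step on the KL objective over the convex hull
% of the base family Q: select a new component s_t, then mix it in.
s_t \in \arg\min_{s \in \mathcal{Q}} \ \big\langle s,\ \nabla \mathrm{KL}(q_t \,\|\, p) \big\rangle,
\qquad
q_{t+1} = (1 - \gamma_t)\, q_t + \gamma_t\, s_t, \quad \gamma_t \in [0, 1].
```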
Variational Boosting: Iteratively Refining Posterior Approximations
TLDR
This work iteratively refines an existing variational approximation by solving a sequence of optimization problems, allowing the practitioner to trade computation time for accuracy, and shows how to expand the variational approximating class by incorporating additional covariance structure and by introducing new components to form a mixture.
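As a hedged illustration of "additional covariance structure" (notation mine): each component's covariance can be widened from diagonal to low-rank-plus-diagonal while the mixture itself is grown one component at a time.

```latex
% Illustrative mixture with low-rank-plus-diagonal component covariances.
q_t(z) \;=\; \sum_{k=1}^{t} \rho_k \, \mathcal{N}\!\big(z \mid \mu_k,\ D_k + F_k F_k^{\top}\big),
\qquad D_k \ \text{diagonal},\quad F_k \in \mathbb{R}^{d \times r},\quad r \ll d .
```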
Boosting Variational Inference
TLDR
Boosting variational inference is developed: an algorithm that iteratively improves the current approximation by mixing it with a new component from the base distribution family, thereby yielding progressively more accurate posterior approximations as more computing time is spent.
Variational Inference with Normalizing Flows
TLDR
It is demonstrated that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provide a clear improvement in the performance and applicability of variational inference.
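A minimal sketch of one flow layer and the change-of-variables bookkeeping behind this idea, using a planar flow (one of the simplest choices, not the only family the paper covers); the parameter values are illustrative.

```python
# One planar-flow layer z' = z + u * tanh(w.z + b) with the log|det Jacobian|
# term that keeps the transformed density normalized.
import numpy as np

def planar_flow(z, u, w, b):
    a = np.tanh(z @ w + b)                       # scalar activation per sample
    z_new = z + np.outer(a, u)                   # z' = z + u * tanh(w.z + b)
    psi = np.outer(1.0 - a**2, w)                # d tanh(w.z + b) / dz
    log_det = np.log(np.abs(1.0 + psi @ u))      # |det(I + u psi^T)| = |1 + psi.u|
    return z_new, log_det

rng = np.random.default_rng(0)
z0 = rng.standard_normal((5, 2))                 # samples from the base N(0, I)
log_q0 = -0.5 * np.sum(z0**2, axis=1) - np.log(2 * np.pi)
z1, log_det = planar_flow(z0, u=np.array([1.0, 0.0]), w=np.array([2.0, 1.0]), b=0.5)
log_q1 = log_q0 - log_det                        # change of variables
print(z1.shape, log_q1)
```

Stacking several such layers gives the richer posterior family the summary refers to, with the log-density tracked by summing the per-layer log-determinant terms.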
Rényi Divergence Variational Inference
TLDR
The variational Rényi bound (VR) is introduced, which extends traditional variational inference to Rényi's alpha-divergences, and a novel variational inference method is proposed as a special case of the framework.
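For reference, the VR bound generalizes the ELBO through a single parameter alpha; this is the standard definition (alpha != 1), and alpha -> 1 recovers the usual ELBO.

```latex
% Variational Renyi bound; the limit alpha -> 1 gives the standard ELBO.
\mathcal{L}_{\alpha}(q; x) \;=\; \frac{1}{1-\alpha}\,
\log \, \mathbb{E}_{q(z)}\!\left[\left(\frac{p(x, z)}{q(z)}\right)^{1-\alpha}\right]
```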
Variational Gaussian Process
TLDR
The variational Gaussian process (VGP), a Bayesian nonparametric model that adapts its shape to match complex posterior distributions, is constructed, and a universal approximation theorem is proved for the VGP, demonstrating its representative power for learning any model.
Black Box Variational Inference
TLDR
This paper presents a "black box" variational inference algorithm, one that can be quickly applied to many models with little additional derivation, based on stochastic optimization of the variational objective where the noisy gradient is computed from Monte Carlo samples from the variational distribution.
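A minimal sketch of the noisy gradient this summary refers to: the score-function (REINFORCE) estimator for a 1-D Gaussian variational family against a toy unnormalized target. Only log-density evaluations of the model are needed; names and the toy target are illustrative.

```python
# Score-function ("black box") gradient estimate of the ELBO for a 1-D
# Gaussian q with parameters (mu, log_sigma); no model gradients required.
import numpy as np

rng = np.random.default_rng(0)
log_joint = lambda z: -0.5 * (z - 3.0) ** 2           # toy unnormalized log p(x, z)

def elbo_grad(mu, log_sigma, n=1000):
    sigma = np.exp(log_sigma)
    z = rng.normal(mu, sigma, size=n)
    log_q = -0.5 * ((z - mu) / sigma) ** 2 - log_sigma - 0.5 * np.log(2 * np.pi)
    score_mu = (z - mu) / sigma**2                     # d log q / d mu
    score_ls = ((z - mu) ** 2) / sigma**2 - 1.0        # d log q / d log_sigma
    f = log_joint(z) - log_q                           # instantaneous ELBO term
    return np.mean(score_mu * f), np.mean(score_ls * f)

mu, log_sigma = 0.0, 0.0
for step in range(200):                                # plain SGD on the ELBO
    g_mu, g_ls = elbo_grad(mu, log_sigma)
    mu, log_sigma = mu + 0.05 * g_mu, log_sigma + 0.05 * g_ls
print(mu, np.exp(log_sigma))                           # should approach 3.0 and 1.0
```

In practice the estimator's variance is controlled with Rao-Blackwellization and control variates; the sketch omits those refinements.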
Doubly Stochastic Variational Bayes for non-Conjugate Inference
TLDR
A simple and effective variational inference algorithm based on stochastic optimisation is proposed that can be widely applied to Bayesian non-conjugate inference in continuous parameter spaces and allows for efficient use of gradient information from the model joint density.
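The gradient information from the model joint density enters through a reparameterization of the variational distribution, z = g_theta(eps); this is the standard identity behind such estimators, written in my own notation rather than quoted from the paper. The "doubly stochastic" name reflects noise from both Monte Carlo sampling and data subsampling.

```latex
% Stochastic ELBO gradient via reparameterization z = g_theta(eps), eps ~ p(eps);
% the model's own gradient with respect to z is used directly.
\nabla_{\theta}\, \mathbb{E}_{q_{\theta}(z)}\!\big[\log p(x, z) - \log q_{\theta}(z)\big]
 \;=\; \mathbb{E}_{p(\varepsilon)}\!\Big[\nabla_{\theta}\big(\log p(x, g_{\theta}(\varepsilon)) - \log q_{\theta}(g_{\theta}(\varepsilon))\big)\Big]
```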