An Auxiliary Variational Method

@inproceedings{Agakov2004AnAV,
  title={An Auxiliary Variational Method},
  author={Felix V. Agakov and David Barber},
  booktitle={ICONIP},
  year={2004}
}
Abstract

Variational methods have proved popular and effective for inference and learning in intractable graphical models. An attractive feature of the approaches based on the Kullback-Leibler divergence is the rigorous lower bound they place on the normalization constants of undirected models. In this work we explore the idea of using auxiliary variables to improve on the lower bound of standard mean field methods. Our approach forms a more powerful class of approximations than any structured mean field …
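To make the bound in question concrete, here is a minimal sketch in generic variational notation (the symbols x, a, q, and r are illustrative assumptions, not necessarily the paper's own). For an undirected model p(x) = \tilde{p}(x)/Z with Z = \sum_x \tilde{p}(x), one can introduce auxiliary variables a into the approximating distribution q(x, a), together with a variational conditional r(a \mid x); Jensen's inequality then gives

\log Z \;\ge\; \mathbb{E}_{q(x,a)}\!\left[ \log \frac{\tilde{p}(x)\, r(a \mid x)}{q(x, a)} \right]
\;=\; \mathbb{E}_{q(x)}\!\left[ \log \frac{\tilde{p}(x)}{q(x)} \right]
\;-\; \mathbb{E}_{q(x)}\!\left[ \mathrm{KL}\!\left( q(a \mid x) \,\middle\|\, r(a \mid x) \right) \right].

The first term is the standard Kullback-Leibler lower bound on \log Z evaluated at the marginal q(x) = \sum_a q(x, a); the second is a non-negative penalty that vanishes when r(a \mid x) matches q(a \mid x). The appeal of the construction is that the marginal q(x) can be a mixture richer than any tractable factorized family, while every expectation is still taken under the tractable joint q(x, a).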

Citations

Publications citing this paper (5 of 36 shown).

Importance Weighted Hierarchical Variational Inference

Variational f-divergence Minimization

Auxiliary Deep Generative Models

Energy-Inspired Models: Learning with Sampler-Induced Distributions

Exploiting Hierarchy for Learning and Transfer in KL-regularized RL

References

Publications referenced by this paper (3 of 17 shown).

A View of the EM Algorithm that Justifies Incremental, Sparse, and Other Variants

Improving the Mean Field Approximation Via the Use of Mixture Distributions

A new class of upper bounds on the log partition function
