Corpus ID: 15207178

Message passing with relaxed moment matching

@article{Qi2012MessagePW,
  title={Message passing with relaxed moment matching},
  author={Yuan Qi and Yandong Guo},
  journal={ArXiv},
  year={2012},
  volume={abs/1204.4166}
}
Bayesian learning is often hampered by large computational expense. As a powerful generalization of popular belief propagation, expectation propagation (EP) efficiently approximates the exact Bayesian computation. Nevertheless, EP can be sensitive to outliers and suffer from divergence for difficult cases. To address this issue, we propose a new approximate inference approach, relaxed expectation propagation (REP). It relaxes the moment matching requirement of expectation propagation by adding… 
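The abstract cuts off before stating the REP update itself, so the sketch below shows only the standard EP moment-matching loop that REP relaxes, on the classic 1-D clutter problem from Minka's thesis (the first reference below). All names here are illustrative, and the damping knob `step` is just a common softening heuristic, not the paper's actual relaxation.

```python
import numpy as np

def norm_pdf(x, mean, var):
    """Density of N(x; mean, var)."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def ep_clutter(xs, prior_var=100.0, w=0.2, clutter_var=10.0,
               n_sweeps=30, step=1.0):
    """EP for the 1-D clutter problem: theta ~ N(0, prior_var),
    x_i ~ (1 - w) N(theta, 1) + w N(0, clutter_var).
    Each likelihood factor is approximated by an unnormalized Gaussian
    site with natural parameters (tau_i, nu_i); step < 1 damps the
    site updates (an illustrative softening, not REP's relaxation)."""
    n = len(xs)
    tau = np.zeros(n)            # site precisions
    nu = np.zeros(n)             # site precision-times-mean
    prec = 1.0 / prior_var       # posterior natural parameters,
    mp = 0.0                     # initialized from the prior
    for _ in range(n_sweeps):
        for i, x in enumerate(xs):
            # 1. Cavity distribution: delete site i from q(theta)
            cprec, cmp = prec - tau[i], mp - nu[i]
            if cprec <= 0:
                continue         # skip updates that lose positive variance
            v, m = 1.0 / cprec, cmp / cprec
            # 2. Exact moments of the tilted distribution (closed form here)
            Z = (1 - w) * norm_pdf(x, m, v + 1.0) + w * norm_pdf(x, 0.0, clutter_var)
            r = (1 - w) * norm_pdf(x, m, v + 1.0) / Z   # P(x_i is not clutter)
            m_new = m + r * v * (x - m) / (v + 1.0)
            v_new = (v - r * v ** 2 / (v + 1.0)
                     + r * (1 - r) * (v * (x - m) / (v + 1.0)) ** 2)
            # 3. Moment matching: pick the site so q gets the tilted moments
            tau_i = 1.0 / v_new - cprec
            nu_i = m_new / v_new - cmp
            tau[i] = (1 - step) * tau[i] + step * tau_i
            nu[i] = (1 - step) * nu[i] + step * nu_i
            prec, mp = cprec + tau[i], cmp + nu[i]
    return mp / prec, 1.0 / prec   # posterior mean and variance of theta

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(2.0, 1.0, 40),             # inliers near theta = 2
                       rng.normal(0.0, np.sqrt(10.0), 10)])  # clutter points
print(ep_clutter(data))   # posterior mean near 2, small variance
```

With `step=1.0` this is plain EP; per the abstract, REP's contribution is precisely to replace the exact match in step 3 with a relaxed requirement that is more robust to outliers such as the clutter points above.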


References

Showing 1-10 of 13 references
A family of algorithms for approximate Bayesian inference
TLDR
This thesis presents an approximation technique that can perform Bayesian inference faster and more accurately than previously possible, and is found to be convincingly better than rival approximation techniques: Monte Carlo, Laplace's method, and variational Bayes.
Expectation Consistent Approximate Inference
TLDR
A novel free-energy-based framework for approximating intractable probabilistic models is proposed; it requires two tractable probability distributions that are made consistent on a set of moments and that encode different features of the original intractable distribution.
Divergence measures and message passing
This paper presents a unifying view of message-passing algorithms, as methods to approximate a complex Bayesian network by a simpler network with minimum information divergence. In this view, the difference between mean-field methods and belief propagation is not the amount of structure they model, but only the measure of loss they minimize ("exclusive" versus "inclusive" Kullback-Leibler divergence).
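Since minimizing the inclusive KL over an exponential family reduces to matching moments (exactly the EP step the main paper relaxes), a small numeric illustration may help. The bimodal target and grid search below are illustrative assumptions, not anything from the cited paper.

```python
import numpy as np

def gauss(x, m, s2):
    """Density of N(x; m, s2)."""
    return np.exp(-0.5 * (x - m) ** 2 / s2) / np.sqrt(2 * np.pi * s2)

xs = np.linspace(-8, 8, 4001)
dx = xs[1] - xs[0]
# Bimodal target: an equal mixture of N(-2, 0.5^2) and N(2, 0.5^2)
p = 0.5 * gauss(xs, -2, 0.25) + 0.5 * gauss(xs, 2, 0.25)

# Inclusive KL(p||q): for Gaussian q, minimized by moment matching
m_inc = np.sum(xs * p) * dx                 # mixture mean: 0
v_inc = np.sum((xs - m_inc) ** 2 * p) * dx  # mixture variance: ~4.25

# Exclusive KL(q||p): grid search; the optimum locks onto a single mode
best = min(((m, v) for m in np.linspace(-4, 4, 81)
                   for v in np.linspace(0.05, 6, 120)),
           key=lambda mv: np.sum(gauss(xs, *mv) *
                                 (np.log(gauss(xs, *mv) + 1e-300) -
                                  np.log(p + 1e-300))) * dx)
print(f"inclusive (moment matching): mean={m_inc:.2f}, var={v_inc:.2f}")
print(f"exclusive (mode seeking):    mean={best[0]:.2f}, var={best[1]:.2f}")
```

The inclusive fit spreads one broad Gaussian over both modes; the exclusive fit concentrates on a single mode, which is the qualitative difference the cited paper formalizes.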
Fractional Belief Propagation
TLDR
Fractional belief propagation is formulated in terms of a family of approximate free energies, which includes the Bethe free energy and the naive mean-field free energy as special cases; the scale parameters can be tuned using the linear response correction of the clique marginals.
CCCP Algorithms to Minimize the Bethe and Kikuchi Free Energies: Convergent Alternatives to Belief Propagation (A. Yuille, Neural Computation, 2002)
TLDR
A class of discrete iterative algorithms that are provably convergent alternatives to belief propagation (BP) and generalized belief propagation (GBP), with a wide range of inference and learning applications.
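For readers unfamiliar with the concave-convex procedure behind those algorithms, here is a minimal sketch of generic CCCP on a toy 1-D objective. It is not the Bethe/Kikuchi free-energy version from the paper, just the basic split-and-linearize step, with an objective chosen purely for illustration.

```python
import numpy as np

# CCCP for f(x) = u(x) + v(x) with u convex and v concave: linearize v
# at the current point and minimize the convex surrogate
#     u(x) + grad_v(x_t) * x,
# i.e. solve grad_u(x_{t+1}) = -grad_v(x_t). Each step can only
# decrease f (Yuille & Rangarajan's descent guarantee).

def cccp(grad_u_inverse, grad_v, x0, n_iters=50):
    """grad_u_inverse solves grad_u(x) = y for x."""
    x = x0
    for _ in range(n_iters):
        x = grad_u_inverse(-grad_v(x))
    return x

# Toy objective f(x) = x**4 - 2*x**2:
#   u(x) = x**4 (convex),   grad_u(x) = 4x^3,  grad_u_inverse(y) = cbrt(y/4)
#   v(x) = -2*x**2 (concave), grad_v(x) = -4x
x_star = cccp(lambda y: np.cbrt(y / 4.0), lambda x: -4.0 * x, x0=0.3)
print(x_star)   # converges to a minimizer of f, here x = 1
```

The same pattern, applied with the Bethe or Kikuchi free energy split into convex and concave parts, gives the double-loop algorithms of the cited paper.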
Soft Margins for AdaBoost
TLDR
It is found that AdaBoost asymptotically achieves a hard margin distribution, i.e., the algorithm concentrates its resources on a few hard-to-learn patterns which are, interestingly, very similar to support vectors.
Approximate inference techniques with expectation constraints
TLDR
A unified view of several recently proposed approximation schemes is presented; these schemes are shown to be related to Bethe free energies with weak consistency constraints, i.e., free energies in which local approximations are only required to agree on certain statistics rather than on full marginals.
Assessing Approximate Inference for Binary Gaussian Process Classification
TLDR
This work reviews Laplace's method and expectation propagation for approximate Bayesian inference in the binary Gaussian process classification model, and comprehensively compares the approximations' predictive performance and marginal likelihood estimates against results obtained by MCMC sampling.
Graphical Models, Exponential Families, and Variational Inference
TLDR
The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
Bayesian invariant measurements of generalisation for continuous distributions
TLDR
A family of generalisation measurements is proposed for estimators of continuous distributions; it covers neural network learning rules associated with continuous neural networks and is applied to the family of Gaussian distributions.