Corpus ID: 245502578

Reactive Message Passing for Scalable Bayesian Inference

@article{Bagaev2021ReactiveMP,
  title={Reactive Message Passing for Scalable Bayesian Inference},
  author={Dmitry Bagaev and Bert de Vries},
  journal={ArXiv},
  year={2021},
  volume={abs/2112.13251}
}
We introduce Reactive Message Passing (RMP) as a framework for executing schedule-free, robust and scalable message passing-based inference in a factor graph representation of a probabilistic model. RMP is based on the reactive programming style that only describes how nodes in a factor graph react to changes in connected nodes. The absence of a fixed message passing schedule improves robustness, scalability and execution time of the inference procedure. We also present ReactiveMP.jl, which is… 
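The reactive style described in the abstract — nodes react to changes in connected nodes, with no fixed schedule — can be illustrated with a minimal observer-pattern sketch. This is a hypothetical Python toy, not the ReactiveMP.jl API (which is written in Julia): each message slot is an observable cell, and a node's outbound message is recomputed whenever any inbound message changes.

```python
# Minimal sketch of schedule-free, reactive message passing.
# All names (Cell, react) are illustrative, not from ReactiveMP.jl.
class Cell:
    """An observable message slot: set() notifies all subscribers."""
    def __init__(self, value=None):
        self.value = value
        self._subs = []

    def subscribe(self, fn):
        self._subs.append(fn)

    def set(self, value):
        self.value = value
        for fn in self._subs:
            fn()

def react(inputs, output, compute):
    """Re-derive `output` from `inputs` whenever any input changes."""
    def update():
        if all(c.value is not None for c in inputs):
            output.set(compute(*[c.value for c in inputs]))
    for c in inputs:
        c.subscribe(update)

# Example: a Gaussian "sum" node z = x + y propagating (mean, variance)
# messages; posting a new message on x or y automatically updates z,
# and no message passing schedule is ever constructed.
x, y, z = Cell(), Cell(), Cell()
react([x, y], z, lambda mx, my: (mx[0] + my[0], mx[1] + my[1]))
x.set((0.0, 1.0))
y.set((2.0, 0.5))
# z.value is now (2.0, 1.5)
```

Because updates propagate only from changed cells, a changed observation re-triggers exactly the downstream computations that depend on it, which is the robustness and scalability argument the abstract makes.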
Active Inference and Epistemic Value in Graphical Models
TLDR
It is concluded that CBFE optimization by message passing suggests a general mechanism for epistemic-aware AIF in free-form generative models, with the CBFE agent incurring expected reward in significantly more environmental scenarios than an EFE agent.
AIDA: An Active Inference-Based Design Agent for Audio Processing Algorithms
TLDR
An active inference-based agent is presented that iteratively designs a personalized audio processing algorithm through situated interactions with a human client, together with a novel generative model for acoustic signals as a sum of time-varying auto-regressive filters and a user response model based on a Gaussian process classifier.

References

SHOWING 1-10 OF 43 REFERENCES
Residual Belief Propagation: Informed Scheduling for Asynchronous Message Passing
TLDR
RBP is proposed: a novel, easy-to-implement, asynchronous propagation algorithm that schedules messages in an informed way, pushing down a bound on the distance from the fixed point; experiments demonstrate the superiority of RBP over state-of-the-art methods on a variety of challenging synthetic and real-life problems.
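The scheduling idea summarized above can be sketched in a few lines: keep a residual for every directed message and always apply the pending update with the largest residual. The toy chain model and all names below are illustrative assumptions, not taken from the paper.

```python
# Residual belief propagation on a toy pairwise MRF: chain x0 - x1 - x2
# of binary variables. Potentials are made up for illustration.
import itertools

unary = {0: [0.7, 0.3], 1: [0.4, 0.6], 2: [0.5, 0.5]}
pair = {(0, 1): [[1.2, 0.4], [0.4, 1.2]],
        (1, 2): [[0.9, 0.6], [0.6, 0.9]]}
neighbors = {0: [1], 1: [0, 2], 2: [1]}

def psi(i, j, xi, xj):
    return pair[(i, j)][xi][xj] if (i, j) in pair else pair[(j, i)][xj][xi]

# Directed sum-product messages, initialised uniform.
msgs = {(i, j): [0.5, 0.5] for i in neighbors for j in neighbors[i]}

def new_msg(i, j):
    m = []
    for xj in (0, 1):
        s = 0.0
        for xi in (0, 1):
            p = unary[i][xi] * psi(i, j, xi, xj)
            for k in neighbors[i]:
                if k != j:
                    p *= msgs[(k, i)][xi]
            s += p
        m.append(s)
    z = sum(m)
    return [v / z for v in m]

def residual(e):
    return max(abs(a - b) for a, b in zip(new_msg(*e), msgs[e]))

def run_rbp(tol=1e-10, max_steps=1000):
    # Informed scheduling: always apply the update with the largest residual.
    res = {e: residual(e) for e in msgs}
    for _ in range(max_steps):
        e = max(res, key=res.get)
        if res[e] < tol:
            break
        i, j = e
        msgs[e] = new_msg(i, j)
        res[e] = 0.0
        # Only messages leaving j (except back toward i) depend on m_{i->j}.
        for k in neighbors[j]:
            if k != i:
                res[(j, k)] = residual((j, k))

def marginal(i):
    b = [unary[i][x] for x in (0, 1)]
    for k in neighbors[i]:
        b = [b[x] * msgs[(k, i)][x] for x in (0, 1)]
    z = sum(b)
    return [v / z for v in b]
```

On this loop-free toy graph BP is exact, so the residual-scheduled messages converge to marginals that match brute-force enumeration; the point of the residual schedule is that the same priority rule also accelerates convergence on loopy graphs.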
Reparameterization Gradient Message Passing
TLDR
Reparameterization Gradient Message Passing (RGMP) is introduced, a new message passing method based on the reparameterization gradient; it is argued that this kind of hybrid message passing leads naturally to low-variance gradients.
Extended Variational Message Passing for Automated Approximate Bayesian Inference
TLDR
It is shown that executing VMP in complex models relies on the ability to compute expectations of the statistics of hidden variables, and the proposed Extended VMP (EVMP) approach supports automated, efficient inference for a very wide range of probabilistic model specifications.
Robust Expectation Propagation in Factor Graphs Involving Both Continuous and Binary Variables
TLDR
The probit factor node for linking continuous and binary random variables in a factor graph is described and derived through constrained moment matching, which leads to a robust version of the EP algorithm in which all messages are guaranteed to be proper.
Variational Message Passing and its Applications
TLDR
This thesis develops Variational Message Passing, an algorithm for automatically performing variational inference in a probabilistic graphical model; as an analogue of belief propagation, it uses message passing within the graphical model to optimize an approximate variational distribution.
Unifying Message Passing Algorithms Under the Framework of Constrained Bethe Free Energy Minimization
TLDR
This article unifies variational message passing, belief propagation, and expectation propagation under a single optimization framework, namely Bethe free energy minimization with differently and appropriately imposed constraints, providing a theoretical framework from which message passing variants can be systematically derived.
Reactive probabilistic programming
TLDR
ProbZelus conservatively provides the facilities of a synchronous language for writing control software, extended with probabilistic constructs to model uncertainties and perform inference-in-the-loop; the design and implementation of the language are presented.
PushNet: Efficient and Adaptive Neural Message Passing
TLDR
This work considers a novel asynchronous message passing approach in which information is pushed only along the most relevant edges until convergence; it can equivalently be formulated as a single synchronous message passing iteration with a suitable neighborhood function, thus sharing the advantages of existing methods while addressing their central issues.
On Variational Message Passing on Factor Graphs
  • J. Dauwels
  • Computer Science
    2007 IEEE International Symposium on Information Theory
  • 2007
In this paper, it is shown how (naive and structured) variational algorithms may be derived from a factor graph by mechanically applying generic message computation rules; in this way, one can bypass…
Message Passing-Based Inference in the Gamma Mixture Model
TLDR
This paper presents two variants of variational message passing-based inference in a Gamma mixture model, using moment matching and, alternatively, expectation maximization to approximate the posterior distributions.
...