• Corpus ID: 220280768

# Belief Propagation Neural Networks

@article{Kuck2020BeliefPN,
title={Belief Propagation Neural Networks},
author={Jonathan Kuck and Shuvam Chakraborty and Hao Tang and Rachel Luo and Jiaming Song and Ashish Sabharwal and Stefano Ermon},
journal={ArXiv},
year={2020},
volume={abs/2007.00295}
}
• Published 1 July 2020
• Computer Science
• ArXiv
Learned neural solvers have successfully been used to solve combinatorial optimization and decision problems. More general counting variants of these problems, however, are still largely solved with hand-crafted solvers. To bridge this gap, we introduce belief propagation neural networks (BPNNs), a class of parameterized operators that operate on factor graphs and generalize Belief Propagation (BP). In its strictest form, a BPNN layer (BPNN-D) is a learned iterative operator that provably…
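As context for the operator BPNN generalizes, the following is a minimal sketch of vanilla sum-product belief propagation on a tiny factor graph: two binary variables with unary factors and a single pairwise factor. The function names and toy potentials are illustrative assumptions, not taken from the paper. On this tree-structured graph BP is exact, so the computed beliefs equal the true marginals.

```python
# Minimal sketch (illustrative, not the BPNN architecture): sum-product BP
# on a factor graph with binary variables x0, x1, unary factors f0, f1,
# and one pairwise factor f01.

def bp_marginals(f0, f1, f01, iters=10):
    # Messages from the pairwise factor to each variable, initialised uniform.
    m_to_x0 = [1.0, 1.0]
    m_to_x1 = [1.0, 1.0]
    for _ in range(iters):
        # Message to x0: sum over x1 of f01[x0][x1] * (unary evidence at x1).
        m_to_x0 = [sum(f01[a][b] * f1[b] for b in range(2)) for a in range(2)]
        # Message to x1: sum over x0 of f01[x0][x1] * (unary evidence at x0).
        m_to_x1 = [sum(f01[a][b] * f0[a] for a in range(2)) for b in range(2)]

    def normalize(p):
        z = sum(p)
        return [v / z for v in p]

    # Beliefs: unary factor times incoming message, normalised.
    b0 = normalize([f0[a] * m_to_x0[a] for a in range(2)])
    b1 = normalize([f1[b] * m_to_x1[b] for b in range(2)])
    return b0, b1

f0, f1 = [0.6, 0.4], [0.3, 0.7]
f01 = [[1.0, 0.5], [0.5, 1.0]]  # pairwise factor preferring agreement
b0, b1 = bp_marginals(f0, f1, f01)
```

A BPNN-D layer, as the abstract describes, replaces this fixed iterative update with a learned parameterized operator over the same factor-graph structure.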

## Figures and Tables from this paper

### Deep Attentive Belief Propagation: Integrating Reasoning and Learning for Solving Constraint Optimization Problems

• Computer Science
• 2022
This work proposes a novel self-supervised learning algorithm for DABP with a smoothed solution cost, which does not require expensive training labels and also avoids the common out-of-distribution issue through efficient online learning.

### Deep learning via message passing algorithms based on belief propagation

• Computer Science
Mach. Learn. Sci. Technol.
• 2022
This paper presents a family of BP-based message-passing algorithms with a reinforcement term that biases distributions towards locally entropic solutions, and adapts them to mini-batch training on GPUs; the algorithms are capable of training multi-layer neural networks with performance comparable to SGD heuristics in a diverse set of experiments on natural datasets.

### Variational message passing neural network for Maximum-A-Posteriori (MAP) inference

• Computer Science
UAI
• 2022
A variational message passing neural network (V-MPNN), where both the power of neural networks in modeling complex functions and the well-established algorithmic theories on variational belief propagation are leveraged.

### NSNet: A General Neural Probabilistic Framework for Satisfiability Problems

• Computer Science
• 2022
A general neural framework for solving satisfiability problems as probabilistic inference that outperforms BP and other neural baselines and achieves competitive results compared with state-of-the-art solvers.

### Graph Neural Networks for Propositional Model Counting

• Computer Science
ESANN 2022 proceedings
• 2022
This work presents an architecture based on the GNN framework for belief propagation of [15], extended with a self-attentive GNN and trained to approximately solve the #SAT problem; the model scales effectively to much larger problem sizes, with performance comparable to or better than that of state-of-the-art approximate solvers.

### A visual introduction to Gaussian Belief Propagation

• Computer Science
ArXiv
• 2021
This article presents a visual introduction to Gaussian Belief Propagation, an approximate probabilistic inference algorithm that operates by passing messages between the nodes of arbitrarily structured factor graphs and that has the right computational properties to act as a scalable distributed probabilistic inference framework for future machine learning systems.
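To make the Gaussian message passing concrete, here is a minimal sketch in information (precision) form on a two-variable chain; the toy model and function names are illustrative assumptions, not taken from the article. On this tree-structured graph the single message is exact, so the resulting belief matches the true posterior marginal.

```python
# Minimal sketch (illustrative): Gaussian BP in information form on a
# two-variable chain. Unary factors are N(mu_i, 1), i.e. precision lam_i = 1
# and information eta_i = lam_i * mu_i. The pairwise factor couples x0 and
# x1 with precision w: proportional to exp(-w/2 * (x0 - x1)**2).

def gabp_marginal_x1(mu0, mu1, w):
    lam0, lam1 = 1.0, 1.0                 # unary precisions
    eta0, eta1 = lam0 * mu0, lam1 * mu1   # unary information values
    # Message x0 -> x1 through the pairwise factor: marginalising x0 out
    # of the product of its unary factor and the pairwise factor
    # (a 1-D Schur complement in information form).
    p_msg = w * lam0 / (w + lam0)
    h_msg = w * eta0 / (w + lam0)
    # Belief at x1 combines its unary factor with the incoming message.
    prec = lam1 + p_msg
    info = eta1 + h_msg
    return info / prec, 1.0 / prec        # posterior mean and variance of x1

mean, var = gabp_marginal_x1(mu0=1.0, mu1=-1.0, w=2.0)
```

The same update, applied iteratively over all edges, is what scales this scheme to arbitrarily structured factor graphs.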

### Robust Deep Learning from Crowds with Belief Propagation

• Computer Science
AISTATS
• 2022
A neural-powered Bayesian framework is established, from which deepMF and deepBP are devised with different choices of variational approximation method, mean field (MF) and belief propagation (BP), respectively; this provides a unified view of existing methods, which are special cases of deepMF with different priors.

### Learning Feasibility of Factored Nonlinear Programs in Robotic Manipulation Planning

• Computer Science
ArXiv
• 2022
The model is trained with a dataset of labeled subgraphs of Factored-NLPs and, importantly, can make useful predictions on larger factored nonlinear programs than those seen during training, which is crucial for robotic manipulation planning.

### Neural Belief Propagation for Scene Graph Generation

• Computer Science
ArXiv
• 2021
A novel neural belief propagation method that employs a structural Bethe approximation rather than the mean field approximation to infer the associated marginals and achieves the state-of-the-art performance on various popular scene graph generation benchmarks.

### Equivariant Neural Network for Factor Graphs

• Computer Science
ArXiv
• 2021
This paper precisely characterizes these isomorphic properties of factor graphs and proposes two inference models, Factor-Equivariant Neural Belief Propagation (FE-NBP) and FE-GNN; FE-NBP is a neural network that generalizes BP and respects each of the above properties.

## References

SHOWING 1-10 OF 54 REFERENCES

### Fast Convergence of Belief Propagation to Global Optima: Beyond Correlation Decay

BP converges quickly to the global optimum of the Bethe free energy for Ising models on arbitrary graphs, as long as the Ising model is *ferromagnetic* (i.e., neighbors prefer to be aligned).

### Learning to Pass Expectation Propagation Messages

• Computer Science
NIPS
• 2013
This work studies whether it is possible to automatically derive fast and accurate EP updates by learning a discriminative model to map EP message inputs to EP message outputs, and provides empirical analysis on several challenging and diverse factors, indicating that there is a space of factors where this approach appears promising.

### Learning to Solve NP-Complete Problems - A Graph Neural Network for the Decision TSP

• Computer Science
AAAI
• 2019
This paper shows that GNNs can learn to solve the decision variant of the Traveling Salesperson Problem (TSP), a highly relevant NP-Complete problem.

### Adam: A Method for Stochastic Optimization

• Computer Science
ICLR
• 2015
This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.

### Constructing free-energy approximations and generalized belief propagation algorithms

• Computer Science
IEEE Transactions on Information Theory
• 2005
This work explains how to obtain region-based free energy approximations that improve the Bethe approximation, and corresponding generalized belief propagation (GBP) algorithms, and describes empirical results showing that GBP can significantly outperform BP.

### Learning to Reason: Leveraging Neural Networks for Approximate DNF Counting

• Computer Science
AAAI
• 2020
This paper proposes a neural model counting approach for weighted #DNF that combines approximate model counting with deep learning, and accurately approximates model counts in linear time when width is bounded.

### Hashing-Based Approximate Probabilistic Inference in Hybrid Domains

• Computer Science
UAI
• 2015
This work shows how probabilistic inference in hybrid domains can be put within reach of hashing-based WMC solvers and builds on a notion called weighted model integration, which is a strict generalization of WMC.

### Graphical Models, Exponential Families, and Variational Inference

• Computer Science
Found. Trends Mach. Learn.
• 2008
The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.

### Amortized Bethe Free Energy Minimization for Learning MRFs

• Computer Science
NeurIPS
• 2019
This work optimize a saddle-point objective deriving from the Bethe free energy approximation to the partition function, which requires no sampling, and can be efficiently computed even for very expressive MRFs.