# Optimized Realization of Bayesian Networks in Reduced Normal Form using Latent Variable Model

@article{Gennaro2021OptimizedRO, title={Optimized Realization of Bayesian Networks in Reduced Normal Form using Latent Variable Model}, author={Giovanni Di Gennaro and Amedeo Buonanno and Francesco Palmieri}, journal={Soft Comput.}, year={2021}, volume={25}, pages={7029-7040} }

Bayesian networks in their Factor Graph Reduced Normal Form are a powerful paradigm for implementing inference graphs. Unfortunately, the computational and memory costs of these networks may be considerable even for relatively small networks, and this is one of the main reasons why these structures have often been underused in practice. In this work, through a detailed algorithmic and structural analysis, various solutions for cost reduction are proposed. Moreover, an online version of the…
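In the Reduced Normal Form paradigm, inference proceeds by propagating forward and backward messages through single-input single-output (SISO) blocks defined by conditional probability matrices. The following is a minimal numerical sketch of sum-product message passing through one such block; the matrix `P` and the message values are invented for illustration and are not taken from the paper.

```python
import numpy as np

# One SISO block carrying the conditional distribution Pr(Y | X).
# P is row-stochastic: P[i, j] = Pr(Y = j | X = i).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

f_x = np.array([0.5, 0.5])   # forward message into X (uniform prior)
b_y = np.array([1.0, 0.0])   # backward message into Y (evidence: Y = 0)

f_y = f_x @ P                # forward message out of the block, toward Y
b_x = P @ b_y                # backward message out of the block, toward X

post_x = f_x * b_x           # product of incoming messages at X ...
post_x /= post_x.sum()       # ... normalized to the posterior Pr(X | Y = 0)

print(post_x)
```

The computational cost the paper targets comes from exactly these matrix-vector products: each message update through a block with m input and n output states costs O(mn), which grows quickly with variable cardinalities.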

#### Supplemental Code

GitHub repository: a C++ library for the optimized design of Bayesian networks using the FGrn paradigm.

#### One Citation

A Unified View of Algorithms for Path Planning Using Probabilistic Inference on Factor Graphs

- Computer Science, Mathematics
- ArXiv
- 2021

This work starts by posing the path planning problem on a probabilistic factor graph, shows how the various algorithms translate into specific message composition rules, and provides a very general framework that includes the Sum-product, the Max-product, Dynamic Programming, and mixed Reward/Entropy criterion-based algorithms.
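The unification described above rests on a single change in the message composition rule: summing over the previous state yields Sum-product (marginal inference), while maximizing yields Max-product (most likely path, as in dynamic programming). A hedged sketch with an invented transition matrix:

```python
import numpy as np

# T[i, j]: score of moving from state i to state j (illustrative values).
T = np.array([[0.7, 0.3],
              [0.4, 0.6]])

m_in = np.array([0.5, 0.5])  # incoming message over the current state

# Sum-product rule: marginalize (sum out) the current state.
m_sum = m_in @ T

# Max-product rule: keep only the best-scoring predecessor per state.
m_max = (m_in[:, None] * T).max(axis=0)

print(m_sum, m_max)
```

Both rules multiply the incoming message into the factor; only the reduction operator (sum vs. max) differs, which is why the algorithms fit one framework.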

#### References

Showing 1-10 of 30 references.

Computational Optimization for Normal Form Realization of Bayesian Model Graphs

- Computer Science
- 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP)
- 2018

New algorithms are proposed and a library is created that allows a significant reduction in costs with respect to direct use of the standard sum-product and Maximum Likelihood (ML) learning.

A Comparison of Algorithms for Learning Hidden Variables in Bayesian Factor Graphs in Reduced Normal Form

- Computer Science, Mathematics
- IEEE Transactions on Neural Networks and Learning Systems
- 2016

Factor graphs in reduced normal form provide an appealing framework for rapid deployment of Bayesian directed graphs in applications; the proposed learning rule is compared with two other updating equations, one based on localized decisions and one on a variational approximation.

Towards Building Deep Networks with Bayesian Factor Graphs

- Computer Science
- ArXiv
- 2015

A Multi-Layer Network based on the Bayesian framework of Factor Graphs in Reduced Normal Form applied to a two-dimensional lattice is proposed, and demonstrated in a three-layer structure applied to images extracted from a standard data set.

Latent Variable Models

- 1999

A powerful approach to probabilistic modelling involves supplementing a set of observed variables with additional latent, or hidden, variables. By defining a joint distribution over visible and…
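The basic construction behind latent variable models can be sketched numerically: define a joint over a hidden variable H and a visible variable V, then obtain the marginal over V by summing H out, and the posterior over H by Bayes' rule. All distributions below are invented for illustration.

```python
import numpy as np

p_h = np.array([0.6, 0.4])            # prior over the latent variable H
p_v_given_h = np.array([[0.9, 0.1],   # Pr(V | H = 0)
                        [0.3, 0.7]])  # Pr(V | H = 1)

# Marginal over the visible variable: Pr(V) = sum_h Pr(h) Pr(V | h).
p_v = p_h @ p_v_given_h

# Posterior over the latent variable given an observation V = 0.
p_h_given_v0 = p_h * p_v_given_h[:, 0] / p_v[0]

print(p_v, p_h_given_v0)
```

This sum over the hidden variable is the same operation the FGrn SISO blocks perform during belief propagation, which is why latent variable models embed naturally in that framework.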

Discrete independent component analysis (DICA) with belief propagation

- Computer Science, Mathematics
- 2015 IEEE 25th International Workshop on Machine Learning for Signal Processing (MLSP)
- 2015

The results show that the factorial code implemented by the sources contributes to build a good generative model for the data that can be used in various inference modes.

Two-dimensional multi-layer Factor Graphs in Reduced Normal Form

- Computer Science
- 2015 International Joint Conference on Neural Networks (IJCNN)
- 2015

A multi-layer architecture using the Bayesian framework of Factor Graphs in Reduced Normal Form that implements a hierarchical data representation; via belief propagation, it can be used for learning and inference in pattern completion, correction, and classification.

Probabilistic Graphical Models - Principles and Techniques

- Computer Science
- 2009

The framework of probabilistic graphical models, presented in this book, provides a general approach for causal reasoning and decision making under uncertainty, allowing interpretable models to be constructed and then manipulated by reasoning algorithms.

An introduction to factor graphs

- Computer Science
- IEEE Signal Processing Magazine
- 2004

This work uses Forney-style factor graphs, which support hierarchical modeling and are compatible with standard block diagrams, and uses them to derive practical detection/estimation algorithms in a wide area of applications.

Belief propagation and learning in convolution multi-layer factor graphs

- Computer Science, Mathematics
- 2014 4th International Workshop on Cognitive Information Processing (CIP)
- 2014

This work presents an approach to learning a layered factor graph architecture, starting from a stationary latent model for each layer of convolution multi-layer factor graphs.

Codes on graphs: Normal realizations

- Mathematics, Computer Science
- IEEE Trans. Inf. Theory
- 2001

Any state realization of a code can be put into normal form without essential change in the corresponding graph or in its decoding complexity; this fundamental result has many applications, including to dual state spaces, dual minimal trellises, duals to Tanner (1981) graphs, dual input/output (I/O) systems, and dual kernel and image representations.