Optimized Realization of Bayesian Networks in Reduced Normal Form using Latent Variable Model

@article{Gennaro2021OptimizedRO,
  title={Optimized Realization of Bayesian Networks in Reduced Normal Form using Latent Variable Model},
  author={Giovanni Di Gennaro and Amedeo Buonanno and Francesco Palmieri},
  journal={Soft Comput.},
  year={2021},
  volume={25},
  pages={7029--7040}
}
Bayesian networks in their Factor Graph Reduced Normal Form are a powerful paradigm for implementing inference graphs. Unfortunately, the computational and memory costs of these networks may be considerable even for relatively small models, which is one of the main reasons why these structures have often been underused in practice. In this work, through a detailed algorithmic and structural analysis, various solutions for cost reduction are proposed. Moreover, an online version of the…
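
As a concrete illustration of the message passing whose cost the abstract refers to, here is a minimal Python sketch (not the authors' library; the variable sizes, the random conditional matrix, and all names are illustrative assumptions) of forward and backward sum-product messages through a single SISO block with discrete variables:

# Minimal sketch, assuming discrete variables and a single SISO block of a
# Factor Graph in Reduced Normal Form; P, nx, ny are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

nx, ny = 4, 3                      # sizes of the input/output variable alphabets
P = rng.random((nx, ny))           # P[i, j] ~ P(Y = j | X = i)
P /= P.sum(axis=1, keepdims=True)  # make each row a valid distribution

def forward(f_x):
    """Forward message on Y: sum-product rule f_y(j) = sum_i f_x(i) P(i, j)."""
    f_y = f_x @ P
    return f_y / f_y.sum()         # normalize for numerical stability

def backward(b_y):
    """Backward message on X: b_x(i) = sum_j P(i, j) b_y(j)."""
    b_x = P @ b_y
    return b_x / b_x.sum()

f_x = np.ones(nx) / nx             # uninformative forward message on X
b_y = np.array([1.0, 0.0, 0.0])    # evidence: Y observed in state 0

print("f_y =", forward(f_x))
print("b_x =", backward(b_y))      # posterior-proportional message on X

Each such message costs on the order of nx * ny multiplications, and these matrix-vector products are the kind of operation whose number and size the paper's optimizations aim to reduce.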
A Unified View of Algorithms for Path Planning Using Probabilistic Inference on Factor Graphs
This work poses the path planning problem on a probabilistic factor graph and shows how the various algorithms translate into specific message composition rules, providing a very general framework that includes the Sum-product, the Max-product, Dynamic programming, and mixed Reward/Entropy criteria-based algorithms.

References

Computational Optimization for Normal Form Realization of Bayesian Model Graphs
New algorithms are proposed, and a library is created that allows a significant reduction in costs with respect to direct use of the standard Sum-product and Maximum Likelihood (ML) learning algorithms.
A Comparison of Algorithms for Learning Hidden Variables in Bayesian Factor Graphs in Reduced Normal Form
  • F. Palmieri
  • Computer Science, Mathematics
  • IEEE Transactions on Neural Networks and Learning Systems
  • 2016
Factor graphs in reduced normal form provide an appealing framework for rapid deployment of Bayesian directed graphs in applications; the proposed learning rule is compared with two other updating equations, based on localized decisions and on a variational approximation.
Towards Building Deep Networks with Bayesian Factor Graphs
A multi-layer network based on the Bayesian framework of the Factor Graphs in Reduced Normal Form applied to a two-dimensional lattice is proposed and demonstrated in a three-layer structure applied to images extracted from a standard data set.
Latent Variable Models
A powerful approach to probabilistic modelling involves supplementing a set of observed variables with additional latent, or hidden, variables. By defining a joint distribution over visible and…
Discrete independent component analysis (DICA) with belief propagation
  • F. Palmieri, A. Buonanno
  • Computer Science, Mathematics
  • 2015 IEEE 25th International Workshop on Machine Learning for Signal Processing (MLSP)
  • 2015
The results show that the factorial code implemented by the sources contributes to building a good generative model for the data, which can be used in various inference modes.
Two-dimensional multi-layer Factor Graphs in Reduced Normal Form
A multi-layer architecture using the Bayesian framework of the Factor Graphs in Reduced Normal Form implements a hierarchical data representation that, via belief propagation, can be used for learning and inference in pattern completion, correction, and classification.
Probabilistic Graphical Models - Principles and Techniques
The framework of probabilistic graphical models, presented in this book, provides a general approach for causal reasoning and decision making under uncertainty, allowing interpretable models to be constructed and then manipulated by reasoning algorithms.
An introduction to factor graphs
  • H. Loeliger
  • Computer Science
  • IEEE Signal Processing Magazine
  • 2004
This work uses Forney-style factor graphs, which support hierarchical modeling and are compatible with standard block diagrams, to derive practical detection/estimation algorithms in a wide range of applications.
Belief propagation and learning in convolution multi-layer factor graphs
  • F. Palmieri, A. Buonanno
  • Computer Science, Mathematics
  • 2014 4th International Workshop on Cognitive Information Processing (CIP)
  • 2014
This work presents an approach to learning a layered factor graph architecture starting from a stationary latent model for each layer of convolution multi-layer graphs.
Codes on graphs: Normal realizations
  • G. Forney
  • Mathematics, Computer Science
  • IEEE Trans. Inf. Theory
  • 2001
Any state realization of a code can be put into normal form without essential change in the corresponding graph or in its decoding complexity; this fundamental result has many applications, including to dual state spaces, dual minimal trellises, duals to Tanner (1981) graphs, dual input/output (I/O) systems, and dual kernel and image representations.