Discrete independent component analysis (DICA) with belief propagation

@inproceedings{Palmieri2015DiscreteIC,
  title={Discrete independent component analysis (DICA) with belief propagation},
  author={Francesco Palmieri and Amedeo Buonanno},
  booktitle={2015 IEEE 25th International Workshop on Machine Learning for Signal Processing (MLSP)},
  year={2015},
  pages={1-6}
}
Abstract

We apply belief propagation to a Bayesian bipartite graph composed of discrete independent hidden variables and discrete visible variables. The network is the Discrete counterpart of Independent Component Analysis (DICA) and it is manipulated in a factor graph form for inference and learning. A full set of simulations is reported for character images from the MNIST dataset. The results show that the factorial code implemented by the sources contributes to build a good generative model for the…
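To make the inference step concrete, here is a minimal sum-product sketch in the spirit of the abstract. It is not the authors' code: the model (two independent discrete sources feeding a single observed visible variable through an assumed conditional probability table) and all names, sizes, and priors are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

K1, K2, M = 3, 4, 5           # assumed alphabet sizes for sources S1, S2 and visible X
p_s1 = np.full(K1, 1.0 / K1)  # independent (here uniform) priors on the hidden sources
p_s2 = np.full(K2, 1.0 / K2)

cpt = rng.random((K1, K2, M))          # assumed CPT: P(x | s1, s2)
cpt /= cpt.sum(axis=2, keepdims=True)  # normalize over x

x_obs = 2  # index of the observed visible symbol

# One sum-product step, exact here because this tiny graph is a tree:
# the factor sends each source a likelihood message obtained by summing
# out the other source against its prior.
lik = cpt[:, :, x_obs]     # P(x_obs | s1, s2), shape (K1, K2)
msg_to_s1 = lik @ p_s2     # sum over s2 -> message to S1, shape (K1,)
msg_to_s2 = p_s1 @ lik     # sum over s1 -> message to S2, shape (K2,)

# Posterior beliefs: prior times incoming message, renormalized.
post_s1 = p_s1 * msg_to_s1
post_s1 /= post_s1.sum()
post_s2 = p_s2 * msg_to_s2
post_s2 /= post_s2.sum()

print("P(S1 | X=x):", post_s1)
print("P(S2 | X=x):", post_s2)

On larger bipartite graphs the same message-passing pattern is iterated (loopy belief propagation) rather than applied once, and learning updates the CPT entries from the resulting beliefs.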
Citations

Computational Optimization for Normal Form Realization of Bayesian Model Graphs
New algorithms are proposed, and a library is created that allows a significant reduction in costs with respect to direct use of the standard sum-product and maximum-likelihood (ML) learning algorithms.
Optimized Realization of Bayesian Networks in Reduced Normal Form using Latent Variable Model
Through a detailed algorithmic and structural analysis, various solutions for cost reduction are proposed, and an online version of the classic batch learning algorithm is analyzed, showing very similar results in an unsupervised context, which is essential if multilevel structures are to be built.
A Comparison of Algorithms for Learning Hidden Variables in Bayesian Factor Graphs in Reduced Normal Form
  • F. Palmieri
  • Computer Science, Mathematics
  • IEEE Transactions on Neural Networks and Learning Systems
  • 2016
Factor graphs in reduced normal form provide an appealing framework for rapid deployment of Bayesian directed graphs in applications; the learning algorithm is compared with two other updating equations, based on localized decisions and on a variational approximation.
Context Analysis Using a Bayesian Normal Graph
This work demonstrates how the Latent Variable Model, expressed as a Factor Graph in Reduced Normal Form, can manage contextual information to support a scene understanding task.

References

Showing 1-10 of 24 references
A Comparison of Algorithms for Learning Hidden Variables in Normal Graphs
This paper provides the programmer with explicit algorithms for rapid deployment of Bayesian graphs in applications, derived from a constrained maximum-likelihood (ML) formulation and from a minimum KL-divergence criterion using KKT conditions.
Discrete Component Analysis
A unified theory for the analysis of components in discrete data is presented, and the methods are compared with techniques such as independent component analysis, non-negative matrix factorisation, and latent Dirichlet allocation.
Learning Non-Linear Functions With Factor Graphs
  • F. Palmieri
  • Computer Science, Mathematics
  • IEEE Transactions on Signal Processing
  • 2013
A scheme for embedding soft quantization in a probabilistic Bayesian graph, which can easily merge discrete and continuous variables, is proposed and demonstrated with examples and simulations.
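As a generic illustration of the idea (a hedged sketch, not the scheme from the cited paper): a continuous value can be "softly" assigned to discrete bins by turning distances to bin centers into a probability vector, which can then enter a discrete factor graph as a message. The bin centers and kernel width below are illustrative assumptions.

import numpy as np

def soft_quantize(x, centers, sigma=0.5):
    # Gaussian-kernel responsibilities over the bins; subtracting the max
    # before exponentiating keeps the softmax numerically stable.
    logits = -((x - centers) ** 2) / (2.0 * sigma ** 2)
    p = np.exp(logits - logits.max())
    return p / p.sum()

centers = np.linspace(-1.0, 1.0, 5)  # 5 assumed bin centers on [-1, 1]
print(soft_quantize(0.3, centers))   # soft assignment instead of a hard argmin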
Simulink Implementation of Belief Propagation in Normal Factor Graphs
A Simulink library for rapid prototyping of belief network architectures using Forney-style Factor Graphs is presented. Our approach allows complex architectures to be drawn in a fairly easy way, giving to…
Independent Component Analysis
The standardization of the IC model is discussed, and on the basis of n independent copies of x, the aim is to find an estimate of an unmixing matrix Γ such that Γx has independent components.
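In standard notation (the textbook ICA formulation, not a claim about this reference's exact derivation), the model behind this summary is

\[
x = A s, \qquad s_1, \dots, s_d \ \text{mutually independent}, \qquad \hat{s} = \Gamma x, \qquad \Gamma \approx A^{-1},
\]

with Γ estimated from the n observed copies of x, and recovery of the components holding only up to permutation and scaling.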
Independent Factor Analysis
  • H. Attias
  • Mathematics, Medicine
  • Neural Computation
  • 1999
An expectation-maximization (EM) algorithm is presented, which performs unsupervised learning of an associated probabilistic model of the mixing situation and is shown to be superior to ICA since it can learn arbitrary source densities from the data.
An introduction to factor graphs
  • H. Loeliger
  • Computer Science
  • IEEE Signal Processing Magazine
  • 2004
This work uses Forney-style factor graphs, which support hierarchical modeling and are compatible with standard block diagrams, to derive practical detection/estimation algorithms in a wide range of applications.
Statistical Models of Natural Images and Cortical Visual Representation
This work has shown that linear sparse coding, which is equivalent to independent component analysis (ICA), provides a very good description of the receptive fields of simple cells and can be applied to the response properties of neurons.
Graphical Models
Statistical applications in fields such as bioinformatics, information retrieval, speech processing, image processing and communications often involve large-scale models in which thousands or…
Natural Image Statistics - A Probabilistic Approach to Early Computational Vision
This book is the first comprehensive introduction to the multidisciplinary field of natural image statistics and explains both the basic theory and the most recent advances in a coherent and user-friendly manner.