d-VMP: Distributed Variational Message Passing

@inproceedings{Masegosa2016dVMPDV,
  title={d-VMP: Distributed Variational Message Passing},
  author={Andr{\'e}s R. Masegosa and Ana M. Mart{\'i}nez and Helge Langseth and Thomas D. Nielsen and Antonio Salmer{\'o}n and Dar{\'i}o Ramos-L{\'o}pez and Anders L. Madsen},
  booktitle={Probabilistic Graphical Models},
  year={2016}
}
Motivated by a real-world financial dataset, we propose a distributed variational message passing scheme for learning conjugate exponential models. We show that the method can be seen as a projected natural gradient ascent algorithm, and it therefore has good convergence properties. This is supported experimentally: the approach is robust with respect to common problems such as imbalanced data, heavy-tailed empirical distributions, and a high degree of missing values. The scheme is based…
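To give a feel for the natural gradient view of variational learning in conjugate exponential models, here is a minimal sketch, not the authors' d-VMP algorithm, using a toy Gaussian-mean model with data partitioned across workers. Each worker only contributes local sufficient statistics, and a coordinator takes natural-gradient steps in natural-parameter space; the function names and the (precision-weighted mean, precision) parameterization are illustrative assumptions.

```python
import numpy as np

# Toy conjugate setting: Gaussian observations with known noise variance and a
# Gaussian prior on the mean. For such models the natural gradient of the ELBO
# with respect to q's natural parameters is (prior + data statistics) - current
# parameters, so a full step recovers the exact conjugate update and smaller
# steps illustrate the iterative scheme.

def local_statistics(x_shard, noise_var):
    """Each worker summarises its shard as (sum(x)/sigma^2, count/sigma^2),
    the sufficient statistics needed for the global update."""
    return np.array([x_shard.sum() / noise_var, len(x_shard) / noise_var])

def natural_gradient_step(eta, prior_eta, stats_sum, step_size):
    """Natural-gradient update in natural-parameter space."""
    grad = prior_eta + stats_sum - eta
    return eta + step_size * grad

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=10_000)
shards = np.array_split(data, 8)       # data distributed over 8 workers

noise_var = 1.0
prior_eta = np.array([0.0, 1.0])       # N(0, 1) prior as (mean/var, 1/var)
eta = prior_eta.copy()                 # initialise q at the prior

for _ in range(20):
    stats = sum(local_statistics(s, noise_var) for s in shards)        # map
    eta = natural_gradient_step(eta, prior_eta, stats, step_size=0.5)  # reduce

post_var = 1.0 / eta[1]
post_mean = eta[0] * post_var
print(f"posterior mean ~ {post_mean:.3f}, posterior variance ~ {post_var:.2e}")
```

With a step size of 1.0 the update converges in a single iteration, which mirrors why natural-gradient schemes of this kind inherit good convergence properties; the distributed aspect enters only through the map-reduce accumulation of local statistics.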
