Corpus ID: 227127345

Lightweight Data Fusion with Conjugate Mappings

Christopher Dean, Stephen J. Lee, Jason L. Pacheco, John W. Fisher III
We present an approach to data fusion that combines the interpretability of structured probabilistic graphical models with the flexibility of neural networks. The proposed method, lightweight data fusion (LDF), emphasizes posterior analysis over latent variables using two types of information: primary data, which are well-characterized but with limited availability, and auxiliary data, readily available but lacking a well-characterized statistical relationship to the latent quantity of interest… 
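The conjugate mappings in the title build on classical conjugate Bayesian updating, in which a well-characterized likelihood admits a closed-form posterior. A minimal sketch of that building block, using a Beta-Bernoulli pair (the function name and choice of model are illustrative, not from the paper):

```python
import numpy as np

def beta_bernoulli_update(a, b, observations):
    """Closed-form conjugate update: Beta(a, b) prior on a Bernoulli
    rate yields a Beta posterior after binary observations."""
    successes = int(np.sum(observations))
    failures = len(observations) - successes
    return a + successes, b + failures

rng = np.random.default_rng(0)
# "Primary" data: limited, but with a well-characterized likelihood.
data = rng.binomial(1, 0.7, size=100)
a_post, b_post = beta_bernoulli_update(1.0, 1.0, data)
posterior_mean = a_post / (a_post + b_post)
```

LDF's contribution lies in how auxiliary data without such a characterized likelihood are tied to the same latent variables; this snippet only shows the conjugate machinery that makes the posterior analysis tractable.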
Lighting the Way: Nighttime Lights for Electrification Planning
Nighttime lights (NTL) satellite imagery provides a unique perspective on human activity (Fig. 1). This data is particularly useful because it reflects consistent measurements, is often very low or…


Bayesian Deep Net GLM and GLMM
This work describes flexible versions of generalized linear and generalized linear mixed models whose basis functions are formed by a deep feedforward neural network (DFNN), and proposes natural gradient methods for the optimization that exploit the factor structure of the variational covariance matrix when computing the natural gradient.
Auto-Encoding Variational Bayes
A stochastic variational inference and learning algorithm is introduced that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case.
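The key device enabling that algorithm is the reparameterization trick: instead of sampling z ~ N(mu, sigma²) directly, draw eps ~ N(0, 1) and set z = mu + sigma·eps, so samples become a deterministic, differentiable function of the variational parameters. A minimal numpy sketch (names are illustrative):

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """Draw z ~ N(mu, exp(log_var)) as a differentiable function of
    (mu, log_var): z = mu + sigma * eps with eps ~ N(0, 1)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

rng = np.random.default_rng(42)
mu = np.array([0.0, 2.0])
log_var = np.array([0.0, 0.0])  # unit variance in both dimensions

# Empirically, the transformed samples match the target distribution.
samples = np.stack([reparameterize(mu, log_var, rng) for _ in range(10000)])
```

In an actual VAE the gradient of the ELBO flows through this transformation into the encoder parameters; numpy only illustrates the sampling identity itself.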
Composing graphical models with neural networks for structured representations and fast inference
A general modeling and inference framework that composes probabilistic graphical models with deep learning methods and combines their respective strengths is proposed, giving a scalable algorithm that leverages stochastic variational inference, natural gradients, graphical model message passing, and the reparameterization trick.
Towards a Neural Statistician
An extension of the variational autoencoder is demonstrated that learns to compute representations, or statistics, of datasets in an unsupervised fashion; the learned statistics can be used for clustering datasets, transferring generative models to new datasets, selecting representative samples, and classifying previously unseen classes.
Variational Learning in Nonlinear Gaussian Belief Networks
This article presents a general variational method that maximizes a lower bound on the likelihood of a training set and gives results on two visual feature extraction problems.
Generative Adversarial Nets
We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a discriminative model D that estimates the probability that a sample came from the training data rather than G.
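The adversarial process described here corresponds to the standard two-player minimax game over a value function (standard formulation, not quoted from this snippet):

```latex
\min_G \max_D \; V(D, G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[\log D(x)\right]
  + \mathbb{E}_{z \sim p_z}\!\left[\log\!\left(1 - D(G(z))\right)\right]
```

At the optimum of this game, the generator's distribution matches the data distribution and D outputs 1/2 everywhere.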
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
A new theoretical framework is developed casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, which mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.
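In practice this result is used as Monte Carlo dropout: keep dropout active at prediction time, run many stochastic forward passes, and treat their spread as an uncertainty estimate. A minimal sketch with a one-layer toy network (the network, names, and dropout rate are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def forward(x, W, rng, p_drop=0.5):
    """One stochastic forward pass with dropout left ON at test time."""
    h = np.maximum(0.0, x @ W)           # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop  # random dropout mask
    return (h * mask).sum(axis=1) / (1.0 - p_drop)  # inverted-dropout scaling

rng = np.random.default_rng(1)
W = rng.standard_normal((3, 16))
x = rng.standard_normal((5, 3))

# Average many stochastic passes; their spread estimates uncertainty.
passes = np.stack([forward(x, W, rng) for _ in range(200)])
pred_mean = passes.mean(axis=0)  # predictive mean per input
pred_std = passes.std(axis=0)    # per-input uncertainty estimate
```

The paper's contribution is showing that this procedure approximates the predictive distribution of a deep Gaussian process; the sketch only demonstrates the mechanics.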
Graphical Models, Exponential Families, and Variational Inference
The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
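The variational alternative works by turning inference into optimization: the log evidence decomposes into an evidence lower bound (ELBO) plus a KL gap, so maximizing the ELBO over a tractable family q(z) approximates the posterior. In standard notation:

```latex
\log p(x) \;=\;
  \underbrace{\mathbb{E}_{q(z)}\!\left[\log \frac{p(x, z)}{q(z)}\right]}_{\text{ELBO}}
  \;+\; \mathrm{KL}\!\left(q(z) \,\middle\|\, p(z \mid x)\right)
```

Since the KL term is nonnegative, the ELBO lower-bounds \(\log p(x)\), and maximizing it simultaneously tightens the bound and drives q toward the true posterior.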
Multimodal learning with deep Boltzmann machines
A Deep Boltzmann Machine is proposed for learning a generative model of multimodal data and it is shown that the model can be used to create fused representations by combining features across modalities, which are useful for classification and information retrieval.
Statistical and Information-Theoretic Methods for Self-Organization and Fusion of Multimodal, Networked Sensors
A statistical viewpoint of networks of low-power and low-cost sensors is adopted, and it is shown how the fusion method previously developed can be used to find correspondences between pairs of long-baseline sensors.