Corpus ID: 174802676

Combining Generative and Discriminative Models for Hybrid Inference

@article{Satorras2019CombiningGA,
  title={Combining Generative and Discriminative Models for Hybrid Inference},
  author={Victor Garcia Satorras and Zeynep Akata and Max Welling},
  journal={ArXiv},
  year={2019},
  volume={abs/1906.02547}
}
A graphical model is a structured representation of the data generating process. The traditional method to reason over random variables is to perform inference in this graphical model. However, in many cases the generating process is only a poor approximation of the much more complex true data generating process, leading to suboptimal estimation. The subtleties of the generative process are however captured in the data itself and we can 'learn to infer', that is, learn a direct mapping from…
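
The central idea lends itself to a compact sketch. Below is a minimal numpy illustration for a linear-Gaussian state-space model: each refinement step combines the gradient of the graphical model's log-joint with a learned correction. The correction `phi` is a hypothetical stand-in for the paper's GNN messages; this illustrates the hybrid update, not the authors' exact implementation.

```python
import numpy as np

def hybrid_inference_step(x, y, F, H, Q_inv, R_inv, phi, lr=0.01):
    """One refinement of a state trajectory x (shape (T, d)) given
    observations y (shape (T, m)) in a linear-Gaussian state-space model.
    The generative half follows the gradient of log p(x, y); the
    discriminative half `phi` stands in for a learned (e.g. GNN) message.
    """
    T = x.shape[0]
    grad = np.zeros_like(x)
    for t in range(T):
        # observation potential: d/dx_t log N(y_t | H x_t, R)
        grad[t] += H.T @ R_inv @ (y[t] - H @ x[t])
        if t > 0:      # transition into x_t: d/dx_t log N(x_t | F x_{t-1}, Q)
            grad[t] -= Q_inv @ (x[t] - F @ x[t - 1])
        if t < T - 1:  # transition out of x_t: d/dx_t log N(x_{t+1} | F x_t, Q)
            grad[t] += F.T @ Q_inv @ (x[t + 1] - F @ x[t])
    return x + lr * (grad + phi(x, y))
```

With `phi = lambda x, y: np.zeros_like(x)` this reduces to plain gradient-based smoothing in the assumed graphical model; training the learned half lets the combined update compensate for model misspecification.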

Citations

Self-Supervised Hybrid Inference in State-Space Models

TLDR
Despite the model’s simplicity, it obtains competitive results on the chaotic Lorenz system compared to a fully supervised approach and outperforms a method based on variational inference.

Inference from Stationary Time Sequences via Learned Factor Graphs

TLDR
An inference algorithm based on learned stationary factor graphs, referred to as StaSPNet, is presented; it learns to implement the sum-product scheme from labeled data and can be applied to sequences of different lengths.

Neural Enhanced Belief Propagation on Factor Graphs

TLDR
This work proposes a new hybrid model that runs an FG-GNN conjointly with belief propagation, applies the idea to error-correction decoding tasks, and shows that the algorithm can outperform belief propagation for LDPC codes on bursty channels.

Self-Supervised Inference in State-Space Models

TLDR
This work performs approximate inference in state-space models with nonlinear state transitions using a local linearity approximation parameterized by neural networks, accompanied by a maximum likelihood objective that requires no supervision via uncorrupted observations or ground-truth latent states.
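
As a rough illustration of the local-linearity idea, a neural transition function can be linearized around the current state estimate with autograd, EKF-style. This is a generic sketch, not the cited work's exact parameterization.

```python
import torch

def linearize_transition(f, x):
    """EKF-style local linearization of a neural transition f at state x:
    f(x') ≈ f(x) + J @ (x' - x), where J is the Jacobian of f at x."""
    J = torch.autograd.functional.jacobian(f, x)
    return f(x), J
```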

Adversarially-learned Inference via an Ensemble of Discrete Undirected Graphical Models

TLDR
This work proposes an inference-agnostic adversarial training framework for producing an ensemble of graphical models (AGMs), which is optimized to generate data, and inference is learned as a by-product of this endeavor.

Neural Structured Prediction for Inductive Node Classification

TLDR
Inspired by the underlying connection between joint and marginal distributions in Markov networks, this paper proposes to solve an approximate version of the optimization problem as a proxy, which yields a near-optimal solution and makes learning more efficient.

Hybrid Predictive Coding: Inferring, Fast and Slow

TLDR
This work proposes a hybrid predictive coding network that combines both iterative and amortized inference in a principled manner by describing both in terms of a dual optimization of a single objective function and demonstrates that the resulting scheme can be implemented in a biologically plausible neural architecture that approximates Bayesian inference utilising local Hebbian update rules.

Learning Dynamics and Structure of Complex Systems Using Graph Neural Networks

TLDR
This work trained graph neural networks to fit time series from an example nonlinear dynamical system (the belief propagation algorithm), and identified a ‘graph translator’ between the statistical interactions in belief propagation and the parameters of the corresponding trained network.

Control as Hybrid Inference

TLDR
This work presents an implementation of CHI which naturally mediates the balance between iterative and amortised inference, and provides a principled framework for harnessing the sample efficiency of model-based planning while retaining the asymptotic performance of model-free policy optimisation.

Stanza: A Nonlinear State Space Model for Probabilistic Inference in Non-Stationary Time Series

TLDR
Stanza strikes a balance between forecasting accuracy and probabilistic, interpretable inference for highly structured time series, achieving accuracy competitive with deep LSTMs on real-world datasets, especially for multi-step-ahead forecasting.

References

Showing 1-10 of 38 references

Composing graphical models with neural networks for structured representations and fast inference

TLDR
A general modeling and inference framework that composes probabilistic graphical models with deep learning methods and combines their respective strengths is proposed, giving a scalable algorithm that leverages stochastic variational inference, natural gradients, graphical model message passing, and the reparameterization trick.

Inference in Probabilistic Graphical Models by Graph Neural Networks

TLDR
This work uses Graph Neural Networks (GNNs) to learn a message-passing algorithm that solves inference tasks and demonstrates the efficacy of this inference approach by training GNNs on a collection of graphical models and showing that they substantially outperform belief propagation on loopy graphs.
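
To make the setup concrete, here is a minimal PyTorch sketch of learned message passing over a graphical model's edges. The layer sizes, GRU update, and binary readout are illustrative choices, not the paper's architecture.

```python
import torch
import torch.nn as nn

class GNNInference(nn.Module):
    """Learned message passing over the edges of a graphical model,
    with a readout to (here, binary) per-node marginal estimates."""
    def __init__(self, node_dim=16, msg_dim=16, n_iters=10):
        super().__init__()
        self.message = nn.Sequential(
            nn.Linear(2 * node_dim, msg_dim), nn.ReLU(),
            nn.Linear(msg_dim, msg_dim))
        self.update = nn.GRUCell(msg_dim, node_dim)
        self.readout = nn.Linear(node_dim, 2)
        self.n_iters = n_iters

    def forward(self, h, edges):
        # h: (n_nodes, node_dim) initial node states
        # edges: (n_edges, 2) long tensor of (src, dst) index pairs
        src, dst = edges[:, 0], edges[:, 1]
        for _ in range(self.n_iters):
            m = self.message(torch.cat([h[src], h[dst]], dim=-1))
            agg = h.new_zeros(h.size(0), m.size(1)).index_add_(0, dst, m)
            h = self.update(agg, h)
        return self.readout(h).log_softmax(dim=-1)  # log-marginals per node
```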

Auto-Encoding Variational Bayes

TLDR
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
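
For reference, the reparameterization trick at the heart of this paper fits in a few lines of PyTorch. This is a bare-bones VAE with a Bernoulli decoder; the dimensions and architecture are illustrative.

```python
import torch
import torch.nn as nn

class VAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=20, h_dim=200):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):  # x: (B, x_dim), values in [0, 1]
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        recon = nn.functional.binary_cross_entropy_with_logits(
            self.dec(z), x, reduction='sum')
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return (recon + kl) / x.size(0)  # negative ELBO per example
```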

Iterative Amortized Inference

TLDR
This work proposes iterative inference models, which learn to perform inference optimization through repeatedly encoding gradients, and demonstrates the inference optimization capabilities of these models and shows that they outperform standard inference models on several benchmark data sets of images and text.
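
The mechanism can be sketched as follows: rather than predicting the variational parameters in a single encoder pass, a learned update network repeatedly consumes the current parameters together with the gradient of the objective. Here `neg_elbo` and `update_net` are hypothetical stand-ins for the model's objective and update network.

```python
import torch

def iterative_amortized_inference(x, neg_elbo, update_net, z_dim, n_steps=5):
    """Refine variational parameters lam by encoding their gradients,
    instead of predicting them in one shot."""
    lam = torch.zeros(x.size(0), z_dim, requires_grad=True)
    for _ in range(n_steps):
        loss = neg_elbo(x, lam).sum()
        grad, = torch.autograd.grad(loss, lam, create_graph=True)
        lam = lam + update_net(torch.cat([lam, grad], dim=-1))
    return lam
```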

Backprop KF: Learning Discriminative Deterministic State Estimators

TLDR
This work presents an alternative approach where the parameters of the latent state distribution are directly optimized as a deterministic computation graph, resulting in a simple and effective gradient descent algorithm for training discriminative state estimators.
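
The key observation is that the filter is just differentiable tensor algebra. A single Kalman step in PyTorch is shown below; gradients of a downstream discriminative loss can flow through it into upstream networks that produce, say, the observation y or its noise covariance R. This is a sketch in the spirit of the paper, not its exact architecture.

```python
import torch

def kalman_step(mu, Sigma, y, F, H, Q, R):
    """One differentiable Kalman predict-and-update step."""
    # predict
    mu_p = F @ mu
    Sigma_p = F @ Sigma @ F.T + Q
    # update
    S = H @ Sigma_p @ H.T + R                # innovation covariance
    K = Sigma_p @ H.T @ torch.linalg.inv(S)  # Kalman gain
    mu_new = mu_p + K @ (y - H @ mu_p)
    Sigma_new = (torch.eye(mu.size(0)) - K @ H) @ Sigma_p
    return mu_new, Sigma_new
```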

Recurrent Inference Machines for Solving Inverse Problems

TLDR
This work proposes a learning framework, called Recurrent Inference Machines (RIM), that turns algorithm construction the other way round: given data and a task, an RNN is trained to learn an inference algorithm.
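
A minimal sketch of the RIM update: a recurrent cell consumes the current estimate together with the gradient of a data-fidelity term and emits an additive refinement. Sizes are illustrative.

```python
import torch
import torch.nn as nn

class RIMCell(nn.Module):
    """One Recurrent Inference Machine step: (estimate, gradient) in,
    additive update out."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.rnn = nn.GRUCell(2 * dim, hidden)
        self.out = nn.Linear(hidden, dim)

    def forward(self, x_est, grad, h):
        h = self.rnn(torch.cat([x_est, grad], dim=-1), h)
        return x_est + self.out(h), h
```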

Structure Inference Machines: Recurrent Neural Networks for Analyzing Relations in Group Activity Recognition

TLDR
A method to integrate graphical models and deep neural networks into a joint framework that uses sequential inference modeled by a recurrent neural network and demonstrates the potential of this model to handle highly structured learning tasks.

Convolutional Pose Machines

TLDR
This work designs a sequential architecture composed of convolutional networks that directly operate on belief maps from previous stages, producing increasingly refined estimates for part locations, without the need for explicit graphical model-style inference in structured prediction tasks such as articulated pose estimation.

Long Short-Term Memory Kalman Filters: Recurrent Neural Estimators for Pose Regularization

TLDR
This work proposes to learn rich, dynamic representations of the motion and noise models from data using long short-term memory, which allows representations that depend on all previous observations and all previous states.
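
As an illustration of the idea, an LSTM can emit a time-varying (here diagonal) observation-noise covariance for the Kalman update in place of a hand-tuned constant R. This is a sketch; the paper's full model also learns the motion model and further noise terms.

```python
import torch
import torch.nn as nn

class NoiseLSTM(nn.Module):
    """Map a measurement stream to per-step diagonal observation-noise
    covariances for a Kalman filter."""
    def __init__(self, m_dim, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(m_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, m_dim)

    def forward(self, y_seq):  # y_seq: (B, T, m)
        h, _ = self.lstm(y_seq)
        return torch.diag_embed(self.head(h).exp())  # (B, T, m, m), SPD
```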

Markov Chain Monte Carlo and Variational Inference: Bridging the Gap

TLDR
A new synthesis of variational inference and Monte Carlo methods where one or more steps of MCMC are incorporated into the authors' variational approximation, resulting in a rich class of inference algorithms bridging the gap between variational methods and MCMC.
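
The core idea can be sketched by refining a reparameterized draw from q(z|x) with a few stochastic transition steps toward p(x, z), here unadjusted Langevin dynamics. This shows only the mechanism; the paper's tractable lower bound also involves auxiliary reverse-model terms that are omitted here.

```python
import torch

def refine_with_langevin(z, log_joint, step=1e-2, n_steps=5):
    """Apply n_steps of unadjusted Langevin dynamics to z:
    z' = z + (step / 2) * grad log p(x, z) + sqrt(step) * noise."""
    for _ in range(n_steps):
        z = z.detach().requires_grad_(True)
        grad, = torch.autograd.grad(log_joint(z).sum(), z)
        z = z + 0.5 * step * grad + step ** 0.5 * torch.randn_like(z)
    return z
```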