Corpus ID: 4622742

Neural Autoregressive Flows

@article{Huang2018NeuralAF,
  title={Neural Autoregressive Flows},
  author={Chin-Wei Huang and David Krueger and Alexandre Lacoste and Aaron C. Courville},
  journal={ArXiv},
  year={2018},
  volume={abs/1804.00779}
}
Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time, via Inverse Autoregressive Flows (IAF). […] Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST.
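To make the core idea concrete, below is a minimal NumPy sketch of a deep-sigmoidal-flow-style monotonic transformer, the kind of invertible univariate transformation NAF substitutes for MAF/IAF's affine one. The 8 sigmoid units and random parameters are illustrative assumptions; in NAF the parameters (a, b, w) would be produced by a conditioner network from the preceding dimensions.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def logit(t):
    return np.log(t) - np.log1p(-t)

def dsf_transformer(x, a, b, w):
    """Monotonic univariate map y = logit(sum_i w_i * sigmoid(a_i * x + b_i)).

    Monotonicity holds because a_i > 0 and the w_i are positive and sum to 1,
    so dy/dx > 0 for every x.
    """
    a = np.exp(a)                      # enforce a_i > 0
    w = np.exp(w) / np.exp(w).sum()    # enforce w on the probability simplex
    s = np.dot(w, sigmoid(a * x + b))  # convex combination of sigmoids, in (0, 1)
    return logit(s)

# Toy check: the transform is strictly increasing in x.
rng = np.random.default_rng(0)
a, b, w = rng.normal(size=8), rng.normal(size=8), rng.normal(size=8)
xs = np.linspace(-3, 3, 7)
ys = np.array([dsf_transformer(x, a, b, w) for x in xs])
assert np.all(np.diff(ys) > 0)
```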
Block Neural Autoregressive Flow
Normalising flows (NFs) map two density functions via a differentiable bijection whose Jacobian determinant can be computed efficiently. Recently, as an alternative to hand-crafted bijections, Huang …
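For context, the role of that Jacobian determinant is the standard change-of-variables identity that NAF, BNAF, and the other flows listed here all rely on; for a bijection f mapping data x to base variables z:

```latex
\log p_X(x) \;=\; \log p_Z\!\bigl(f(x)\bigr) \;+\; \log\left|\det \frac{\partial f(x)}{\partial x}\right|
```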
Quasi-Autoregressive Residual (QuAR) Flows
TLDR
This paper introduces a simplification to residual flows using a Quasi-Autoregressive (QuAR) approach, which retains many of the benefits of residual flows while dramatically reducing the compute time and memory requirements, thus making flow-based modeling approaches far more tractable and broadening their potential applicability.
Data-driven Estimation of Background Distribution through Neural Autoregressive Flows
TLDR
A general and automatic data-driven method for estimating the shape of a background distribution is presented, using neural autoregressive flows (NAF), a deep generative learning method; it is demonstrated that the prediction of the ABCDnn method is close to the optimal case while having smaller statistical uncertainty.
Unconstrained Monotonic Neural Networks
TLDR
This work proposes the Unconstrained Monotonic Neural Network (UMNN) architecture based on the insight that a function is monotonic as long as its derivative is strictly positive and demonstrates the ability of UMNNs to improve variational inference.
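The monotonicity-from-positive-derivative insight can be sketched directly: parameterize the derivative with anything constrained to be positive and integrate it numerically. A minimal NumPy illustration; the quadratic stand-in "network" and the trapezoidal integration are simplifying assumptions, not the UMNN architecture.

```python
import numpy as np

def positive_derivative(t):
    """Stand-in for an unconstrained network whose output is made strictly positive."""
    raw = 0.5 * t**2 - t + 0.3           # any function a small net could produce
    return np.log1p(np.exp(raw))         # softplus keeps the derivative > 0 everywhere

def monotonic_transform(x, offset=0.0, n_steps=200):
    """f(x) = offset + integral_0^x g(t) dt with g > 0, so f is strictly increasing."""
    ts = np.linspace(0.0, x, n_steps)
    gs = positive_derivative(ts)
    return offset + np.sum((ts[1:] - ts[:-1]) * (gs[1:] + gs[:-1]) / 2.0)  # trapezoid rule

xs = np.linspace(-2.0, 2.0, 9)
ys = np.array([monotonic_transform(x) for x in xs])
assert np.all(np.diff(ys) > 0)           # monotonic by construction
```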
Towards Recurrent Autoregressive Flow Models
TLDR
This work presents Recurrent Autoregressive Flows as a method toward general stochastic process modeling with normalizing flows, and gives an initial design for a recurrent flow cell along with a method to train the model to match observed empirical distributions.
Causal Autoregressive Flows
TLDR
This work highlights an intrinsic correspondence between a simple family of flows and identifiable causal models, and derives a bivariate measure of causal direction based on likelihood ratios, leveraging the fact that flow models estimate normalized log-densities of data.
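Stated schematically, the likelihood-ratio measure amounts to fitting a flow under each candidate causal ordering and comparing the resulting log-densities on held-out data; the notation below is an illustration, not the paper's exact formulation.

```latex
R \;=\; \log p_{X \to Y}(x, y) \;-\; \log p_{Y \to X}(x, y),
\qquad \text{infer } X \to Y \ \text{ if } R > 0 .
```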
Latent Normalizing Flows for Discrete Sequences
TLDR
A VAE-based generative model is proposed which jointly learns a normalizing flow-based distribution in the latent space and a stochastic mapping to an observed discrete space in this setting, finding that it is crucial for the flow-based distribution to be highly multimodal.
Cubic-Spline Flows
TLDR
This work stacks a new coupling transform, based on monotonic cubic splines, with LU-decomposed linear layers, which retains an exact one-pass inverse, can be used to generate high-quality images, and closes the gap with autoregressive flows on a suite of density-estimation tasks.
Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design
TLDR
Flow++ is proposed, a new flow-based model that is now the state-of-the-art non-autoregressive model for unconditional density estimation on standard image benchmarks, and has begun to close the significant performance gap that has so far existed between autoregressive models and flow-based models.
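Variational dequantization replaces the uniform noise of standard dequantization with a learned noise distribution q(u|x) and trains the flow on the resulting lower bound; stated schematically for D-dimensional discrete data x:

```latex
\log P_{\text{model}}(x)
\;=\; \log \int_{[0,1)^D} p_{\text{model}}(x + u)\, du
\;\ge\; \mathbb{E}_{u \sim q(\cdot \mid x)}\!\left[\log p_{\text{model}}(x + u) - \log q(u \mid x)\right]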
Stochastic Neural Network with Kronecker Flow
TLDR
This work presents the Kronecker Flow, a generalization of the Kronecker product to invertible mappings designed for stochastic neural networks, and applies this method to variational Bayesian neural networks on predictive tasks, PAC-Bayes generalization bound estimation, and approximate Thompson sampling in contextual bandits.
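The appeal of Kronecker-structured invertible linear maps is that their log-determinant factorizes over the small factors. The Kronecker Flow goes beyond this plain linear case; the NumPy snippet below (with arbitrary factor sizes) only verifies the determinant identity that makes the structure cheap.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 4
A = rng.normal(size=(m, m))
B = rng.normal(size=(n, n))

# det(A ⊗ B) = det(A)^n * det(B)^m, so the log|det| of an (mn x mn) linear map
# costs two small determinants instead of one large one.
W = np.kron(A, B)
_, logdet_full = np.linalg.slogdet(W)
_, logdet_A = np.linalg.slogdet(A)
_, logdet_B = np.linalg.slogdet(B)
assert np.isclose(logdet_full, n * logdet_A + m * logdet_B)
```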
...

References

SHOWING 1-10 OF 44 REFERENCES
Improved Variational Inference with Inverse Autoregressive Flow
TLDR
A new type of normalizing flow, inverse autoregressive flow (IAF), is proposed that, in contrast to earlier published flows, scales well to high-dimensional latent spaces and significantly improves upon diagonal Gaussian approximate posteriors.
Transformation Autoregressive Networks
TLDR
This work attempts to systematically characterize methods for density estimation, proposes multiple novel methods to model non-Markovian dependencies, and introduces a novel data-driven framework for learning a family of distributions.
Masked Autoregressive Flow for Density Estimation
TLDR
This work describes an approach for increasing the flexibility of an autoregressive model, based on modelling the random numbers that the model uses internally when generating data, which is called Masked Autoregressive Flow.
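The "random numbers the model uses internally" are the base noise of an affine autoregressive transform; inverting that transform gives an exact density. A minimal NumPy sketch follows, where `conditioner` is a hypothetical stand-in for MAF's MADE-style masked network.

```python
import numpy as np

def conditioner(x_prev):
    """Hypothetical stand-in for MAF's masked network: returns (mu, log_sigma)
    for the current dimension given all previous dimensions."""
    if x_prev.size == 0:
        return 0.0, 0.0
    return 0.5 * x_prev.sum(), 0.1 * np.tanh(x_prev.mean())

def maf_log_density(x):
    """log p(x) = log N(u; 0, I) - sum_i log sigma_i, with u_i = (x_i - mu_i) / sigma_i."""
    log_det = 0.0
    u = np.empty_like(x)
    for i in range(len(x)):
        mu, log_sigma = conditioner(x[:i])
        u[i] = (x[i] - mu) * np.exp(-log_sigma)
        log_det -= log_sigma                     # accumulate log|du_i/dx_i|
    base_logp = -0.5 * np.sum(u**2) - 0.5 * len(x) * np.log(2 * np.pi)
    return base_logp + log_det

print(maf_log_density(np.array([0.3, -1.2, 0.7])))
```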
MADE: Masked Autoencoder for Distribution Estimation
TLDR
This work introduces a simple modification for autoencoder neural networks that yields powerful generative models and proves that this approach is competitive with state-of-the-art tractable distribution estimators.
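MADE's modification is a set of binary masks on the autoencoder's weights that enforce the autoregressive property. A small NumPy sketch of the degree-based mask construction for a single hidden layer; the layer sizes are illustrative.

```python
import numpy as np

def made_masks(n_inputs, n_hidden, seed=0):
    """Build masks so that output unit i depends only on inputs with index < i."""
    rng = np.random.default_rng(seed)
    deg_in = np.arange(1, n_inputs + 1)                        # input degrees 1..D
    deg_hidden = rng.integers(1, n_inputs, size=n_hidden)      # hidden degrees in [1, D-1]
    mask_in = (deg_hidden[:, None] >= deg_in[None, :]).astype(float)   # hidden x input
    mask_out = (deg_in[:, None] > deg_hidden[None, :]).astype(float)   # output x hidden
    return mask_in, mask_out

mask_in, mask_out = made_masks(n_inputs=4, n_hidden=6)
# The product mask shows which inputs each output can see: strictly lower triangular.
connectivity = mask_out @ mask_in
assert np.all(np.triu(connectivity) == 0)   # output i never sees input j >= i
```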
Improving Variational Auto-Encoders using convex combination linear Inverse Autoregressive Flow
TLDR
The idea is to enrich a linear Inverse Autoregressive Flow by introducing multiple lower-triangular matrices with ones on the diagonal and combining them using a convex combination, and it is shown to perform similarly to a general linear normalizing flow.
Improving Variational Auto-Encoders using Householder Flow
TLDR
This paper proposes a volume-preserving flow that uses a series of Householder transformations, which allows obtaining a more flexible variational posterior and yields competitive results compared to other normalizing flows.
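A Householder transformation is orthogonal, so its Jacobian determinant has absolute value one, which is what makes the flow volume-preserving. A quick NumPy illustration of that property (a sketch, not the paper's full posterior construction):

```python
import numpy as np

def householder(z, v):
    """Apply H z with H = I - 2 v v^T / ||v||^2; H is orthogonal, so |det H| = 1."""
    v = v / np.linalg.norm(v)
    return z - 2.0 * v * np.dot(v, z)

rng = np.random.default_rng(0)
z, v = rng.normal(size=5), rng.normal(size=5)
H = np.eye(5) - 2.0 * np.outer(v, v) / np.dot(v, v)
assert np.isclose(abs(np.linalg.det(H)), 1.0)     # volume preserving
assert np.allclose(householder(z, v), H @ z)      # vector form matches matrix form
```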
WaveNet: A Generative Model for Raw Audio
TLDR
WaveNet, a deep neural network for generating raw audio waveforms, is introduced; it is shown that it can be efficiently trained on data with tens of thousands of samples per second of audio, and can be employed as a discriminative model, returning promising results for phoneme recognition.
Multiplicative Normalizing Flows for Variational Bayesian Neural Networks
We reinterpret multiplicative noise in neural networks as auxiliary random variables that augment the approximate posterior in a variational setting for Bayesian neural networks. We show that through …
The Neural Autoregressive Distribution Estimator
TLDR
A new approach for modeling the distribution of high-dimensional vectors of discrete variables inspired by the restricted Boltzmann machine, which outperforms other multivariate binary distribution estimators on several datasets and performs similarly to a large (but intractable) RBM.
Learnable Explicit Density for Continuous Latent Space and Variational Inference
TLDR
This work decomposes the learning of VAEs into layerwise density estimation, argues that having a flexible prior is beneficial to both sample generation and inference, and analyzes the family of inverse autoregressive flows (inverse AF), showing that with further improvement, inverse AF could be used as a universal approximator to any complicated posterior.
...