# Neural Autoregressive Flows

```bibtex
@article{Huang2018NeuralAF,
  title   = {Neural Autoregressive Flows},
  author  = {Chin-Wei Huang and David Krueger and Alexandre Lacoste and Aaron C. Courville},
  journal = {ArXiv},
  year    = {2018},
  volume  = {abs/1804.00779}
}
```

Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time, via Inverse Autoregressive Flows (IAF). [...] Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST.
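The transform that MAF and IAF share can be illustrated with a minimal sketch (NumPy; names are illustrative, and in practice `mu` and `log_sigma` are produced by a masked network from the preceding dimensions). Each dimension undergoes an affine map conditioned only on earlier dimensions, so the Jacobian is triangular and the log-determinant is just the sum of the log-scales; NAF's contribution is to replace this affine map with a monotone neural network.

```python
import numpy as np

# Minimal sketch of the affine autoregressive transform underlying
# MAF/IAF (hypothetical names; mu and log_sigma stand in for the
# outputs of a masked autoregressive conditioner).
def affine_autoregressive(u, mu, log_sigma):
    x = u * np.exp(log_sigma) + mu      # elementwise affine map
    log_det = np.sum(log_sigma)         # triangular Jacobian: sum of log-scales
    return x, log_det

rng = np.random.default_rng(0)
u = rng.normal(size=3)
mu, log_sigma = rng.normal(size=3), rng.normal(size=3)
x, log_det = affine_autoregressive(u, mu, log_sigma)
```

Because the map is elementwise affine, inverting it (as IAF does for sampling, or MAF for density evaluation) requires no matrix solves.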

## 262 Citations

Block Neural Autoregressive Flow

- Computer Science · UAI
- 2019

Normalising flows (NFs) map two density functions via a differentiable bijection whose Jacobian determinant can be computed efficiently. Recently, as an alternative to hand-crafted bijections, Huang…

Quasi-Autoregressive Residual (QuAR) Flows

- Computer Science · ArXiv
- 2020

This paper introduces a simplification to residual flows using a Quasi-Autoregressive (QuAR) approach, which retains many of the benefits of residual flows while dramatically reducing the compute time and memory requirements, thus making flow-based modeling approaches far more tractable and broadening their potential applicability.

Data-driven Estimation of Background Distribution through Neural Autoregressive Flows

- Computer Science
- 2020

A general and automatic data-driven background distribution shape estimation method using neural autoregressive flows (NAF), a deep generative learning method; it is demonstrated that the prediction of the ABCDnn method is similar to the optimal case, while having smaller statistical uncertainty.

Unconstrained Monotonic Neural Networks

- Computer Science · BNAIC/BENELEARN
- 2019

This work proposes the Unconstrained Monotonic Neural Network (UMNN) architecture based on the insight that a function is monotonic as long as its derivative is strictly positive and demonstrates the ability of UMNNs to improve variational inference.
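The UMNN insight can be sketched numerically (NumPy; the positive function below is a stand-in for the unconstrained network whose output is made strictly positive, and the crude Riemann sum stands in for the Clenshaw-Curtis quadrature UMNN actually uses): integrating any strictly positive derivative yields a strictly increasing function.

```python
import numpy as np

# Sketch of the UMNN insight: a function with strictly positive
# derivative is strictly increasing, so monotonicity needs no
# architectural constraint on the derivative network itself.
def positive_derivative(t):
    return np.exp(np.sin(t))  # > 0 everywhere; stand-in for a neural net

def monotone_f(x, n_steps=1000):
    # integrate the positive derivative from 0 to x (crude Riemann sum)
    t = np.linspace(0.0, x, n_steps, endpoint=False)
    return np.sum(positive_derivative(t)) * (x / n_steps)

ys = [monotone_f(x) for x in (0.1, 0.5, 1.0, 2.0)]
```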

Towards Recurrent Autoregressive Flow Models

- Computer Science · ArXiv
- 2020

This work presents Recurrent Autoregressive Flows as a method toward general stochastic process modeling with normalizing flows and presents an initial design for a recurrent flow cell and a method to train the model to match observed empirical distributions.

Latent Normalizing Flows for Discrete Sequences

- Computer Science · ICML
- 2019

A VAE-based generative model is proposed which jointly learns a normalizing flow-based distribution in the latent space and a stochastic mapping to an observed discrete space in this setting, finding that it is crucial for the flow-based distribution to be highly multimodal.

Improving sequential latent variable models with autoregressive flows

- Computer Science · AABI
- 2019

This technique provides a simple, general-purpose method for improving sequence modeling, with connections to existing and classical techniques, and demonstrates the decorrelation and improved generalization properties of using flow-based dynamics.

Cubic-Spline Flows

- Mathematics · ICML 2019
- 2019

This work stacks a new coupling transform, based on monotonic cubic splines, with LU-decomposed linear layers, which retains an exact one-pass inverse, can be used to generate high-quality images, and closes the gap with autoregressive flows on a suite of density-estimation tasks.
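The LU-decomposed linear layers mentioned here admit a cheap log-determinant; a small NumPy sketch (illustrative only, not the paper's implementation): parameterising a weight matrix as a product of a unit lower-triangular and an upper-triangular factor reduces the log-determinant to a sum over the diagonal.

```python
import numpy as np

# Sketch of an LU-decomposed linear layer: with W = L @ U,
# L unit lower-triangular and U upper-triangular,
# log|det W| = sum(log|diag(U)|), an O(D) computation,
# and inversion needs only two triangular solves.
rng = np.random.default_rng(3)
D = 4
L = np.tril(rng.normal(size=(D, D)), k=-1) + np.eye(D)
U = np.triu(rng.normal(size=(D, D)))
W = L @ U
log_abs_det = np.sum(np.log(np.abs(np.diag(U))))
```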

Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design

- Computer Science · ICML
- 2019

Flow++ is proposed, a new flow-based model that is now the state-of-the-art non-autoregressive model for unconditional density estimation on standard image benchmarks, and has begun to close the significant performance gap that has so far existed between autoregressive models and flow-based models.

Stochastic Neural Network with Kronecker Flow

- Computer Science · AISTATS
- 2020

This work presents the Kronecker Flow, a generalization of the Kronecker product to invertible mappings designed for stochastic neural networks, and applies this method to variational Bayesian neural networks on predictive tasks, PAC-Bayes generalization bound estimation, and approximate Thompson sampling in contextual bandits.

## References

Showing 1-10 of 44 references

Improved Variational Inference with Inverse Autoregressive Flow

- Computer Science · NIPS 2016
- 2017

A new type of normalizing flow, inverse autoregressive flow (IAF), is proposed that, in contrast to earlier published flows, scales well to high-dimensional latent spaces and significantly improves upon diagonal Gaussian approximate posteriors.

Transformation Autoregressive Networks

- Computer Science · ICML
- 2018

This work attempts to systematically characterize methods for density estimation, proposes multiple novel methods to model non-Markovian dependencies, and introduces a novel data-driven framework for learning a family of distributions.

Masked Autoregressive Flow for Density Estimation

- Mathematics, Computer Science · NIPS
- 2017

This work describes an approach for increasing the flexibility of an autoregressive model, based on modelling the random numbers that the model uses internally when generating data, which is called Masked Autoregressive Flow.

MADE: Masked Autoencoder for Distribution Estimation

- Computer Science · ICML
- 2015

This work introduces a simple modification for autoencoder neural networks that yields powerful generative models and proves that this approach is competitive with state-of-the-art tractable distribution estimators.
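MADE's modification is a set of binary masks on the autoencoder's weights; a NumPy sketch (degree assignments simplified, names illustrative): each unit gets a "degree", input-to-hidden connections are kept when the hidden degree is at least the input degree, and hidden-to-output connections when the output degree is strictly greater, so output i can depend only on inputs before i.

```python
import numpy as np

# Sketch of MADE's mask construction for an autoregressive autoencoder.
rng = np.random.default_rng(0)
D, H = 4, 8
m_in = np.arange(1, D + 1)            # input (and output) degrees 1..D
m_hid = rng.integers(1, D, size=H)    # hidden degrees drawn from 1..D-1

mask_in = (m_hid[:, None] >= m_in[None, :]).astype(float)   # H x D
mask_out = (m_in[:, None] > m_hid[None, :]).astype(float)   # D x H

W1 = rng.normal(size=(H, D)) * mask_in
W2 = rng.normal(size=(D, H)) * mask_out
# connectivity of output i w.r.t. input j (abs avoids sign cancellation)
conn = (np.abs(W2) @ np.abs(W1)) > 0
```

The resulting connectivity matrix is strictly lower triangular, which is exactly the autoregressive property a single forward pass then respects.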

Improving Variational Auto-Encoders using convex combination linear Inverse Autoregressive Flow

- Computer Science
- 2017

The idea is to enrich a linear Inverse Autoregressive Flow by introducing multiple lower-triangular matrices with ones on the diagonal and combining them using a convex combination; it is shown that this performs similarly to the general linear normalizing flow.
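The closure property behind this construction is easy to verify in a NumPy sketch (illustrative, not the paper's code): a convex mixture of lower-triangular matrices with unit diagonal is again lower-triangular with unit diagonal, so its determinant stays 1 and the flow step remains volume-preserving and cheaply invertible.

```python
import numpy as np

# Sketch: convex combination of K unit lower-triangular matrices.
rng = np.random.default_rng(1)
D, K = 4, 3
Ls = [np.tril(rng.normal(size=(D, D)), k=-1) + np.eye(D) for _ in range(K)]
w = rng.random(K)
w /= w.sum()                          # convex weights, sum to 1
L = sum(wk * Lk for wk, Lk in zip(w, Ls))
```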

Improving Variational Auto-Encoders using Householder Flow

- Computer Science · ArXiv
- 2016

This paper proposes a volume-preserving flow that uses a series of Householder transformations, which allows obtaining a more flexible variational posterior, with results competitive with other normalizing flows.
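A single Householder step is simple to sketch in NumPy (illustrative names): H = I - 2vvᵀ/‖v‖² is an orthogonal reflection, so |det H| = 1, and composing such steps gives a volume-preserving map that can rotate a diagonal Gaussian posterior into one with full covariance.

```python
import numpy as np

# Sketch of one Householder reflection and a two-step composition.
def householder(v):
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

rng = np.random.default_rng(2)
Q = householder(rng.normal(size=4)) @ householder(rng.normal(size=4))
```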

WaveNet: A Generative Model for Raw Audio

- Computer Science · SSW
- 2016

WaveNet, a deep neural network for generating raw audio waveforms, is introduced; it is shown that it can be efficiently trained on data with tens of thousands of samples per second of audio, and can be employed as a discriminative model, returning promising results for phoneme recognition.

Multiplicative Normalizing Flows for Variational Bayesian Neural Networks

- Computer Science · ICML
- 2017

We reinterpret multiplicative noise in neural networks as auxiliary random variables that augment the approximate posterior in a variational setting for Bayesian neural networks. We show that through…

The Neural Autoregressive Distribution Estimator

- Computer Science · AISTATS
- 2011

A new approach for modeling the distribution of high-dimensional vectors of discrete variables inspired by the restricted Boltzmann machine, which outperforms other multivariate binary distribution estimators on several datasets and performs similarly to a large (but intractable) RBM.

Learnable Explicit Density for Continuous Latent Space and Variational Inference

- Computer Science · ArXiv
- 2017

This work decomposes the learning of VAEs into layerwise density estimation, argues that having a flexible prior is beneficial to both sample generation and inference, and analyzes the family of inverse autoregressive flows (inverse AF), showing that with further improvement, inverse AF could be used as a universal approximation to any complicated posterior.