Corpus ID: 7166013

Masked Autoregressive Flow for Density Estimation

@article{Papamakarios2017MaskedAF,
  title={Masked Autoregressive Flow for Density Estimation},
  author={George Papamakarios and Iain Murray and Theo Pavlakou},
  journal={ArXiv},
  year={2017},
  volume={abs/1705.07057}
}
Autoregressive models are among the best performing neural density estimators. […] Masked Autoregressive Flow achieves state-of-the-art performance in a range of general-purpose density estimation tasks.
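The abstract's claim can be made concrete with a minimal sketch of the affine autoregressive transform that MAF stacks. The strictly lower-triangular linear conditioners below are toy stand-ins for the MADE networks the paper actually uses; the names `W_mu`, `W_alpha`, and `maf_log_prob` are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 3
# Strictly lower-triangular maps, so output i depends only on x_{<i}.
# A real MAF would use a MADE network here (this is an assumption for brevity).
W_mu = np.tril(rng.normal(size=(D, D)), k=-1)
W_alpha = np.tril(rng.normal(size=(D, D)), k=-1)

def maf_log_prob(x):
    """log p(x) under one affine autoregressive layer with a standard-normal base."""
    mu = W_mu @ x
    alpha = W_alpha @ x
    u = (x - mu) * np.exp(-alpha)            # density evaluation: one parallel pass
    log_base = -0.5 * np.sum(u**2) - 0.5 * D * np.log(2 * np.pi)
    return log_base + np.sum(-alpha)         # triangular Jacobian: log|det| = -sum(alpha)

x = rng.normal(size=D)
lp = maf_log_prob(x)
```

Because the Jacobian of the transform is triangular, the log-determinant reduces to `-alpha.sum()`, which is what makes exact density evaluation cheap; sampling, by contrast, requires D sequential steps.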


Cubic-Spline Flows
TLDR
This work stacks a new coupling transform, based on monotonic cubic splines, with LU-decomposed linear layers, which retains an exact one-pass inverse, can be used to generate high-quality images, and closes the gap with autoregressive flows on a suite of density-estimation tasks.
Autoregressive Energy Machines
TLDR
The Autoregressive Energy Machine is proposed, an energy-based model which simultaneously learns an unnormalized density and computes an importance-sampling estimate of the normalizing constant for each conditional in an autoregressive decomposition, and achieves state-of-the-art performance on a suite of density-estimation tasks.
Autoregressive Quantile Flows for Predictive Uncertainty Estimation
TLDR
Autoregressive Quantile Flows are instances of autoregressive flows trained using a novel objective based on proper scoring rules, which simplifies the calculation of computationally expensive determinants of Jacobians during training and supports new types of neural architectures.
Neural Autoregressive Flows
TLDR
It is demonstrated that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions.
A Triangular Network For Density Estimation
TLDR
This work reports a triangular neural network implementation of neural autoregressive flow that achieves state-of-the-art bits-per-dimension scores on MNIST and CIFAR-10 and falls into the category of general-purpose density estimators.
Towards Recurrent Autoregressive Flow Models
TLDR
This work presents Recurrent Autoregressive Flows as a method toward general stochastic process modeling with normalizing flows, along with an initial design for a recurrent flow cell and a method to train the model to match observed empirical distributions.
Quasi-Autoregressive Residual (QuAR) Flows
TLDR
This paper introduces a simplification to residual flows using a Quasi-Autoregressive (QuAR) approach, which retains many of the benefits of residual flows while dramatically reducing the compute time and memory requirements, thus making flow-based modeling approaches far more tractable and broadening their potential applicability.
Improving sequential latent variable models with autoregressive flows
TLDR
This technique provides a simple, general-purpose method for improving sequence modeling, with connections to existing and classical techniques, and demonstrates the decorrelation and improved generalization properties of using flow-based dynamics.
Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows
TLDR
It is shown that SNL is more robust, more accurate and requires less tuning than related neural-based methods, and diagnostics for assessing calibration, convergence and goodness-of-fit are discussed.
Probabilistic Time Series Forecasts with Autoregressive Transformation Models
TLDR
This paper proposes Autoregressive Transformation Models (ATMs), a model class inspired by various research directions that unites expressive distributional forecasts, via a semi-parametric distribution assumption, with an interpretable model specification, and allows for uncertainty quantification based on (asymptotic) Maximum Likelihood theory.

References

SHOWING 1-10 OF 47 REFERENCES
Neural Autoregressive Distribution Estimation
We present Neural Autoregressive Distribution Estimation (NADE) models, which are neural network architectures applied to the problem of unsupervised distribution and density estimation.
Improved Variational Inference with Inverse Autoregressive Flow
TLDR
A new type of normalizing flow, inverse autoregressive flow (IAF), is proposed that, in contrast to earlier published flows, scales well to high-dimensional latent spaces and significantly improves upon diagonal Gaussian approximate posteriors.
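The contrast with MAF can be sketched: IAF conditions on the base noise rather than the data, so sampling is a single parallel pass (while evaluating the density of an arbitrary point would require a sequential inverse). The linear conditioners `W_m` and `W_s` below are toy stand-ins for the MADE networks used in practice; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 3
# Strictly lower-triangular conditioners in u: toy replacements for MADE (assumption).
W_m = np.tril(rng.normal(size=(D, D)), k=-1)
W_s = np.tril(rng.normal(size=(D, D)), k=-1)

def iaf_sample(u):
    """One parallel pass from base noise u to sample x: fast sampling."""
    m = W_m @ u
    s = W_s @ u
    x = u * np.exp(s) + m
    log_det = np.sum(s)       # log|det dx/du|, used to track the sample's density
    return x, log_det

u = rng.normal(size=D)
x, ld = iaf_sample(u)
```

This one-pass sampling is why IAF suits variational inference, where densities are only ever needed at the flow's own samples.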
RNADE: The real-valued neural autoregressive density-estimator
We introduce RNADE, a new model for joint density estimation of real-valued vectors. Our model calculates the density of a datapoint as the product of one-dimensional conditionals.
A Deep and Tractable Density Estimator
TLDR
This work introduces an efficient procedure to simultaneously train a NADE model for each possible ordering of the variables, by sharing parameters across all these models.
Inference Networks for Sequential Monte Carlo in Graphical Models
TLDR
A procedure for constructing and learning a structured neural network which represents an inverse factorization of the graphical model, resulting in a conditional density estimator that takes as input particular values of the observed random variables, and returns an approximation to the distribution of the latent variables.
Stochastic Backpropagation and Approximate Inference in Deep Generative Models
We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference.
Density estimation using Real NVP
TLDR
This work extends the space of probabilistic models using real-valued non-volume preserving (real NVP) transformations, a set of powerful invertible and learnable transformations, resulting in an unsupervised learning algorithm with exact log-likelihood computation, exact sampling, exact inference of latent variables, and an interpretable latent space.
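The "exact sampling" and exact inverse in this TLDR come from real NVP's affine coupling layers, sketched below. The half-split and the scale/shift structure are the standard coupling construction; the specific functions `s` and `t` are toy stand-ins for the paper's deep networks.

```python
import numpy as np

def coupling_forward(x, s, t):
    """Affine coupling: identity on the first half, affine map on the second."""
    d = len(x) // 2
    x1, x2 = x[:d], x[d:]
    y = np.concatenate([x1, x2 * np.exp(s(x1)) + t(x1)])
    log_det = np.sum(s(x1))   # Jacobian is triangular with diag exp(s(x1))
    return y, log_det

def coupling_inverse(y, s, t):
    """Exact one-pass inverse: x1 passes through unchanged, so s(x1), t(x1) are known."""
    d = len(y) // 2
    y1, y2 = y[:d], y[d:]
    return np.concatenate([y1, (y2 - t(y1)) * np.exp(-y1 * 0 - s(y1))])

# Toy scale/shift functions standing in for the paper's conv nets (assumption).
s = lambda h: np.tanh(h)
t = lambda h: 0.5 * h

x = np.array([0.3, -1.2, 0.7, 2.0])
y, ld = coupling_forward(x, s, t)
x_rec = coupling_inverse(y, s, t)
```

Alternating which half is held fixed across stacked layers lets every dimension eventually be transformed, while both directions remain single-pass.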
Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation
TLDR
This work proposes a new approach to likelihood-free inference based on Bayesian conditional density estimation, which requires fewer model simulations than Monte Carlo ABC methods need to produce a single sample from an approximate posterior.
Maximum Entropy Flow Networks
TLDR
This paper learns a smooth and invertible transformation that maps a simple distribution to the desired maximum entropy distribution, and cast the maximum entropy problem into a finite-dimensional constrained optimization, and solve the problem by combining stochastic optimization with the augmented Lagrangian method.
Gaussianization
TLDR
This work proposes an iterative Gaussianization procedure which converges weakly: at each iteration, the data is first transformed to the least dependent coordinates and then each coordinate is marginally Gaussianized by univariate techniques.