• Corpus ID: 211258728

Predictive Sampling with Forecasting Autoregressive Models

@inproceedings{Wiggers2020PredictiveSW,
  title={Predictive Sampling with Forecasting Autoregressive Models},
  author={Auke J. Wiggers and Emiel Hoogeboom},
  booktitle={International Conference on Machine Learning},
  year={2020}
}
Autoregressive models (ARMs) currently hold state-of-the-art performance in likelihood-based modeling of image and audio data. Generally, neural network based ARMs are designed to allow fast inference, but sampling from these models is impractically slow. In this paper, we introduce the predictive sampling algorithm: a procedure that exploits the fast inference property of ARMs in order to speed up sampling, while keeping the model intact. We propose two variations of predictive sampling… 
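
The abstract is truncated here, but the core mechanism, verifying cheaply forecasted future variables with a single parallel inference pass, can be illustrated with a minimal sketch. The sketch below shows that prefix-verification idea only, not the paper's exact algorithm; logits_fn (one parallel pass returning per-position conditional logits), forecast_fn (any cheap forecaster returning a full-length guess), and the fixed per-position Gumbel noise are assumptions made for the example.

import numpy as np

def gumbel_argmax(logits, noise):
    # Sampling via the Gumbel-max trick with pre-drawn, fixed noise, so a
    # conditional sample is a deterministic function of its context.
    return int(np.argmax(logits + noise))

def predictive_sampling(logits_fn, forecast_fn, seq_len, vocab, rng):
    # logits_fn(x) -> (seq_len, vocab) conditional logits in ONE parallel pass
    # forecast_fn(x, t) -> a cheap full-length guess given the committed prefix x[:t]
    noise = rng.gumbel(size=(seq_len, vocab))   # drawn once, reused every pass
    x = np.zeros(seq_len, dtype=int)
    t = 0
    while t < seq_len:
        proposal = forecast_fn(x, t)
        proposal[:t] = x[:t]                    # the prefix is already committed
        logits = logits_fn(proposal)            # single parallel inference pass
        for i in range(t, seq_len):
            x[i] = gumbel_argmax(logits[i], noise[i])
            if x[i] != proposal[i]:             # later conditionals used a wrong context
                t = i + 1
                break
        else:
            t = seq_len
    return x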

Locally Masked Convolution for Autoregressive Models

LMConv is introduced: a simple modification to the standard 2D convolution that allows arbitrary masks to be applied to the weights at each location in the image, achieving improved performance on whole-image density estimation and globally coherent image completions.

Denoising Diffusion Probabilistic Models

High quality image synthesis results are presented using diffusion probabilistic models, a class of latent variable models inspired by considerations from nonequilibrium thermodynamics, which naturally admit a progressive lossy decompression scheme that can be interpreted as a generalization of autoregressive decoding.
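
As a minimal sketch of the diffusion mechanics mentioned above (the noise schedule values are a commonly used linear choice, assumed here for illustration), the forward process admits a closed form q(x_t | x_0), so noisy training targets can be drawn at any timestep in one shot:

import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)        # a commonly used linear noise schedule
alpha_bar = np.cumprod(1.0 - betas)       # cumulative signal-retention factor

def q_sample(x0, t, rng):
    # Draw x_t ~ q(x_t | x_0) = N(sqrt(alpha_bar_t) x_0, (1 - alpha_bar_t) I)
    # directly, without simulating t noising steps.
    eps = rng.standard_normal(x0.shape)
    x_t = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
    return x_t, eps                       # training regresses eps from (x_t, t)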

Parallelized Rate-Distortion Optimized Quantization Using Deep Learning

This work trains two classes of neural networks, a fully-convolutional network and an auto-regressive network, and evaluates each as a post-quantization step designed to refine cheap quantization schemes such as scalar quantization (SQ).

The Convolution Exponential and Generalized Sylvester Flows

A new method to build linear flows, by taking the exponential of a linear transformation, which outperforms other linear transformations in generative flows on CIFAR10 and the graph convolution exponential improves the performance of graph normalizing flows.
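
A small numerical sketch of the underlying linear-algebra fact (using a dense matrix as a stand-in for the convolution): exp(M) is always invertible with inverse exp(-M), and its log-determinant equals trace(M), so M itself never needs to be invertible.

import numpy as np

def expm_series(M, terms=30):
    # Truncated power series exp(M) = I + M + M^2/2! + ... (stand-in for a
    # matrix-free version applied via repeated convolution).
    out, term = np.eye(M.shape[0]), np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

rng = np.random.default_rng(0)
M = rng.normal(scale=0.3, size=(4, 4))                          # need not be invertible
E = expm_series(M)
print(np.allclose(expm_series(-M) @ E, np.eye(4), atol=1e-6))   # True: exp(-M) inverts exp(M)
print(np.isclose(np.log(np.linalg.det(E)), np.trace(M)))        # True: log|det| = trace(M)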

A Review of the Gumbel-max Trick and its Extensions for Discrete Stochasticity in Machine Learning

This survey article presents background on the Gumbel-max trick, provides a structured overview of its extensions to ease algorithm selection, and gives a comprehensive outline of the machine learning literature in which Gumbel-based algorithms have been leveraged.
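
For context, the trick itself is one line: adding i.i.d. Gumbel(0, 1) noise to (unnormalized) log-probabilities and taking the argmax produces an exact sample from the corresponding categorical distribution. A minimal sketch:

import numpy as np

rng = np.random.default_rng(0)
logits = np.log(np.array([0.1, 0.2, 0.7]))        # unnormalized log-probabilities

def gumbel_max_sample(logits, rng):
    g = rng.gumbel(size=logits.shape)             # i.i.d. Gumbel(0, 1) noise
    return int(np.argmax(logits + g))

samples = [gumbel_max_sample(logits, rng) for _ in range(100_000)]
print(np.bincount(samples, minlength=3) / len(samples))   # approx. [0.1, 0.2, 0.7]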

Auto-regressive Image Synthesis with Integrated Quantization

This paper designs an integrated quantization scheme with a variational regularizer that mingles the feature discretization in multiple domains and markedly boosts auto-regressive modeling performance, as well as a Gumbel sampling strategy that allows distribution uncertainty to be incorporated into the auto-regressive training procedure.

ELF: Exact-Lipschitz Based Universal Density Approximator Flow

A new Exact-Lipschitz Flow (ELF) is introduced that combines the ease of sampling from residual flows with the strong performance of autoregressive flows, achieving state-of-the-art performance on multiple large-scale datasets.

PixelPyramids: Exact Inference Models from Lossless Image Pyramids

  • Shweta Mahajan, S. Roth
  • Computer Science
  • 2021 IEEE/CVF International Conference on Computer Vision (ICCV)
  • 2021
Autoregressive models are a class of exact inference approaches with highly flexible functional forms, yielding state-of-the-art density estimates for natural images. Yet, the sequential ordering on…

Autoregressive Diffusion Models

Autoregressive Diffusion Models (ARDMs) are introduced, a model class encompassing and generalizing order-agnostic autoregressive models (Uria et al.) and absorbing discrete diffusion, which are shown to be special cases of ARDMs under mild assumptions.

References

SHOWING 1-10 OF 34 REFERENCES

PixelSNAIL: An Improved Autoregressive Generative Model

This work introduces a new generative model architecture that combines causal convolutions with self attention and presents state-of-the-art log-likelihood results on CIFAR-10 and ImageNet.

Parallel Multiscale Autoregressive Density Estimation

This work proposes a parallelized PixelCNN that allows more efficient inference by modeling certain pixel groups as conditionally independent, achieving competitive density estimation and an orders-of-magnitude speedup (O(log N) sampling instead of O(N)), enabling the practical generation of 512x512 images.

MADE: Masked Autoencoder for Distribution Estimation

This work introduces a simple modification for autoencoder neural networks that yields powerful generative models and demonstrates that this approach is competitive with state-of-the-art tractable distribution estimators.
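
The "simple modification" is a set of binary masks on the weight matrices chosen so that each output depends only on preceding inputs. A minimal sketch of the mask construction for a single hidden layer (the degree assignment shown is one common choice, not the only one):

import numpy as np

def made_masks(n_in, n_hidden, rng):
    degrees_in = np.arange(1, n_in + 1)                       # natural input ordering
    degrees_h = rng.integers(1, n_in, size=n_hidden)          # hidden degrees in {1, ..., n_in - 1}
    mask_in = (degrees_h[:, None] >= degrees_in[None, :])     # hidden unit k sees inputs <= m_k
    mask_out = (degrees_in[:, None] > degrees_h[None, :])     # output d sees hidden units with m_k < d
    return mask_in.astype(float), mask_out.astype(float)

rng = np.random.default_rng(0)
m_in, m_out = made_masks(n_in=5, n_hidden=8, rng=rng)
# The product of masks shows which inputs can influence each output:
print((m_out @ m_in) > 0)   # connectivity is strictly lower-triangular (autoregressive)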

Neural Discrete Representation Learning

Pairing these representations with an autoregressive prior, the model can generate high quality images, videos, and speech as well as doing high quality speaker conversion and unsupervised learning of phonemes, providing further evidence of the utility of the learnt representations.
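
The discrete representations referred to above come from a nearest-neighbour lookup into a learned codebook. A minimal sketch of that quantization step (sizes are arbitrary, and the straight-through gradient trick used during training is omitted):

import numpy as np

rng = np.random.default_rng(0)
codebook = rng.normal(size=(512, 64))       # K embeddings of dimension 64
z_e = rng.normal(size=(100, 64))            # encoder outputs for 100 positions

dists = ((z_e[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
indices = dists.argmin(axis=1)              # discrete codes, modeled by an autoregressive prior
z_q = codebook[indices]                     # quantized latents passed to the decoder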

Fast Generation for Convolutional Autoregressive Models

This work describes a method to speed up generation in convolutional autoregressive models by caching hidden states to avoid redundant computation, and applies it to the WaveNet and PixelCNN++ models.
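
A minimal sketch of the caching idea for a single dilated causal 1-D convolution (a simplified scalar-channel stand-in, not the paper's implementation): a rolling buffer of past inputs makes each generation step cost O(kernel size) rather than O(sequence length).

import numpy as np

class CachedCausalConv1D:
    def __init__(self, weights, dilation=1):
        self.w = np.asarray(weights)                    # filter taps, oldest first
        self.dilation = dilation
        size = (len(self.w) - 1) * dilation + 1
        self.buf = np.zeros(size)                       # cache of recent inputs, oldest first

    def step(self, x_t):
        # Push the newest input and read only the dilated taps from the cache.
        self.buf = np.roll(self.buf, -1)
        self.buf[-1] = x_t
        return float(self.w @ self.buf[::self.dilation])

conv = CachedCausalConv1D(weights=[0.5, -0.2, 0.1], dilation=2)
outputs = [conv.step(x) for x in [1.0, 0.0, 2.0, 1.0]]   # one cheap step per generated sample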

The Neural Autoregressive Distribution Estimator

A new approach for modeling the distribution of high-dimensional vectors of discrete variables inspired by the restricted Boltzmann machine, which outperforms other multivariate binary distribution estimators on several datasets and performs similarly to a large (but intractable) RBM.
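
NADE's key computational idea is weight sharing across the conditionals: the hidden activation for step d is an incremental update of the one for step d-1, so all D conditionals of a binary vector cost O(DH) to evaluate. A minimal sketch:

import numpy as np

def nade_log_prob(x, W, V, b, c):
    # x: binary vector of length D; W: (H, D), V: (D, H), b: (D,), c: (H,)
    a = c.copy()
    logp = 0.0
    for d in range(len(x)):
        h = 1.0 / (1.0 + np.exp(-a))                   # hidden units for step d
        p = 1.0 / (1.0 + np.exp(-(b[d] + V[d] @ h)))   # p(x_d = 1 | x_<d)
        logp += np.log(p if x[d] == 1 else 1.0 - p)
        a += W[:, d] * x[d]                            # incremental update, the key trick
    return logp

rng = np.random.default_rng(0)
D, H = 6, 4
W, V = rng.normal(size=(H, D)), rng.normal(size=(D, H))
b, c = rng.normal(size=D), rng.normal(size=H)
print(nade_log_prob(rng.integers(0, 2, size=D), W, V, b, c))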

Density estimation using Real NVP

This work extends the space of probabilistic models using real-valued non-volume preserving (real NVP) transformations, a set of powerful invertible and learnable transformations, resulting in an unsupervised learning algorithm with exact log-likelihood computation, exact sampling, exact inference of latent variables, and an interpretable latent space.
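
The exact inversion and log-likelihood come from coupling layers with a triangular Jacobian. A minimal sketch of one affine coupling step, where scale_and_shift stands in for the learned networks:

import numpy as np

def scale_and_shift(x1):
    # Stand-in for a neural network; any function of x1 keeps the layer invertible.
    return np.tanh(x1), 0.5 * x1

def coupling_forward(x1, x2):
    s, t = scale_and_shift(x1)
    y2 = x2 * np.exp(s) + t
    return x1, y2, s.sum()            # log|det Jacobian| = sum of the log-scales

def coupling_inverse(y1, y2):
    s, t = scale_and_shift(y1)
    return y1, (y2 - t) * np.exp(-s)

x1, x2 = np.array([0.3, -1.2]), np.array([0.7, 0.1])
y1, y2, logdet = coupling_forward(x1, x2)
print(np.allclose(coupling_inverse(y1, y2)[1], x2))   # True: exact inversion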

Blockwise Parallel Decoding for Deep Autoregressive Models

This work proposes a novel blockwise parallel decoding scheme in which predictions for multiple time steps are made in parallel and the model then backs off to the longest prefix validated by a scoring model, allowing substantial theoretical improvements in generation speed for architectures that can process output sequences in parallel.

Stochastic Backpropagation and Approximate Inference in Deep Generative Models

We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning.

WaveNet: A Generative Model for Raw Audio

WaveNet, a deep neural network for generating raw audio waveforms, is introduced; it is shown that it can be efficiently trained on data with tens of thousands of samples per second of audio, and can be employed as a discriminative model, returning promising results for phoneme recognition.
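
The tractability on long raw-audio sequences comes from dilated causal convolutions whose receptive field grows exponentially with depth; a small worked example (kernel size 2, with a doubling dilation pattern assumed for illustration):

dilations = [1, 2, 4, 8, 16, 32, 64, 128, 256, 512]
receptive_field = 1 + sum(dilations)   # each kernel-size-2 layer adds its dilation
print(receptive_field)                 # 1024 input samples covered by one stack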