• Corpus ID: 247058391

# Efficient CDF Approximations for Normalizing Flows

@article{ShamaSastry2022EfficientCA,
  title={Efficient CDF Approximations for Normalizing Flows},
  author={Chandramouli Shama Sastry and Andreas M. Lehrmann and Marcus A. Brubaker and Alexander Radovic},
  journal={ArXiv},
  year={2022},
  volume={abs/2202.11322}
}
• Published 23 February 2022
• Mathematics, Computer Science
• ArXiv
Normalizing flows model a complex target distribution in terms of a bijective transform operating on a simple base distribution. As such, they enable tractable computation of a number of important statistical quantities, particularly likelihoods and samples. Despite these appealing properties, the computation of more complex inference tasks, such as the cumulative distribution function (CDF) over a complex region (e.g., a polytope), remains challenging. Traditional CDF approximations using Monte…
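To make the abstract's setup concrete, here is a minimal sketch (not the paper's method) of the two tractable operations a flow provides: exact log-densities via the change-of-variables formula, and samples by pushing base draws through the transform. It also shows the naive Monte Carlo CDF estimate over a region that the paper aims to improve upon. A one-dimensional affine flow with illustrative parameters `a`, `b` stands in for a learned flow; all names here are assumptions for the sketch.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Toy "flow": y = a * z + b with base distribution z ~ N(0, 1).
a, b = 2.0, 1.0  # illustrative parameters, not learned

def log_density(y):
    # Change of variables: log p_Y(y) = log p_Z(z) - log|dy/dz|, with z = (y - b) / a.
    z = (y - b) / a
    log_pz = -0.5 * (z**2 + np.log(2 * np.pi))
    return log_pz - np.log(abs(a))

# Sampling: push base samples through the transform.
z = rng.standard_normal(100_000)
y = a * z + b

# Naive Monte Carlo CDF over a region, e.g. P(0 <= Y <= 2):
# count the fraction of samples that land inside the region.
mc_cdf = np.mean((y >= 0) & (y <= 2))

# Exact value for this toy flow, for comparison: Phi((2-b)/a) - Phi((0-b)/a).
phi = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
exact = phi((2 - b) / a) - phi((0 - b) / a)
print(mc_cdf, exact)
```

In one dimension the exact answer is available in closed form, but for a high-dimensional flow and a region such as a polytope no closed form exists, and the Monte Carlo estimate above is exactly the baseline the abstract calls unsuited for more complex inference.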
