Corpus ID: 247058391

Efficient CDF Approximations for Normalizing Flows

@article{ShamaSastry2022EfficientCA,
  title={Efficient CDF Approximations for Normalizing Flows},
  author={Chandramouli Shama Sastry and Andreas M. Lehrmann and Marcus A. Brubaker and Alexander Radovic},
  journal={ArXiv},
  year={2022},
  volume={abs/2202.11322}
}
Normalizing flows model a complex target distribution in terms of a bijective transform operating on a simple base distribution. As such, they enable tractable computation of a number of important statistical quantities, particularly likelihoods and samples. Despite these appealing properties, the computation of more complex inference tasks, such as the cumulative distribution function (CDF) over a complex region (e.g., a polytope), remains challenging. Traditional CDF approximations using Monte Carlo…
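
To make the abstract's setup concrete, the sketch below shows the traditional Monte Carlo approach to such a CDF query: samples from the base distribution are pushed through a bijection, and the fraction landing inside a polytope is counted. The affine map and the box-shaped region are hypothetical stand-ins for a trained flow and a query region, not the estimator proposed in this paper.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a trained flow: an affine bijection x = A z + b.
# (A real normalizing flow composes many learned invertible layers.)
A = np.array([[1.5, 0.3], [0.0, 0.8]])
b = np.array([0.5, -1.0])

def flow_forward(z):
    return z @ A.T + b

def in_polytope(x, normals, offsets):
    # x lies in the polytope {x : normals @ x <= offsets} (intersection of half-spaces).
    return np.all(x @ normals.T <= offsets, axis=-1)

# Axis-aligned box [0, 2] x [-2, 0], written as four half-space constraints.
normals = np.array([[1., 0.], [-1., 0.], [0., 1.], [0., -1.]])
offsets = np.array([2., 0., 0., 2.])

# Naive Monte Carlo CDF estimate: sample the base distribution, push the
# samples through the flow, and count how many land inside the region.
z = rng.standard_normal((100_000, 2))
x = flow_forward(z)
print("estimated P(X in region):", in_polytope(x, normals, offsets).mean())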

References

Showing 1–10 of 31 references

Masked Autoregressive Flow for Density Estimation

This work describes an approach, called Masked Autoregressive Flow, for increasing the flexibility of an autoregressive model by modelling the random numbers that the model uses internally when generating data.
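
As a rough illustration of the mechanism this summary describes, the sketch below implements a single affine autoregressive layer in which each dimension is transformed conditioned on the ones before it; density evaluation simply inverts the transform, while sampling proceeds dimension by dimension. The fixed linear conditioner is a hypothetical stand-in for the MADE network used in the actual Masked Autoregressive Flow.

import numpy as np

def conditioner(x_prev):
    # Hypothetical conditioner: maps the preceding dimensions to (mu, log_sigma).
    return 0.5 * x_prev.sum(), 0.1 * x_prev.sum()

def log_prob(x):
    # Invert the transform dimension by dimension: u_i = (x_i - mu_i) * exp(-log_sigma_i).
    u = np.zeros_like(x)
    log_det = 0.0
    for i in range(len(x)):
        mu, log_sigma = conditioner(x[:i])
        u[i] = (x[i] - mu) * np.exp(-log_sigma)
        log_det -= log_sigma
    # log p(x) = log N(u; 0, I) + log |det du/dx|
    return -0.5 * (u @ u + len(x) * np.log(2 * np.pi)) + log_det

def sample(rng, dim=3):
    # Sampling is sequential: each x_i depends on the previously generated dimensions.
    x = np.zeros(dim)
    for i in range(dim):
        mu, log_sigma = conditioner(x[:i])
        x[i] = rng.standard_normal() * np.exp(log_sigma) + mu
    return x

rng = np.random.default_rng(0)
x = sample(rng)
print("sample:", x, "log-density:", log_prob(x))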

Normalizing Flows for Probabilistic Modeling and Inference

This review places special emphasis on the fundamental principles of flow design, discusses foundational topics such as expressive power and computational trade-offs, and summarizes the use of flows for tasks such as generative modeling, approximate inference, and supervised learning.

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models

This paper uses Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving state-of-the-art results among exact-likelihood methods with efficient sampling.
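
The key ingredient named here, Hutchinson's trace estimator, can be illustrated in isolation: tr(A) equals the expectation of v^T A v for random probe vectors v with identity covariance. The explicit matrix below is only for demonstration; FFJORD applies the same estimator to the Jacobian of a neural ODE via vector-Jacobian products rather than to a stored matrix.

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))

def hutchinson_trace(A, num_samples=1000):
    # Rademacher probes: E[v v^T] = I, so E[v^T A v] = tr(A).
    v = rng.choice([-1.0, 1.0], size=(num_samples, A.shape[0]))
    return np.einsum('nd,de,ne->n', v, A, v).mean()

print("exact trace:    ", np.trace(A))
print("estimated trace:", hutchinson_trace(A))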

Glow: Generative Flow with Invertible 1x1 Convolutions

Glow, a simple type of generative flow using an invertible 1x1 convolution, is proposed; it demonstrates that a generative model optimized for the plain log-likelihood objective is capable of efficient, realistic-looking synthesis and manipulation of large images.
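
A small numerical sketch of the invertible 1x1 convolution: every pixel's channel vector is multiplied by the same C x C weight matrix, so the layer is invertible whenever that matrix is and contributes H * W * log|det(weight)| to the log-likelihood. The random weight below is just for illustration; the rotation-matrix initialization and LU-decomposed parameterization used in the paper are omitted.

import numpy as np

rng = np.random.default_rng(0)
C, H, W = 4, 8, 8
weight = rng.standard_normal((C, C))  # almost surely invertible for this sketch
x = rng.standard_normal((H, W, C))

y = x @ weight.T                     # forward pass: mix channels at every pixel
x_rec = y @ np.linalg.inv(weight).T  # inverse pass: undo the channel mixing
log_det = H * W * np.log(abs(np.linalg.det(weight)))

print("reconstruction error:  ", np.max(np.abs(x - x_rec)))
print("log|det| contribution: ", log_det)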

Normalizing Flows: An Introduction and Review of Current Methods

The goal of this survey article is to give a coherent and comprehensive review of the literature on the construction and use of normalizing flows for distribution learning, providing context for and explanations of the models.

DFAC Framework: Factorizing the Value Function via Quantile Mixture for Multi-Agent Distributional Q-Learning

A Distributional Value Function Factorization (DFAC) framework is proposed to generalize expected value function factorization methods to their distributional variants; it extends the individual utility functions from deterministic variables to random variables and models the quantile function of the total return as a quantile mixture.
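
The quantile mixture mentioned here combines individual quantile functions with non-negative weights that sum to one, Q_mix(tau) = sum_i w_i * Q_i(tau). The sketch below illustrates that generic construction on empirical quantiles of two hypothetical per-agent return distributions; it is not the DFAC factorization or its training procedure.

import numpy as np

rng = np.random.default_rng(0)
taus = np.linspace(0.05, 0.95, 19)

# Hypothetical per-agent return samples standing in for two agents' utilities.
returns_a = rng.normal(1.0, 0.5, size=10_000)
returns_b = rng.normal(2.0, 1.5, size=10_000)

# Empirical quantile functions evaluated on a grid of quantile levels.
q_a = np.quantile(returns_a, taus)
q_b = np.quantile(returns_b, taus)

# Quantile mixture: a weighted combination of the per-agent quantile functions.
weights = np.array([0.3, 0.7])
q_mix = weights[0] * q_a + weights[1] * q_b
print(np.column_stack([taus, q_mix]))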

A Review of Uncertainty Quantification in Deep Learning: Techniques, Applications and Challenges

Robust Correction of Sampling Bias Using Cumulative Distribution Functions

This work presents a new method for handling covariate shift using empirical cumulative distribution function estimates of the target distribution, based on a rigorous generalization of a recent idea proposed by Vapnik and Izmailov.

Flexible Approximate Inference via Stratified Normalizing Flows

An approximate inference procedure is developed that allows explicit control of the bias/variance tradeoff, interpolating between the sampling and variational regimes, and that uses a normalizing flow to map the integrand onto a uniform distribution.
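
A minimal sketch of the underlying idea: if a tractable density q approximates the normalized integrand, importance sampling with weights f(x)/q(x) estimates the integral with low variance, and a normalizing flow fitted to the integrand would make those weights nearly constant. Below, a Gaussian proposal stands in for the fitted flow and the 1-D integrand has a known integral; the stratification scheme of the paper is not reproduced.

import numpy as np

rng = np.random.default_rng(0)
scale = 1.2  # proposal width; a fitted flow would adapt this to the integrand

def f(x):
    # Unnormalized integrand; its exact integral over the real line is sqrt(2*pi).
    return np.exp(-0.5 * x**2)

def q_pdf(x):
    # Density of the Gaussian proposal standing in for a normalizing flow.
    return np.exp(-0.5 * (x / scale) ** 2) / (scale * np.sqrt(2 * np.pi))

x = rng.normal(0.0, scale, size=100_000)
estimate = np.mean(f(x) / q_pdf(x))
print("importance-sampling estimate:", estimate)
print("exact value:                 ", np.sqrt(2 * np.pi))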

Variable Skipping for Autoregressive Range Density Estimation

This paper shows that variable skipping provides 10–100× efficiency improvements when targeting challenging high-quantile error metrics, enables complex applications such as text pattern matching, and can be realized via a simple data augmentation procedure without changing the usual maximum likelihood objective.