# Cubic-Spline Flows

```bibtex
@article{Durkan2019CubicSplineF,
  title   = {Cubic-Spline Flows},
  author  = {Conor Durkan and Artur Bekasov and Iain Murray and George Papamakarios},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1906.02145}
}
```

A normalizing flow models a complex probability density as an invertible transformation of a simple density. The invertibility means that we can evaluate densities and generate samples from a flow. In practice, autoregressive flow-based models are slow to invert, making either density estimation or sample generation slow. Flows based on coupling transforms are fast for both tasks, but have previously performed less well at density estimation than autoregressive flows. We stack a new coupling…
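The speed asymmetry described above comes from the structure of a coupling transform: half the input passes through unchanged and parameterizes an elementwise map of the other half, so both directions cost one network pass and the Jacobian is triangular. A minimal NumPy sketch of an *affine* coupling transform (the simplest instance; the toy `scale_net`/`shift_net` callables stand in for learned networks and are illustrative, not from the paper):

```python
import numpy as np

def coupling_forward(x, scale_net, shift_net):
    """Affine coupling: the first half of x passes through unchanged and
    parameterizes an elementwise affine map of the second half."""
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    log_s, t = scale_net(x1), shift_net(x1)
    y2 = x2 * np.exp(log_s) + t
    # log|det J| is just the sum of log-scales -- no Jacobian matrix needed
    log_det = np.sum(log_s, axis=-1)
    return np.concatenate([x1, y2], axis=-1), log_det

def coupling_inverse(y, scale_net, shift_net):
    """Exact inverse at the same cost as the forward pass, unlike
    autoregressive flows, which invert one dimension at a time."""
    d = y.shape[-1] // 2
    y1, y2 = y[..., :d], y[..., d:]
    log_s, t = scale_net(y1), shift_net(y1)
    x2 = (y2 - t) * np.exp(-log_s)
    return np.concatenate([y1, x2], axis=-1)
```

The cubic-spline (and later rational-quadratic-spline) flows replace the elementwise affine map with a more flexible monotonic spline while keeping this same coupling structure.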

## 24 Citations

Neural Spline Flows

- Computer Science, Mathematics · NeurIPS
- 2019

This work proposes a fully-differentiable module based on monotonic rational-quadratic splines, which enhances the flexibility of both coupling and autoregressive transforms while retaining analytic invertibility, and demonstrates that neural spline flows improve density estimation, variational inference, and generative modeling of images.
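The monotonic rational-quadratic segments underlying this follow-up work can be evaluated in closed form (the Gregory–Delbourgo interpolant used by neural spline flows). A single-bin sketch, with hypothetical knot arguments chosen here for illustration:

```python
import numpy as np

def rq_bin(x, xk, xk1, yk, yk1, dk, dk1):
    """Monotonic rational-quadratic segment on [xk, xk1] mapping to [yk, yk1].
    dk, dk1 > 0 are the derivatives at the knots; sk is the bin slope."""
    sk = (yk1 - yk) / (xk1 - xk)
    xi = (x - xk) / (xk1 - xk)               # position within the bin, in [0, 1]
    num = sk * xi**2 + dk * xi * (1 - xi)
    den = sk + (dk1 + dk - 2 * sk) * xi * (1 - xi)
    return yk + (yk1 - yk) * num / den
```

The segment passes through both knots and is strictly increasing whenever the knot derivatives are positive, which is why the transform stays analytically invertible: solving for `x` given `y` reduces to a quadratic in `xi`.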

Invertible Generative Modeling using Linear Rational Splines

- Computer Science, Mathematics · AISTATS
- 2020

This paper explores linear rational splines, which admit a straightforward inverse, as a replacement for the affine transformations used in coupling layers; results demonstrate that this approach is competitive with existing methods.

The Convolution Exponential and Generalized Sylvester Flows

- 2020

This paper introduces a new method to build linear flows, by taking the exponential of a linear transformation. This linear transformation does not need to be invertible itself, and the exponential…

Efficient sampling generation from explicit densities via Normalizing Flows

- Computer Science, Mathematics · ArXiv
- 2020

This work presents a method based on normalizing flows, proposing a solution for the common problem of exploding reverse Kullback-Leibler divergence caused by the target density taking the value 0 in regions reached by the flow transformation.

Dimensionality Reduction Flows

- Mathematics, Computer Science · ArXiv
- 2019

This work proposes methods to reduce the latent-space dimension of flow models via likelihood-contribution-based factorization of dimensions, and proposes a data-dependent factorization scheme that is more efficient than the static counterparts in prior works.

An introduction to variational inference in geophysical inverse problems

- Computer Science · Inversion of Geophysical Data
- 2021

From the chapter's notation list: ADVI, automatic differentiation variational inference; c, a subset of variables (clique); C, a set of cliques, i.e., c ∈ C; det, determinant; d_obs, observed data vector; ELBO, evidence lower bound; EM…

Likelihood Contribution based Multi-scale Architecture for Generative Flows

- Computer Science
- 2019

A novel multi-scale architecture that performs data dependent factorization to decide which dimensions should pass through more flow layers is proposed and a heuristic based on the contribution of each dimension to the total log-likelihood which encodes the importance of the dimensions is introduced.

Normalizing Flows: Introduction and Ideas

- Mathematics, Computer Science · ArXiv
- 2019

A Normalizing Flow (NF) is a family of generative models which produces tractable distributions where both sampling and density evaluation can be efficient and exact.
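The tractable density evaluation mentioned here rests on the change-of-variables formula, log p_x(x) = log p_z(f(x)) + log|det J_f(x)|. A one-layer sketch where the "flow" is a standardizing affine map to a unit Gaussian base (the parameter values are arbitrary placeholders):

```python
import numpy as np

def flow_log_density(x, mu=1.5, sigma=0.7):
    """log p_x(x) under a one-layer flow: z = (x - mu) / sigma maps x to a
    standard-normal base density, and log|det J| = -log(sigma)."""
    z = (x - mu) / sigma
    log_base = -0.5 * z**2 - 0.5 * np.log(2 * np.pi)  # standard normal log-density
    log_det = -np.log(sigma)                           # dz/dx = 1 / sigma
    return log_base + log_det
```

Stacking invertible layers just adds their log|det J| terms, which is exactly what makes both density evaluation and sampling exact for flows.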

Multi-Asset Spot and Option Market Simulation

- Computer Science, Economics · ArXiv
- 2021

This work addresses the high-dimensionality of market observed call prices through an arbitrage-free autoencoder that approximates efficient low-dimensional representations of the prices while maintaining no static arbitrage in the reconstructed surface.

SBI - A toolkit for simulation-based inference

- Computer Science, Biology · J. Open Source Softw.
- 2020

A PyTorch-based package implementing neural-network-based SBI algorithms that facilitates inference on black-box simulators for practising scientists and engineers, providing a unified interface to state-of-the-art algorithms together with documentation and tutorials.

## References

Showing 1–10 of 32 references

Masked Autoregressive Flow for Density Estimation

- Computer Science, Mathematics · NIPS
- 2017

This work describes an approach for increasing the flexibility of an autoregressive model, based on modelling the random numbers that the model uses internally when generating data, which is called Masked Autoregressive Flow.

Block Neural Autoregressive Flow

- Mathematics, Computer Science · UAI
- 2019

Normalising flows (NFs) map two density functions via a differentiable bijection whose Jacobian determinant can be computed efficiently. Recently, as an alternative to hand-crafted bijections, Huang…

Neural Autoregressive Flows

- Computer Science, Mathematics · ICML
- 2018

It is demonstrated that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions.

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models

- Computer Science, Mathematics · ICLR
- 2019

This paper uses Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
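Hutchinson's trace estimator, on which FFJORD's log-density computation relies, is simple to state: tr(A) = E[vᵀAv] for random probe vectors v with zero mean and identity covariance. A NumPy sketch using Rademacher probes (FFJORD applies the same identity to the Jacobian of the ODE dynamics, where only Jacobian-vector products, not the full matrix, are affordable; the dense-matrix version here is just for illustration):

```python
import numpy as np

def hutchinson_trace(A, n_samples=20000, seed=None):
    """Unbiased Monte Carlo estimate of tr(A) via E[v^T A v]
    with Rademacher (+/-1) probe vectors v."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    v = rng.choice([-1.0, 1.0], size=(n_samples, n))  # Rademacher probes
    # one quadratic form v^T A v per sample, averaged
    return np.mean(np.einsum('si,ij,sj->s', v, A, v))
```

The estimator's cost per sample is one matrix-vector product, which is what makes the divergence term in a continuous flow scale to high dimensions.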

Improved Variational Inference with Inverse Autoregressive Flow

- Mathematics, Computer Science · NIPS 2016
- 2017

A new type of normalizing flow, inverse autoregressive flow (IAF), is proposed that, in contrast to earlier published flows, scales well to high-dimensional latent spaces and significantly improves upon diagonal Gaussian approximate posteriors.

Autoregressive Energy Machines

- Computer Science, Mathematics · ICML
- 2019

The Autoregressive Energy Machine is proposed: an energy-based model that simultaneously learns an unnormalized density and computes an importance-sampling estimate of the normalizing constant for each conditional in an autoregressive decomposition, achieving state-of-the-art performance on a suite of density-estimation tasks.

Transformation Autoregressive Networks

- Computer Science, Mathematics · ICML
- 2018

This work systematically characterizes methods for density estimation, proposes multiple novel methods to model non-Markovian dependencies, and introduces a data-driven framework for learning a family of distributions.

NICE: Non-linear Independent Components Estimation

- Computer Science, Mathematics · ICLR
- 2015

We propose a deep learning framework for modeling complex high-dimensional densities called Non-linear Independent Component Estimation (NICE). It is based on the idea that a good representation is…

Sylvester Normalizing Flows for Variational Inference

- Computer Science, Mathematics · UAI
- 2018

Sylvester normalizing flows remove the well-known single-unit bottleneck from planar flows, making a single transformation much more flexible, and are compared against planar flows and inverse autoregressive flows.

Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows

- Computer Science, Mathematics · AISTATS
- 2019

It is shown that SNL is more robust, more accurate and requires less tuning than related neural-based methods, and diagnostics for assessing calibration, convergence and goodness-of-fit are discussed.