Corpus ID: 219176997

OT-Flow: Fast and Accurate Continuous Normalizing Flows via Optimal Transport

@inproceedings{Onken2021OTFlowFA,
  title={OT-Flow: Fast and Accurate Continuous Normalizing Flows via Optimal Transport},
  author={Derek Onken and Samy Wu Fung and Xingjian Li and Lars Ruthotto},
  booktitle={AAAI},
  year={2021}
}
A normalizing flow is an invertible mapping between an arbitrary probability distribution and a standard normal distribution; it can be used for density estimation and statistical inference. Computing the flow follows the change of variables formula and thus requires invertibility of the mapping and an efficient way to compute the determinant of its Jacobian. To satisfy these requirements, normalizing flows typically consist of carefully chosen components. Continuous normalizing flows (CNFs… 
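As a sketch of the requirement described above, the change of variables formula (standard notation, not quoted from the paper) ties the data density p_x to the standard normal density p_z through the invertible map f:

\log p_x(x) = \log p_z(f(x)) + \log \left| \det \nabla f(x) \right|

Evaluating the log-determinant term is the computational bottleneck: for a generic map in d dimensions it costs O(d^3), which is why flow architectures are built from components whose Jacobians have cheap determinants.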

Citations

Second-Order Neural ODE Optimizer
TLDR
A low-rank representation of the second-order derivatives is explored, and it is shown to lead to efficient preconditioned updates with the aid of Kronecker-based factorization, strengthening the optimal control (OC) perspective as a principled tool for analyzing optimization in deep learning.
Generative Modeling for Healthcare Applications and Energy Demand Response with Normalizing Flows
TLDR
Normalizing flows are applied to generative modeling tasks in healthcare and in energy demand response, with positive results reported in both application domains.
Jacobian Determinant of Normalizing Flows
TLDR
It is shown that the Jacobian determinant mapping is unique for the given distributions, hence the likelihood objective of flows has a unique global optimum.
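Rearranging the change of variables formula above makes this concrete: any flow f transporting p_x to the latent density p_z must satisfy

\left| \det \nabla f(x) \right| = \frac{p_x(x)}{p_z(f(x))}

so the Jacobian determinant is pinned down pointwise by the two densities (a standard rearrangement, restated here in our notation).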
Sliced Iterative Normalizing Flows
TLDR
An iterative (greedy) deep learning algorithm which is able to transform an arbitrary probability distribution function (PDF) into the target PDF and two Sliced Iterative Normalizing Flows (SINF) are introduced, which map from the data to the latent space (GIS) and vice versa (SIG).
An introduction to deep generative modeling
TLDR
Deep generative models (DGMs) are introduced along with a concise mathematical framework for the three most popular approaches: normalizing flows, variational autoencoders, and generative adversarial networks; the advantages and disadvantages of these basic approaches are illustrated using numerical experiments.
Neural Lagrangian Schrödinger Bridge
Population dynamics is the study of temporal and spatial variation in the size of populations of organisms and is a major part of population ecology. One of the main difficulties in analyzing …
TO-FLOW: Efficient Continuous Normalizing Flows with Temporal Optimization adjoint with Moving Speed
TLDR
A temporal optimization is proposed in which the evolution time for forward propagation of the neural ODE is optimized alternately with the network training by coordinate descent; the method can be used in conjunction with the original regularization approach.
A Neural Network Approach for Real-Time High-Dimensional Optimal Control
TLDR
A neural network approach is presented for solving high-dimensional optimal control problems arising in real-time applications; it fuses the Hamilton-Jacobi-Bellman (HJB) and Pontryagin Maximum Principle approaches by parameterizing the value function with an NN, and the number of parameters is empirically observed to scale linearly with the dimension of the control problem, thereby mitigating the curse of dimensionality.
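For reference, a standard form of the HJB equation invoked there (our restatement): with value function \Phi, dynamics f, running cost L, and terminal cost G,

-\partial_t \Phi(x,t) = \min_u \left\{ L(x,u) + \nabla_x \Phi(x,t)^\top f(x,u) \right\}, \qquad \Phi(x,T) = G(x).

Parameterizing \Phi with a neural network sidesteps discretizing this PDE on a grid, which is what makes the high-dimensional setting tractable.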
N-ODE Transformer: A Depth-Adaptive Variant of the Transformer Using Neural Ordinary Differential Equations
TLDR
It is found that the depth-adaptivity of the N-ODE Transformer does not provide a remedy for the inherently nonlocal nature of the parity problem, and explanations for why this is so are provided.
A Neural Network Approach for High-Dimensional Optimal Control Applied to Multiagent Path Finding
TLDR
A neural network approach is presented that yields approximate solutions to high-dimensional optimal control problems; the number of parameters is empirically observed to scale linearly with the dimension of the control problem, thereby mitigating the curse of dimensionality.

References

SHOWING 1-10 OF 73 REFERENCES
How to Train Your Neural ODE: the World of Jacobian and Kinetic Regularization
TLDR
This paper introduces a theoretically-grounded combination of optimal transport and stability regularizations that encourages neural ODEs to prefer simpler dynamics out of all the dynamics that solve a problem well, leading to faster convergence, fewer solver discretizations, and considerably lower wall-clock time without loss in performance.
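A hedged sketch of the penalty this line of work (and OT-Flow itself) uses: with dynamics f(z,t;\theta) driving trajectories z(t) over t \in [0,1], the optimal transport (kinetic energy) regularizer is

\int_0^1 \left\| f(z(t), t; \theta) \right\|^2 dt

which, among all dynamics that fit the target density equally well, favors straight, constant-speed trajectories that ODE solvers can follow with few steps.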
Normalizing Flows for Probabilistic Modeling and Inference
TLDR
This review places special emphasis on the fundamental principles of flow design, and discusses foundational topics such as expressive power and computational trade-offs, and summarizes the use of flows for tasks such as generative modeling, approximate inference, and supervised learning.
Unconstrained Monotonic Neural Networks
TLDR
This work proposes the Unconstrained Monotonic Neural Network (UMNN) architecture based on the insight that a function is monotonic as long as its derivative is strictly positive and demonstrates the ability of UMNNs to improve variational inference.
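The insight in the UMNN entry admits a compact sketch: any network g with strictly positive output yields a strictly increasing, hence invertible, function F(x) = \int_0^x g(t) dt. The Python below is illustrative only; the function names and the fixed-step trapezoidal quadrature are our assumptions, not the authors' implementation, which uses a more careful numerical integration scheme.

import numpy as np

def positive_net(t, w1, b1, w2, b2):
    # Tiny MLP forced to be strictly positive via softplus; a hypothetical
    # stand-in for the learned integrand g(t) > 0.
    h = np.tanh(np.outer(t, w1) + b1)             # (n_steps, hidden)
    return np.logaddexp(0.0, h @ w2 + b2) + 1e-6  # softplus(.) > 0

def monotone_f(x, params, n_steps=129):
    # F(x) = integral of g from 0 to x, by the trapezoidal rule.
    # Since g > 0 everywhere, F is strictly increasing and invertible.
    t = np.linspace(0.0, x, n_steps)
    g = positive_net(t, *params)
    dt = t[1] - t[0]
    return dt * (0.5 * g[0] + g[1:-1].sum() + 0.5 * g[-1])

rng = np.random.default_rng(0)
params = (rng.normal(size=8), rng.normal(size=8), rng.normal(size=8), 0.1)
assert monotone_f(2.0, params) > monotone_f(1.0, params)  # monotonicity

Inverting F then reduces to a one-dimensional root find, which keeps the construction tractable.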
Computational Optimal Transport
TLDR
This short book reviews OT with a bias toward numerical methods and their applications in data sciences, and sheds light on the theoretical properties of OT that make it particularly useful for some of these applications.
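For context, the discrete problem at the heart of that review (standard Kantorovich form, our notation): given histograms a and b and a ground-cost matrix C, find a nonnegative coupling P minimizing the total transport cost,

\min_{P \ge 0} \; \langle P, C \rangle \quad \text{s.t.} \quad P\mathbf{1} = a, \; P^\top \mathbf{1} = b.

OT-Flow builds on the dynamical (fluid-flow) formulation of optimal transport instead, in which mass moves along trajectories of minimal kinetic energy.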
FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models
TLDR
This paper uses Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
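The estimator named in the FFJORD entry rests on the identity tr(A) = E[v^T A v] for random v with E[v v^T] = I. A minimal Python sketch (the function is ours; FFJORD applies the same idea to vector-Jacobian products rather than an explicit matrix):

import numpy as np

def hutchinson_trace(matvec, dim, n_samples=20_000, seed=0):
    # Unbiased trace estimate using only products v -> A v.
    # Rademacher probes satisfy E[v v^T] = I, so E[v^T A v] = tr(A).
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=dim)
        total += v @ matvec(v)
    return total / n_samples

A = np.diag([1.0, 2.0, 3.0]) + 0.1
print(hutchinson_trace(lambda v: A @ v, dim=3), np.trace(A))  # both near 6.3

In a CNF, matvec is a vector-Jacobian product supplied by automatic differentiation, so the Jacobian of the dynamics never has to be formed explicitly.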
Neural Ordinary Differential Equations
TLDR
This work shows how to scalably backpropagate through any ODE solver, without access to its internal operations, which allows end-to-end training of ODEs within larger models.
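Two standard formulas behind this entry, restated in our notation: in a continuous normalizing flow, the state and log-density evolve jointly,

\frac{dz}{dt} = f(z, t; \theta), \qquad \frac{d \log p(z(t))}{dt} = -\operatorname{tr}\!\left( \frac{\partial f}{\partial z} \right),

and gradients are obtained without storing solver internals by integrating the adjoint a(t) = \partial L / \partial z(t) backward in time via \dot{a} = -a^\top \, \partial f / \partial z.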
Continuous-Time Flows for Efficient Inference and Density Estimation
TLDR
This paper proposes the concept of continuous-time flows (CTFs), a family of diffusion-based methods that are able to asymptotically approach a target distribution and demonstrates promising performance of the proposed CTF framework, compared to related techniques.
Maximum Principle Based Algorithms for Deep Learning
TLDR
The continuous dynamical systems approach to deep learning is explored in order to devise alternative frameworks for training algorithms using Pontryagin's maximum principle, demonstrating a favorable initial per-iteration convergence rate, provided the Hamiltonian maximization can be carried out efficiently.
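For reference, the optimality conditions used there (standard statement for a terminal-loss setting, our restatement): with state x, costate p, parameters \theta, and Hamiltonian H(x, p, \theta) = p^\top f(x, \theta),

\dot{x}^* = \nabla_p H, \qquad \dot{p}^* = -\nabla_x H, \qquad \theta^* \in \arg\max_\theta H(x^*, p^*, \theta).

The resulting algorithms alternate a forward state solve, a backward costate solve, and a Hamiltonian maximization step.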
Density estimation using Real NVP
TLDR
This work extends the space of probabilistic models using real-valued non-volume preserving (real NVP) transformations, a set of powerful invertible and learnable transformations, resulting in an unsupervised learning algorithm with exact log-likelihood computation, exact sampling, exact inference of latent variables, and an interpretable latent space.
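The invertible transformations in the Real NVP entry are affine coupling layers. A minimal Python sketch follows; the names and the toy conditioner are our assumptions, not the original convolutional architecture.

import numpy as np

def coupling_forward(x, conditioner):
    # Split the input; transform the second half conditioned on the first.
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    s, t = conditioner(x1)                  # log-scale and shift from x1 only
    y2 = x2 * np.exp(s) + t
    log_det = s.sum(axis=-1)                # triangular Jacobian: sum of log-scales
    return np.concatenate([x1, y2], axis=-1), log_det

def coupling_inverse(y, conditioner):
    # Exact inverse: y1 equals x1, so the same s and t are recoverable.
    d = y.shape[-1] // 2
    y1, y2 = y[..., :d], y[..., d:]
    s, t = conditioner(y1)
    return np.concatenate([y1, (y2 - t) * np.exp(-s)], axis=-1)

# Hypothetical toy conditioner standing in for a neural network.
rng = np.random.default_rng(0)
W_s, W_t = rng.normal(size=(2, 2)), rng.normal(size=(2, 2))
conditioner = lambda x1: (np.tanh(x1 @ W_s), x1 @ W_t)

x = rng.normal(size=(5, 4))
y, log_det = coupling_forward(x, conditioner)
assert np.allclose(coupling_inverse(y, conditioner), x)  # invertibility holds

Because the Jacobian of this map is triangular, its log-determinant is just the sum of the log-scales, which is what makes exact log-likelihood computation cheap.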