Taming hyperparameter tuning in continuous normalizing flows using the JKO scheme
@article{Vidal2022TamingHT,
  title   = {Taming hyperparameter tuning in continuous normalizing flows using the JKO scheme},
  author  = {Alexander Vidal and Samy Wu Fung and Luis Tenorio and Stanley J. Osher and Levon Nurbekyan},
  journal = {Scientific Reports},
  year    = {2022},
  volume  = {13}
}
A normalizing flow (NF) is a mapping that transforms a chosen probability distribution into a normal distribution. Such flows are a common technique for data generation and density estimation in machine learning and data science. The density estimate obtained with an NF requires the change of variables formula, which involves computing the Jacobian determinant of the NF transformation. In order to compute this determinant tractably, continuous normalizing flows (CNF) estimate the mapping…
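For context, the two identities at play can be written out explicitly (standard formulations, not quoted from the paper). For an invertible map $f$ with base density $p_z$, the change of variables formula reads

\[ \log p_x(x) = \log p_z(f(x)) + \log \left| \det \frac{\partial f}{\partial x}(x) \right|, \]

while a CNF evolves samples through an ODE $\dot z(t) = v(z(t), t)$ and replaces the determinant with a trace via the instantaneous change of variables

\[ \frac{d}{dt} \log p(z(t)) = -\operatorname{tr}\!\left( \frac{\partial v}{\partial z}(z(t), t) \right). \]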
References
Showing 1–10 of 77 references
OT-Flow: Fast and Accurate Continuous Normalizing Flows via Optimal Transport
- Computer Science · AAAI · 2021
The proposed OT-Flow approach tackles two critical computational challenges that limit more widespread use of CNFs, leveraging optimal transport (OT) theory to regularize the CNF and enforce straight trajectories that are easier to integrate.
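A schematic version of such an OT-regularized objective (the symbols here are assumptions of this note, not the paper's exact notation): the maximum-likelihood loss is augmented with a kinetic-energy transport cost that penalizes curved trajectories,

\[ \min_{v} \; \mathbb{E}_{x}\left[ -\log p_{\theta}(x) \right] + \alpha \int_0^1 \mathbb{E}\left[ \tfrac{1}{2} \| v(z(t), t) \|^2 \right] dt, \]

where $v$ is the velocity field defining the CNF and $\alpha > 0$ weights the regularizer; OT-Flow additionally uses a Hamilton–Jacobi–Bellman penalty, which this sketch omits.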
Normalizing Flows for Probabilistic Modeling and Inference
- Computer Science · J. Mach. Learn. Res. · 2021
This review places special emphasis on the fundamental principles of flow design, discusses foundational topics such as expressive power and computational trade-offs, and summarizes the use of flows for tasks such as generative modeling, approximate inference, and supervised learning.
Variational Inference with Normalizing Flows
- Computer Science, Mathematics · ICML · 2015
It is demonstrated that the theoretical advantages of approximate posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provide a clear improvement in the performance and applicability of variational inference.
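As a minimal illustration of the flows used in that work, here is a sketch of a single planar flow layer in PyTorch; the form $f(z) = z + u\,h(w^\top z + b)$ and its log-determinant follow Rezende and Mohamed (2015), but all function and variable names are this sketch's own assumptions.

```python
import torch

def planar_flow(z, u, w, b):
    """One planar flow step f(z) = z + u * tanh(w^T z + b).

    z: (batch, d) samples; u, w: (d,) parameters; b: scalar.
    Returns f(z) and log|det df/dz|, which for this rank-one
    update is log|1 + (u . w) * tanh'(w^T z + b)| per sample.
    Invertibility requires w . u >= -1 (not enforced here).
    """
    lin = z @ w + b                        # (batch,)
    f = z + u * torch.tanh(lin)[:, None]   # (batch, d)
    psi = 1.0 - torch.tanh(lin) ** 2       # tanh'(lin), (batch,)
    logdet = torch.log(torch.abs(1.0 + (u @ w) * psi))
    return f, logdet
```

Stacking several such layers and summing the per-layer log-determinants yields the flow's total change-of-variables correction.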
Optimizing Functionals on the Space of Probabilities with Input Convex Neural Networks
- Computer Science · arXiv · 2021
An approach is proposed that relies on the recently introduced input convex neural networks (ICNN) to parametrize the space of convex functions and thereby approximate the JKO scheme, together with designs for functionals over measures that enjoy convergence guarantees.
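For reference, the JKO (Jordan–Kinderlehrer–Otto) scheme named here, and in the main paper's title, discretizes the Wasserstein gradient flow of an energy functional $F$ into proximal steps (standard formulation):

\[ \rho_{k+1} \in \operatorname*{arg\,min}_{\rho} \; F(\rho) + \frac{1}{2\tau} W_2^2(\rho, \rho_k), \]

where $\tau > 0$ is the step size and $W_2$ is the 2-Wasserstein distance; in this line of work, the ICNN parametrizes a convex potential whose gradient map transports $\rho_k$ in each step.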
Variational Wasserstein gradient flow
- Computer Science · ICML · 2022
This paper proposes a variational formulation of the objective function, expressed as a maximization over a parametric class of functions, to handle objectives that involve the density; the inner-loop updates require only a small batch of samples and scale well with the dimension.
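One standard instance of such a variational formulation (given here as background, not as the paper's exact construction) is the Donsker–Varadhan representation of the KL divergence,

\[ \mathrm{KL}(\rho \,\|\, \nu) = \sup_{g} \; \mathbb{E}_{x \sim \rho}[g(x)] - \log \mathbb{E}_{y \sim \nu}\!\left[ e^{g(y)} \right], \]

which trades explicit densities for a maximization over test functions $g$ that can be estimated from small batches of samples.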
Convex Potential Flows: Universal Probability Distributions with Optimal Transport and Convex Optimization
- Computer Science · ICLR · 2021
This paper introduces Convex Potential Flows (CP-Flow), a natural and efficient parameterization of invertible models inspired by optimal transport (OT) theory, and proves that CP-Flows are universal density approximators and optimal in the OT sense.
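The parameterization rests on Brenier's theorem: an optimal transport map between suitable densities is the gradient of a convex potential. Writing the flow as $T = \nabla \varphi$ for convex $\varphi$, the change of variables becomes (standard identity, with $\varphi$ assumed strictly convex)

\[ \log p_x(x) = \log p_z(\nabla \varphi(x)) + \log \det \nabla^2 \varphi(x), \]

where the Hessian $\nabla^2 \varphi(x)$ is positive definite, so the determinant needs no absolute value.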
On the Convergence and Robustness of Training GANs with Regularized Optimal Transport
- Computer Science · NeurIPS · 2018
This work shows that gradient information for the smoothed Wasserstein GAN formulation, which is based on regularized optimal transport (OT), is computationally cheap to obtain, and hence first-order optimization methods can be applied to minimize this objective.
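The regularization in question is typically entropic smoothing (stated here as general background rather than as this paper's exact formulation):

\[ W_\lambda(\mu, \nu) = \min_{\pi \in \Pi(\mu, \nu)} \int c \, d\pi + \lambda \, \mathrm{KL}(\pi \,\|\, \mu \otimes \nu), \]

which, unlike the unregularized problem, is smooth in its arguments, with gradients recoverable from the Sinkhorn dual potentials.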
Continuous-Time Flows for Efficient Inference and Density Estimation
- Computer Science · ICML · 2018
This paper proposes continuous-time flows (CTFs), a family of diffusion-based methods that asymptotically approach a target distribution, and demonstrates promising performance of the CTF framework compared to related techniques.
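The prototypical diffusion of this kind (a standard example, not necessarily the exact CTF dynamics) is the Langevin SDE

\[ dz_t = -\nabla U(z_t)\, dt + \sqrt{2}\, dW_t, \]

whose stationary distribution is proportional to $e^{-U(z)}$, so running the dynamics long enough asymptotically approaches the target.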
FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models
- Computer Science, Mathematics · ICLR · 2019
This paper uses Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
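Hutchinson's estimator replaces the exact Jacobian trace with randomized vector-Jacobian products. A minimal PyTorch sketch (function and variable names are this sketch's own):

```python
import torch

def hutchinson_trace(f, z, n_samples=1):
    """Unbiased per-sample estimate of tr(df/dz).

    Uses E[eps^T (df/dz) eps] = tr(df/dz) for noise eps with
    identity covariance; each sample costs one vector-Jacobian
    product instead of a full (d x d) Jacobian.
    """
    z = z.detach().requires_grad_(True)
    fz = f(z)                                    # (batch, d)
    trace = torch.zeros(z.shape[0], device=z.device)
    for _ in range(n_samples):
        eps = torch.randn_like(z)                # Rademacher noise also works
        vjp = torch.autograd.grad(fz, z, grad_outputs=eps,
                                  retain_graph=True)[0]  # eps^T (df/dz)
        trace += (vjp * eps).sum(dim=1)
    return trace / n_samples
```

In a CNF, this estimate stands in for the exact trace in the instantaneous change of variables, avoiding the cost of materializing the full Jacobian at each ODE step.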
Neural Spline Flows
- Mathematics · NeurIPS · 2019
This work proposes a fully differentiable module based on monotonic rational-quadratic splines, which enhances the flexibility of both coupling and autoregressive transforms while retaining analytic invertibility, and demonstrates that neural spline flows improve density estimation, variational inference, and generative modeling of images.
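Elementwise monotone transforms such as these splines keep the flow tractable because a coupling layer's Jacobian is triangular, so its log-determinant reduces to a sum of scalar derivatives (standard identity, notation assumed here):

\[ y_{1:k} = x_{1:k}, \qquad y_i = g_{\theta(x_{1:k})}(x_i) \;\; (i > k), \qquad \log \left| \det \frac{\partial y}{\partial x} \right| = \sum_{i > k} \log \left| g'_{\theta}(x_i) \right|, \]

where $g$ is the elementwise monotone map (here a monotonic rational-quadratic spline) whose knot parameters $\theta$ are computed from $x_{1:k}$ only.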