Potential Flow Generator with $L_2$ Optimal Transport Regularity for Generative Models

@article{Yang2020PotentialFG,
  title={Potential Flow Generator with {$L_2$} Optimal Transport Regularity for Generative Models},
  author={Liu Yang and George Em Karniadakis},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  year={2020},
  volume={PP}
}
  • Liu Yang, G. Karniadakis
  • Published 29 August 2019
  • Computer Science, Mathematics, Medicine
  • IEEE Transactions on Neural Networks and Learning Systems
We propose a potential flow generator with L₂ optimal transport regularity, which can be easily integrated into a wide range of generative models, including different versions of generative adversarial networks (GANs) and normalizing flow models. With only a slight augmentation to the original generator loss functions, our generator not only transports the input distribution to the target one but also aims to find the transport map with the minimum L₂ transport cost. We show the effectiveness of our…
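To make the setup concrete: the generator can be viewed as the discretized flow of an ODE whose velocity field is the gradient of a scalar potential, with the accumulated L₂ transport cost added to the usual generator loss. Below is a minimal PyTorch sketch under these assumptions; the class name, network architecture, and forward-Euler discretization are illustrative choices, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

class PotentialFlowGenerator(nn.Module):
    """Sketch: generator as the discretized flow of dx/dt = grad_x phi(x, t),
    with phi a scalar potential network. The accumulated L2 transport cost
    can be added to any GAN/flow generator loss as a regularizer."""

    def __init__(self, dim, hidden=64, n_steps=10):
        super().__init__()
        self.phi = nn.Sequential(                 # phi(x, t): R^{d+1} -> R
            nn.Linear(dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )
        self.n_steps = n_steps

    def velocity(self, x, t):
        # v(x, t) = grad_x phi(x, t), differentiable w.r.t. phi's parameters
        t_col = torch.full((x.shape[0], 1), t, device=x.device)
        phi = self.phi(torch.cat([x, t_col], dim=1)).sum()
        return torch.autograd.grad(phi, x, create_graph=True)[0]

    def forward(self, z):
        x = z.detach().requires_grad_(True)       # enable grad w.r.t. positions
        dt = 1.0 / self.n_steps
        cost = 0.0
        for k in range(self.n_steps):             # forward Euler in time
            v = self.velocity(x, k * dt)
            cost = cost + (v ** 2).sum(dim=1).mean() * dt  # L2 transport cost
            x = x + dt * v
        return x, cost

gen = PotentialFlowGenerator(dim=2)
x_fake, ot_cost = gen(torch.randn(128, 2))
# generator objective: original GAN/flow loss + lambda * ot_cost
```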
Optimal transport mapping via input convex neural networks
TLDR
This approach ensures that the transport mapping the authors find is optimal regardless of how the neural networks are initialized, as the gradient of a convex function naturally models a discontinuous transport mapping.
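As a hedged illustration of that idea, here is a minimal input convex neural network in PyTorch, convex in its input by construction (nonnegative weights on the hidden path via softplus, convex nondecreasing activations), whose gradient serves as the candidate transport map. The layer sizes and parameterization are assumptions for the sketch, not the authors' exact design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ICNN(nn.Module):
    """Sketch of an input convex neural network f(x); the Monge map
    candidate is grad_x f(x)."""

    def __init__(self, dim, hidden=64, depth=3):
        super().__init__()
        self.Wx = nn.ModuleList(
            [nn.Linear(dim, hidden)] +
            [nn.Linear(dim, hidden, bias=False) for _ in range(depth - 1)])
        # raw parameters; softplus makes the effective weights nonnegative
        self.Wz = nn.ParameterList(
            [nn.Parameter(0.1 * torch.randn(hidden, hidden))
             for _ in range(depth - 1)])
        self.out = nn.Parameter(0.1 * torch.randn(1, hidden))

    def forward(self, x):
        z = F.softplus(self.Wx[0](x))             # convex in x
        for Wx, Wz in zip(self.Wx[1:], self.Wz):
            z = F.softplus(F.linear(z, F.softplus(Wz)) + Wx(x))
        return F.linear(z, F.softplus(self.out)).squeeze(-1)

def transport_map(f, x):
    """Gradient of the convex potential, applied to samples."""
    x = x.detach().requires_grad_(True)
    return torch.autograd.grad(f(x).sum(), x, create_graph=True)[0]

f = ICNN(dim=2)
y = transport_map(f, torch.randn(16, 2))
```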
Sparse Flows: Pruning Continuous-depth Models
TLDR
This work designs a framework to decipher the internal dynamics of these continuous-depth models by pruning their network architectures, and empirical results suggest that pruning improves generalization for neural ODEs in generative modeling.
Jacobian Determinant of Normalizing Flows
TLDR
It is shown that the Jacobian determinant mapping is unique for the given distributions, and hence the likelihood objective of normalizing flows has a unique global optimum.
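For reference, the likelihood objective in question is the change-of-variables formula log p_x(x) = log p_z(f(x)) + log|det J_f(x)|. A minimal sketch with a diagonal affine flow, a simplification chosen so the Jacobian determinant is trivial (not from the paper):

```python
import math
import torch

def affine_flow_logprob(x, log_scale, shift):
    """Change of variables for z = exp(log_scale) * x + shift under a
    standard normal base density; log|det J_f| = sum(log_scale)."""
    z = x * torch.exp(log_scale) + shift
    log_det = log_scale.sum()
    base = -0.5 * (z ** 2).sum(dim=1) - 0.5 * x.shape[1] * math.log(2 * math.pi)
    return base + log_det

x = torch.randn(5, 3)
logp = affine_flow_logprob(x, log_scale=torch.zeros(3), shift=torch.zeros(3))
```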
How to train your neural ODE
TLDR
This paper introduces a theoretically grounded combination of optimal transport and stability regularizations that encourages neural ODEs to prefer simpler dynamics out of all the dynamics that solve a problem well, considerably decreasing wall-clock time without loss in performance.
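A minimal sketch of the two regularizers being combined, assuming a PyTorch vector field and a forward-Euler discretization (the paper itself uses adaptive solvers; class and variable names are illustrative): the kinetic energy ∫‖f‖² dt is the optimal transport term, and a Hutchinson estimate of the Jacobian Frobenius norm is the stability term.

```python
import torch
import torch.nn as nn

class RegularizedNeuralODE(nn.Module):
    """Sketch: integrate the state together with both regularizers."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dim + 1, hidden), nn.Tanh(),
                               nn.Linear(hidden, dim))

    def forward(self, x, n_steps=20):
        x = x.detach().requires_grad_(True)
        dt = 1.0 / n_steps
        kinetic, frob = 0.0, 0.0
        for k in range(n_steps):                       # forward Euler
            t = torch.full((x.shape[0], 1), k * dt, device=x.device)
            v = self.f(torch.cat([x, t], dim=1))
            kinetic = kinetic + (v ** 2).sum(dim=1).mean() * dt
            # Hutchinson estimate of ||df/dx||_F^2 via a vector-Jacobian product
            e = torch.randn_like(x)
            eJ = torch.autograd.grad(v, x, grad_outputs=e, create_graph=True)[0]
            frob = frob + (eJ ** 2).sum(dim=1).mean() * dt
            x = x + dt * v
        return x, kinetic, frob
        # training loss: task_loss + lam_k * kinetic + lam_j * frob
```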
Augmented KRnet for density estimation and approximation
TLDR
An augmented KRnet is proposed, including both discrete and continuous models, which can be reformulated as the discretization of a neural ODE; exact invertibility is preserved so that the adjoint method can be formulated with respect to the discretized ODE to obtain the exact gradient.
A Neural Network Approach for High-Dimensional Optimal Control
We propose a neural network approach for solving high-dimensional optimal control problems arising in real-time applications. Our approach yields controls in a feedback form and can therefore handle…
A machine learning framework for solving high-dimensional mean field game and mean field control problems
TLDR
This paper provides a flexible machine learning framework for the numerical solution of potential MFG and MFC models by combining Lagrangian and Eulerian viewpoints and leveraging recent advances from machine learning.
A Neural Network Approach for Real-Time High-Dimensional Optimal Control
We propose a neural network approach for solving high-dimensional optimal control problems arising in real-time applications. Our approach yields controls in a feedback form, where the policy…
Scalable Computation of Monge Maps with General Costs
TLDR
This paper presents a scalable algorithm based on a weak form of the optimal transport problem; it requires only samples from the marginals rather than their analytic expressions, and it can accommodate optimal transport between distributions of different dimensions.
An Introduction to Deep Generative Modeling
TLDR
An introduction to DGMs is provided, together with a concise mathematical framework for the three most popular approaches: normalizing flows (NF), variational autoencoders (VAE), and generative adversarial networks (GAN); their advantages and disadvantages are illustrated using numerical experiments.

References

SHOWING 1-10 OF 29 REFERENCES
Adversarial Computation of Optimal Transport Maps
TLDR
This work proposes a generative adversarial model in which the discriminator's objective is the $2$-Wasserstein metric, and shows that during training the generator follows the $W_2$-geodesic between the initial and the target distributions, reproducing an optimal map at the end of training.
Scalable Unbalanced Optimal Transport using Generative Adversarial Networks
TLDR
This paper presents a scalable method for unbalanced optimal transport (OT) built on the generative-adversarial framework, together with an algorithm that solves the problem via stochastic alternating gradient updates, similar in practice to GANs.
Glow: Generative Flow with Invertible 1x1 Convolutions
TLDR
Glow, a simple type of generative flow using an invertible 1x1 convolution, is proposed, demonstrating that a generative model optimized towards the plain log-likelihood objective is capable of efficient realistic-looking synthesis and manipulation of large images.
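A minimal sketch of that layer in PyTorch (class name assumed; Glow additionally uses an LU parameterization to make the determinant cheap, which is omitted here): a learned channel-mixing matrix W applied at every spatial position, contributing h · w · log|det W| to the flow's log-likelihood.

```python
import torch
import torch.nn as nn

class Invertible1x1Conv(nn.Module):
    """Sketch of a Glow-style invertible 1x1 convolution."""

    def __init__(self, channels):
        super().__init__()
        q, _ = torch.linalg.qr(torch.randn(channels, channels))  # rotation init
        self.W = nn.Parameter(q)

    def forward(self, x):                        # x: (batch, c, h, w)
        _, _, h, w = x.shape
        y = torch.einsum('ij,bjhw->bihw', self.W, x)
        log_det = h * w * torch.slogdet(self.W)[1]  # added to the log-likelihood
        return y, log_det

    def inverse(self, y):
        return torch.einsum('ij,bjhw->bihw', torch.inverse(self.W), y)
```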
Improved Training of Wasserstein GANs
TLDR
This work proposes an alternative to clipping weights: penalizing the norm of the gradient of the critic with respect to its input, which performs better than standard WGAN and enables stable training of a wide variety of GAN architectures with almost no hyperparameter tuning.
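The penalty itself is only a few lines. A minimal PyTorch sketch, assuming flat feature vectors rather than images so the interpolation coefficient broadcasts over a single feature dimension (the coefficient lam=10 is the paper's default):

```python
import torch

def gradient_penalty(critic, real, fake, lam=10.0):
    """WGAN-GP: sample points on lines between real and fake data and
    push the critic's gradient norm toward 1."""
    eps = torch.rand(real.shape[0], 1, device=real.device)
    x_hat = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(critic(x_hat).sum(), x_hat, create_graph=True)[0]
    return lam * ((grad.norm(2, dim=1) - 1) ** 2).mean()
```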
Improving GANs Using Optimal Transport
TLDR
Optimal Transport GAN (OT-GAN) is presented, a variant of generative adversarial nets that minimizes a new metric measuring the distance between the generator distribution and the data distribution, yielding a highly discriminative distance function with unbiased mini-batch gradients.
Generative Modeling Using the Sliced Wasserstein Distance
TLDR
This work considers an alternative formulation for generative modeling based on random projections which, in its simplest form, results in a single objective rather than a saddle-point formulation, and finds the approach to be significantly more stable than even the improved Wasserstein GAN.
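A minimal sketch of the sliced distance (function name illustrative; assumes equal sample sizes so each 1D Wasserstein distance reduces to sorting the projections):

```python
import torch

def sliced_wasserstein(x, y, n_proj=50):
    """Average squared 1D Wasserstein distance over random projections."""
    theta = torch.randn(x.shape[1], n_proj, device=x.device)
    theta = theta / theta.norm(dim=0, keepdim=True)   # unit directions
    px = torch.sort(x @ theta, dim=0).values          # sorted 1D projections
    py = torch.sort(y @ theta, dim=0).values
    return ((px - py) ** 2).mean()
```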
Generative Adversarial Nets
We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a…
Image-to-Image Translation with Conditional Adversarial Networks
TLDR
Conditional adversarial networks are investigated as a general-purpose solution to image-to-image translation problems and it is demonstrated that this approach is effective at synthesizing photos from label maps, reconstructing objects from edge maps, and colorizing images, among other tasks.
DualGAN: Unsupervised Dual Learning for Image-to-Image Translation
TLDR
A novel dual-GAN mechanism is developed that enables image translators to be trained from two sets of unlabeled images from two domains; it can even achieve results comparable to or slightly better than a conditional GAN trained on fully labeled data.
Wasserstein Generative Adversarial Networks
TLDR
This work introduces a new algorithm named WGAN, an alternative to traditional GAN training that can improve the stability of learning, get rid of problems like mode collapse, and provide meaningful learning curves useful for debugging and hyperparameter searches.
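A minimal sketch of one critic update under the original weight-clipping scheme (function name and optimizer handling are illustrative; the clip value 0.01 follows the paper's default):

```python
import torch

def critic_step(critic, opt, real, fake, clip=0.01):
    """One WGAN critic update: maximize E[f(real)] - E[f(fake)], then
    clip weights to keep f roughly 1-Lipschitz. `fake` should be
    detached by the caller so the generator is not updated here."""
    opt.zero_grad()
    loss = critic(fake).mean() - critic(real).mean()  # minimizing = maximizing the gap
    loss.backward()
    opt.step()
    with torch.no_grad():
        for p in critic.parameters():
            p.clamp_(-clip, clip)
    return -loss.item()                               # Wasserstein estimate
```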