Linking Generative Adversarial Learning and Binary Classification

@article{Balsubramani2017LinkingGA,
  title={Linking Generative Adversarial Learning and Binary Classification},
  author={A. Balsubramani},
  journal={ArXiv},
  year={2017},
  volume={abs/1709.01509}
}
In this note, we point out a basic link between generative adversarial (GA) training and binary classification: any sufficiently powerful discriminator essentially computes an (f-)divergence between the real and generated samples. This result, repeatedly re-derived in decision theory, has implications for GA Networks (GANs), providing an alternative perspective on training f-GANs by designing the discriminator loss function.
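
As a brief sketch of the claimed link (the notation here is introduced for illustration, not taken verbatim from the note): let $P$ denote the real-data distribution, $Q$ the generator's distribution, $f$ a convex function with $f(1) = 0$, and $f^*$ its convex conjugate. The variational characterization underlying f-GANs (Nguyen et al., referenced below) states

\[
D_f(P \,\|\, Q) \;=\; \sup_{T} \; \mathbb{E}_{x \sim P}[T(x)] \;-\; \mathbb{E}_{x \sim Q}[f^*(T(x))],
\]

where the supremum runs over discriminators $T$. Conversely, label real samples $+1$ and generated samples $-1$ with equal priors, and write the risk of a discriminator $h$ under a loss $\ell$ as

\[
R_\ell(h) \;=\; \tfrac{1}{2}\, \mathbb{E}_{x \sim P}\big[\ell(+1, h(x))\big] \;+\; \tfrac{1}{2}\, \mathbb{E}_{x \sim Q}\big[\ell(-1, h(x))\big].
\]

Classical decision-theoretic results (e.g., Reid and Williamson, referenced below) show that $\inf_h R_\ell(h) = c_\ell - D_{f_\ell}(P \,\|\, Q)$ for a constant $c_\ell$ and a convex $f_\ell$ determined by $\ell$; for the logistic loss of the original GAN, $D_{f_\ell}$ is the Jensen-Shannon divergence up to scaling and an additive constant. A near-Bayes-optimal discriminator trained with loss $\ell$ therefore implicitly estimates $D_{f_\ell}(P \,\|\, Q)$, and choosing $\ell$ amounts to choosing which f-divergence the generator is trained to minimize.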

References

BEGAN: Boundary Equilibrium Generative Adversarial Networks
This work proposes a new equilibrium enforcing method paired with a loss derived from the Wasserstein distance for training auto-encoder based Generative Adversarial Networks, which provides a new approximate convergence measure, fast and stable training, and high visual quality.
Generative Adversarial Nets
We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a discriminative model D that estimates the probability that a sample came from the training data rather than G.
f-GAN: Training Generative Neural Samplers using Variational Divergence Minimization
It is shown that any f-divergence can be used for training generative neural samplers, and the benefits of various choices of divergence functions on training complexity and the quality of the obtained generative models are discussed.
Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks
This work introduces a class of CNNs called deep convolutional generative adversarial networks (DCGANs) that have certain architectural constraints, and demonstrates that they are a strong candidate for unsupervised learning.
Information, Divergence and Risk for Binary Experiments
The new viewpoint also illuminates existing algorithms: it provides a new derivation of Support Vector Machines in terms of divergences and relates maximum mean discrepancy to Fisher linear discriminants.
On Surrogate Loss Functions and f-Divergences
This work considers an elaboration of binary classification in which the covariates are not available directly but are transformed by a dimensionality-reducing quantizer Q, and makes it possible to pick out the (strict) subset of surrogate loss functions that yield Bayes consistency for joint estimation of the discriminant function and the quantizer.
On Divergences and Informations in Statistics and Information Theory
  • F. Liese, I. Vajda
  • IEEE Transactions on Information Theory
  • 2006
The paper deals with the f-divergences of Csiszár generalizing the discrimination information of Kullback, the total variation distance, the Hellinger divergence, and the Pearson divergence.
Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization
We develop and analyze M-estimation methods for divergence functionals and the likelihood ratios of two probability distributions. Our method is based on a nonasymptotic variational characterization of f-divergences.