# Sampling Generative Networks

@article{White2016SamplingGN, title={Sampling Generative Networks}, author={Tom White}, journal={arXiv: Neural and Evolutionary Computing}, year={2016} }

We introduce several techniques for sampling and visualizing the latent spaces of generative models. Replacing linear interpolation with spherical linear interpolation prevents diverging from a model's prior distribution and produces sharper samples. J-Diagrams and MINE grids are introduced as visualizations of manifolds created by analogies and nearest neighbors. We demonstrate two new techniques for deriving attribute vectors: bias-corrected vectors with data replication and synthetic vectors…
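
The spherical linear interpolation (slerp) mentioned above can be sketched as follows. This is a minimal pure-Python rendering of the standard slerp formula — interpolating along a great-circle arc rather than a straight line, so intermediate points keep a norm typical of the prior — and not necessarily the paper's exact implementation:

```python
import math

def slerp(t, v0, v1):
    """Spherical linear interpolation between latent vectors v0 and v1, t in [0, 1]."""
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    # Angle between the two vectors, clamped to guard against rounding.
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)))
    theta = math.acos(dot)
    if theta < 1e-7:
        # Nearly parallel vectors: fall back to ordinary linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s = math.sin(theta)
    c0 = math.sin((1 - t) * theta) / s
    c1 = math.sin(t * theta) / s
    return [c0 * a + c1 * b for a, b in zip(v0, v1)]
```

For two orthogonal unit vectors, the slerp midpoint is again a unit vector, whereas the linear midpoint would have norm 1/√2 — this is exactly the "diverging from the prior" that the abstract refers to.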


## 108 Citations

Generalized Latent Variable Recovery for Generative Adversarial Networks

- Computer Science, Mathematics · ArXiv
- 2018

This work extends latent variable recovery techniques for Generative Adversarial Networks to latent spaces with a Gaussian prior, and demonstrates the technique's effectiveness.

Optimal transport maps for distribution preserving operations on latent spaces of Generative Models

- Computer Science, Mathematics · ICLR
- 2019

This paper proposes to use distribution matching transport maps to ensure that such latent space operations preserve the prior distribution, while minimally modifying the original operation.

Non-Parametric Priors For Generative Adversarial Networks

- Computer Science · ICML
- 2019

It is demonstrated that the designed prior helps improve image generation along any Euclidean straight line during interpolation, both qualitatively and quantitatively, without any additional training or architectural modifications.

Generative adversarial interpolative autoencoding: adversarial training on latent space interpolations encourage convex latent distributions

- Computer Science, Mathematics · ArXiv
- 2018

A neural network architecture based on the autoencoder and the Generative Adversarial Network is presented; it promotes a convex latent distribution by training adversarially on latent space interpolations so that they preserve realistic resemblance to the network inputs.

Optimal Transport Maps for Distribution Preserving Operations on Latent Spaces of Generative Models

- 2018

Generative models such as Variational Auto Encoders (VAEs) and Generative Adversarial Networks (GANs) are typically trained for a fixed prior distribution in the latent space, such as uniform or…

Mixture Density Generative Adversarial Networks

- Computer Science, Mathematics · 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2019

The ability to avoid mode collapse and discover all modes is demonstrated, along with superior quality of the generated images as measured by the Fréchet Inception Distance (FID), achieving the lowest FID among all baselines.

NeurInt: Learning to Interpolate through Neural ODEs

- Computer Science · ArXiv
- 2021

This work proposes a novel generative model that learns a flexible non-parametric prior over interpolation trajectories, conditioned on a pair of source and target images, using latent second-order Neural Ordinary Differential Equations.

On Latent Distributions Without Finite Mean in Generative Models

- Computer Science, Mathematics · ArXiv
- 2018

This work examines the phenomena that arise when decoding linear interpolations between two random latent vectors: such interpolations sample regions of latent space close to the origin, causing a distribution mismatch, and it is shown via the Central Limit Theorem that this region is almost never sampled during training.
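
The mismatch this snippet describes can be illustrated with a short simulation (the dimensionality and seed are arbitrary choices for the demonstration, not from the paper): the norm of a d-dimensional standard Gaussian concentrates near √d, while the linear midpoint of two such vectors concentrates near √(d/2), noticeably closer to the origin.

```python
import math
import random

random.seed(0)
d = 512  # typical latent dimensionality, chosen for illustration

def norm(v):
    return math.sqrt(sum(x * x for x in v))

# Two independent draws from the standard Gaussian prior.
z0 = [random.gauss(0, 1) for _ in range(d)]
z1 = [random.gauss(0, 1) for _ in range(d)]

# Their linear midpoint is distributed N(0, I/2): its norm concentrates
# near sqrt(d/2), a region the training process almost never samples.
mid = [(a + b) / 2 for a, b in zip(z0, z1)]

print(norm(z0), norm(z1), norm(mid))  # endpoints near sqrt(d), midpoint near sqrt(d/2)
```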

Generating In-Between Images Through Learned Latent Space Representation Using Variational Autoencoders

- Computer Science · IEEE Access
- 2020

It is demonstrated that the proposed method for image interpolation based on latent representations outperforms both pixel-based methods and a conventional variational autoencoder, with particular improvements on non-successive images.

Evolutionary Latent Space Exploration of Generative Adversarial Networks

- Computer Science · EvoApplications
- 2020

This paper focuses on generating sets of diverse examples by searching the latent space with Genetic Algorithms and MAP-Elites, and compares the implemented approaches with the traditional approach.

## References

Showing 10 of 16 references.

Generative Adversarial Nets

- Computer Science · NIPS
- 2014

We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a…

Adversarially Learned Inference

- Computer Science, Mathematics · ICLR
- 2017

The adversarially learned inference (ALI) model is introduced, which jointly learns a generation network and an inference network using an adversarial process; the usefulness of the learned representations is confirmed by performance competitive with the state of the art on semi-supervised SVHN and CIFAR-10 tasks.

Discriminative Regularization for Generative Models

- Mathematics, Computer Science · ArXiv
- 2016

It is shown that enhancing the objective function of the variational autoencoder, a popular generative model, with a discriminative regularization term leads to samples that are clearer and have higher visual quality than those of the standard variational autoencoder.

Autoencoding beyond pixels using a learned similarity metric

- Computer Science, Mathematics · ICML
- 2016

An autoencoder that leverages learned representations to better measure similarities in data space is presented and it is shown that the method learns an embedding in which high-level abstract visual features (e.g. wearing glasses) can be modified using simple arithmetic.
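
The "simple arithmetic" on latent embeddings mentioned in this snippet — and the attribute vectors in the abstract above — are commonly computed as a mean difference between encoded examples with and without the attribute. The sketch below illustrates that idea with toy vectors; the function names and data are illustrative, not the paper's actual implementation (which additionally applies bias correction via data replication):

```python
def attribute_vector(latents_with, latents_without):
    """Mean-difference attribute vector between two groups of latent codes."""
    dim = len(latents_with[0])

    def mean(group):
        return [sum(v[i] for v in group) / len(group) for i in range(dim)]

    m_with, m_without = mean(latents_with), mean(latents_without)
    return [a - b for a, b in zip(m_with, m_without)]

def apply_attribute(z, attr, strength=1.0):
    """Shift a latent code along the attribute direction (e.g. 'add glasses')."""
    return [a + strength * b for a, b in zip(z, attr)]
```

For example, with toy 2-D latents, `attribute_vector([[1.0, 1.0], [3.0, 1.0]], [[0.0, 0.0], [2.0, 0.0]])` yields `[1.0, 1.0]`, which can then be added (scaled by `strength`) to any latent code before decoding.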

Auto-Encoding Variational Bayes

- Mathematics, Computer Science · ICLR
- 2014

A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.

Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks

- Computer Science, Mathematics · ICLR
- 2016

This work introduces a class of CNNs called deep convolutional generative adversarial networks (DCGANs), that have certain architectural constraints, and demonstrates that they are a strong candidate for unsupervised learning.

PANDA: Pose Aligned Networks for Deep Attribute Modeling

- Computer Science · 2014 IEEE Conference on Computer Vision and Pattern Recognition
- 2014

A new method combining part-based models and deep learning is proposed: pose-normalized CNNs are trained to infer human attributes from images of people under large variations in viewpoint, pose, appearance, articulation, and occlusion.

Deep Visual Analogy-Making

- Computer Science · NIPS
- 2015

A novel deep network trained end-to-end to perform visual analogy making, which is the task of transforming a query image according to an example pair of related images, is developed.

Automatic Chemical Design Using a Data-Driven Continuous Representation of Molecules

- Computer Science, Physics · ACS Central Science
- 2018

We report a method to convert discrete representations of molecules to and from a multidimensional continuous representation. This model allows us to generate new molecules for efficient exploration…

Learning SURF Cascade for Fast and Accurate Object Detection

- Computer Science · 2013 IEEE Conference on Computer Vision and Pattern Recognition
- 2013

A novel learning framework, derived from the well-known Viola-Jones (VJ) framework, for training a boosted-cascade object detector from large-scale datasets is presented; it can train object detectors from billions of negative samples within one hour, even on personal computers.