Score-Based Generative Modeling through Stochastic Differential Equations
- Yang Song, Jascha Narain Sohl-Dickstein, Diederik P. Kingma, Abhishek Kumar, S. Ermon, Ben Poole
- Computer Science · ICLR
- 26 November 2020
This work presents a stochastic differential equation (SDE) that smoothly transforms a complex data distribution into a known prior distribution by slowly injecting noise, and a corresponding reverse-time SDE that transforms the prior distribution back into the data distribution by slowly removing the noise.
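As a rough illustration (not the paper's code), the reverse-time SDE can be simulated with an Euler-Maruyama loop once a learned score model approximates ∇ₓ log pₜ(x). Here `score_model`, `g`, and `prior_std` are hypothetical placeholders, and the forward drift is taken to be zero for brevity:

```python
import torch

# A minimal Euler-Maruyama sketch of reverse-time SDE sampling, assuming a
# forward SDE dx = g(t) dw with zero drift. `score_model(x, t)` stands in
# for a learned approximation of grad_x log p_t(x).
def reverse_sde_sample(score_model, g, prior_std, shape, n_steps=1000):
    ts = torch.linspace(1.0, 1e-3, n_steps)
    dt = ts[0] - ts[1]                   # positive step size; we integrate backward in time
    x = prior_std * torch.randn(shape)   # start from the known prior
    for t in ts:
        g2 = g(t) ** 2
        # integrating backward flips the drift sign: x moves along the score
        x = x + g2 * score_model(x, t) * dt + (g2 * dt) ** 0.5 * torch.randn_like(x)
    return x
```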
Generative Modeling by Estimating Gradients of the Data Distribution
A new generative model where samples are produced via Langevin dynamics using gradients of the data distribution estimated with score matching, which allows flexible model architectures, requires no sampling during training or the use of adversarial methods, and provides a learning objective that can be used for principled model comparisons.
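A minimal sketch of the Langevin dynamics sampler this describes, assuming a hypothetical learned score estimator `score_model`; the paper anneals the step size over a sequence of noise levels, which is omitted here:

```python
import torch

# Langevin dynamics: each step moves x along the estimated score of the
# data distribution and adds Gaussian noise scaled by the step size.
def langevin_sample(score_model, x, step_size=1e-4, n_steps=100):
    for _ in range(n_steps):
        noise = torch.randn_like(x)
        x = x + 0.5 * step_size * score_model(x) + (step_size ** 0.5) * noise
    return x
```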
PixelDefend: Leveraging Generative Models to Understand and Defend against Adversarial Examples
Adversarial perturbations of normal images are usually imperceptible to humans, but they can seriously confuse state-of-the-art machine learning models. What makes them so special in the eyes of…
Improved Techniques for Training Score-Based Generative Models
This work provides a new theoretical analysis of learning and sampling from score models in high dimensional spaces, explaining existing failure modes and motivating new solutions that generalize across datasets.
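One of the concrete fixes proposed in this line of work is a geometric progression of noise scales, with the largest scale chosen on the order of the maximum pairwise distance in the training data; a minimal sketch (the function name is an assumption):

```python
import math
import torch

# A geometric sequence of noise scales from sigma_max (roughly the maximum
# pairwise distance in the training data) down to a small sigma_min.
def geometric_noise_scales(sigma_min, sigma_max, n_scales):
    return torch.logspace(math.log10(sigma_max), math.log10(sigma_min), n_scales)
```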
Efficient Graph Generation with Graph Recurrent Attention Networks
A new family of efficient and expressive deep generative models of graphs, called Graph Recurrent Attention Networks (GRANs), which better captures the auto-regressive conditioning between the already-generated and to-be-generated parts of the graph using Graph Neural Networks (GNNs) with attention.
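A highly simplified sketch of the block-wise generation loop this describes; `gnn_attention` is a hypothetical stand-in for the attention-equipped GNN, and GRAN's per-edge mixture of Bernoullis is collapsed to a single Bernoulli here:

```python
import torch

# Nodes are added in blocks; at each step the GNN attends over the partial
# graph and scores candidate edges linking the new block to everything
# generated so far (self-loops ignored for brevity).
def generate_graph(gnn_attention, n_nodes, block_size=2):
    adj = torch.zeros(0, 0)
    while adj.shape[0] < n_nodes:
        b = min(block_size, n_nodes - adj.shape[0])
        n_old = adj.shape[0]
        grown = torch.zeros(n_old + b, n_old + b)
        grown[:n_old, :n_old] = adj
        probs = gnn_attention(grown, n_old, b)   # shape (b, n_old + b), values in [0, 1]
        edges = torch.bernoulli(probs)
        grown[n_old:, :] = edges                 # wire the new block into the graph
        grown[:, n_old:] = edges.t()             # keep the adjacency symmetric
        adj = grown
    return adj
```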
Constructing Unrestricted Adversarial Examples with Generative Models
The empirical results on the MNIST, SVHN, and CelebA datasets show that unrestricted adversarial examples can bypass strong adversarial training and certified defense methods designed for traditional adversarial attacks.
Maximum Likelihood Training of Score-Based Diffusion Models
It is shown that for a specific weighting scheme, the objective upper bounds the negative log-likelihood, thus enabling approximate maximum likelihood training of score-based diffusion models.
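In the notation assumed here (not copied from the paper): with xₜ produced by a forward SDE with diffusion coefficient g(t), transition kernel p₀ₜ(xₜ | x₀), and score model s_θ, the claim is roughly that the weighting λ(t) = g(t)² yields

$$
-\,\mathbb{E}_{p(\mathbf{x}_0)}\big[\log p_\theta(\mathbf{x}_0)\big]
\;\le\;
\frac{1}{2}\int_0^T \mathbb{E}\Big[\, g(t)^2\,\big\|\mathbf{s}_\theta(\mathbf{x}_t, t)-\nabla_{\mathbf{x}_t}\log p_{0t}(\mathbf{x}_t \mid \mathbf{x}_0)\big\|_2^2 \,\Big]\, dt \;+\; C,
$$

where C does not depend on θ, so minimizing this weighted score-matching objective approximately maximizes likelihood.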
Sliced Score Matching: A Scalable Approach to Density and Score Estimation
It is demonstrated that sliced score matching can learn deep energy-based models effectively, and can produce accurate score estimates for applications such as variational inference with implicit distributions and training Wasserstein Auto-Encoders.
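A minimal sketch of the sliced score matching objective: random projections v reduce the intractable trace of the score's Jacobian to a cheap vector-Jacobian product. `score_model` is a hypothetical placeholder:

```python
import torch

# Sliced score matching loss: E_v E_x [ v^T (ds/dx) v + 1/2 (v^T s(x))^2 ],
# where the Jacobian term is computed via autograd instead of a full trace.
def sliced_score_matching_loss(score_model, x):
    x = x.detach().requires_grad_(True)
    v = torch.randn_like(x)
    s = score_model(x)                                 # estimated score, same shape as x
    sv = (s * v).sum()                                 # v^T s(x), summed over the batch
    grad_sv = torch.autograd.grad(sv, x, create_graph=True)[0]
    jacobian_term = (grad_sv * v).sum(dim=-1)          # v^T (ds/dx) v
    norm_term = 0.5 * (s * v).sum(dim=-1) ** 2         # 1/2 (v^T s(x))^2
    return (jacobian_term + norm_term).mean()
```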
Training Deep Neural Networks via Direct Loss Minimization
This paper proposes a direct loss minimization approach to train deep neural networks, which provably minimizes the application-specific loss function, and develops a novel dynamic programming algorithm that can efficiently compute the weight updates.
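In rough terms (notation assumed here, up to a choice of sign for the perturbation): for a scoring function F(x, y; w) with prediction y_w = argmax_y F(x, y; w) and task loss L, the direct loss gradient underlying this approach is

$$
\nabla_{w}\,\mathbb{E}\big[L(y, y_{w})\big]
= \lim_{\epsilon \to 0} \frac{1}{\epsilon}\,
\mathbb{E}\big[\nabla_{w} F(x, y_{\text{direct}}; w) - \nabla_{w} F(x, y_{w}; w)\big],
\qquad
y_{\text{direct}} = \arg\max_{\hat y}\ \big\{ F(x, \hat y; w) + \epsilon\, L(y, \hat y) \big\},
$$

and the dynamic programming algorithm is what makes the loss-augmented argmax defining y_direct tractable for structured losses.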
SDEdit: Image Synthesis and Editing with Stochastic Differential Equations
A new image editing and synthesis framework, SDEdit, based on a recent generative model using stochastic differential equations (SDEs), which achieves strong performance on a wide range of applications, including image synthesis and editing guided by stroke paintings and image compositing.
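A minimal sketch of an SDEdit-style procedure: perturb the user's guide image (e.g. a stroke painting) with noise up to an intermediate time t₀, then run the reverse-time SDE from t₀ back to 0. `reverse_sde_sample_from` and `sigma` are hypothetical placeholders, not the paper's API:

```python
import torch

# Hijack the SDE at time t0: the noise washes out low-level guide detail
# while the reverse diffusion restores realism.
def sdedit(reverse_sde_sample_from, sigma, guide, t0=0.5):
    x_t0 = guide + sigma(t0) * torch.randn_like(guide)  # noised guide at time t0
    return reverse_sde_sample_from(x_t0, t0)            # denoise back to time 0
```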