TinyGAN: Distilling BigGAN for Conditional Image Generation
@inproceedings{Chang2020TinyGANDB,
  title     = {TinyGAN: Distilling BigGAN for Conditional Image Generation},
  author    = {Ting-Yun Chang and Chi-Jen Lu},
  booktitle = {Asian Conference on Computer Vision},
  year      = {2020}
}
Generative Adversarial Networks (GANs) have become a powerful approach for generative image modeling. However, GANs are notorious for their training instability, especially on large-scale, complex datasets. While the recent work of BigGAN has significantly improved the quality of image generation on ImageNet, it requires a huge model, making it hard to deploy on resource-constrained devices. To reduce the model size, we propose a black-box knowledge distillation framework for compressing GANs…
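To make the black-box setting concrete, one distillation step can be sketched as follows: the student sees only (noise, class) inputs and the teacher's corresponding output images, never the teacher's weights. This is an illustrative PyTorch sketch, not the paper's code; `teacher_generate` is a hypothetical stand-in for querying BigGAN, and the full framework combines this pixel-level term with further distillation losses.

```python
import torch
import torch.nn as nn

def distill_step(student: nn.Module, teacher_generate, optimizer,
                 batch_size=32, z_dim=128, n_classes=1000):
    # Sample the inputs shared by teacher and student.
    z = torch.randn(batch_size, z_dim)
    y = torch.randint(0, n_classes, (batch_size,))
    # Black-box query: we only observe the teacher's output images.
    with torch.no_grad():
        target = teacher_generate(z, y)
    # Train the student to mimic the teacher on the same (z, y) pairs.
    fake = student(z, y)
    loss = nn.functional.l1_loss(fake, target)  # pixel-level distillation loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```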
14 Citations
Online Multi-Granularity Distillation for GAN Compression
- Computer Science · 2021 IEEE/CVF International Conference on Computer Vision (ICCV)
- 2021
This work proposes a novel online multi-granularity distillation (OMGD) scheme to obtain lightweight GANs, which contributes to generating high-fidelity images with low computational demands, and reveals that OMGD provides a feasible solution for deploying real-time image translation on resource-constrained devices.
Microdosing: Knowledge Distillation for GAN based Compression
- Computer Science · BMVC
- 2021
This paper demonstrates how to leverage knowledge distillation to obtain equally capable image decoders at a fraction of the original number of parameters and investigates several aspects of the solution including sequence specialization with side information for image coding.
Content-Aware GAN Compression
- Computer Science · 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2021
A novel content-aware method is proposed to guide both the pruning and the distillation of unconditional GANs; the compressed model forms a smoother and better-disentangled latent manifold, making it more effective for image editing.
PPCD-GAN: Progressive Pruning and Class-Aware Distillation for Large-Scale Conditional GANs Compression
- Computer Science · 2022 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)
- 2022
A gradually shrinking GAN (PPCD-GAN) is proposed by introducing a progressive pruning residual block (PP-Res) and class-aware distillation; the method enhances training stability by transferring rich knowledge from a well-trained teacher model through instructive attention maps.
SphericGAN: Semi-supervised Hyper-spherical Generative Adversarial Networks for Fine-grained Image Synthesis
- Computer Science · 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
- 2022
This work proposes a semi-supervised hyper-spherical GAN for class-conditional fine-grained image generation, referred to as SphericGAN, which achieves state-of-the-art performance in synthesizing high-fidelity images with precise class semantics.
The Hidden Tasks of Generative Adversarial Networks: An Alternative Perspective on GAN Training
- Computer Science · ArXiv
- 2021
It is shown that the training step for a GAN generator decomposes into two implicit sub-problems, whose solutions are used as targets to update the generator via least-squares regression, regardless of the main loss specified to train the network.
Mind the Gap in Distilling StyleGANs
- Computer Science · ECCV
- 2022
This paper provides a comprehensive study of distilling from the popular StyleGAN-like architecture and proposes a novel initialization strategy for the student model that preserves output consistency with the teacher to the greatest extent possible.
StyleGAN-XL: Scaling StyleGAN to Large Diverse Datasets
- Computer Science · SIGGRAPH
- 2022
The final model, StyleGAN-XL, sets a new state-of-the-art on large-scale image synthesis and is the first to generate images at a resolution of 1024×1024 at such a dataset scale.
SKDCGN: Source-free Knowledge Distillation of Counterfactual Generative Networks using cGANs
- Computer Science · ECCV Workshops
- 2022
This work addresses the question of whether the knowledge embedded in a pre-trained Counterfactual Generative Network (CGN) can be used to train a lower-capacity model, assuming only black-box access to the pre-trained CGN, and proposes SKDCGN, which attempts this transfer using knowledge distillation (KD).
References
Showing 1–10 of 39 references
Compressing GANs using Knowledge Distillation
- Computer Science · ArXiv
- 2019
Training an over-parameterized GAN followed by the proposed compression scheme yields a high-quality generative model with a small number of parameters; it is conjectured that this is partially owing to the optimization landscape of over-parameterized GANs, which allows efficient training using alternating gradient descent.
Improved Training of Wasserstein GANs
- Computer Science · NIPS
- 2017
This work proposes an alternative to clipping weights: penalizing the norm of the gradient of the critic with respect to its input, which performs better than standard WGAN and enables stable training of a wide variety of GAN architectures with almost no hyperparameter tuning.
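The penalty itself is compact to implement. A PyTorch sketch under the usual assumptions (4-D image batches, a hypothetical `critic` module):

```python
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # WGAN-GP: sample points on straight lines between real and fake data
    # and penalize the critic's input-gradient norm for deviating from 1.
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)  # per-sample mix
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads, = torch.autograd.grad(
        outputs=scores, inputs=interp,
        grad_outputs=torch.ones_like(scores),
        create_graph=True,  # allow backprop through the penalty itself
    )
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()
```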
Mode Regularized Generative Adversarial Networks
- Computer Science · ICLR
- 2017
This work introduces several ways of regularizing the objective that can dramatically stabilize the training of GAN models, and shows that these regularizers help distribute probability mass fairly across the modes of the data-generating distribution during the early phases of training, thus providing a unified solution to the missing-modes problem.
Large Scale GAN Training for High Fidelity Natural Image Synthesis
- Computer Science · ICLR
- 2019
It is found that applying orthogonal regularization to the generator renders it amenable to a simple "truncation trick," allowing fine control over the trade-off between sample fidelity and variety by reducing the variance of the generator's input.
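The truncation trick itself is easy to sketch: draw the latent from a standard normal and resample any entry whose magnitude exceeds a threshold, so lowering the threshold trades variety for fidelity. An illustrative implementation (not the paper's code):

```python
import torch

def truncated_noise(batch_size, z_dim, threshold=0.5):
    # Truncation trick: z ~ N(0, I), with out-of-range entries resampled
    # until every entry lies within [-threshold, threshold].
    z = torch.randn(batch_size, z_dim)
    mask = z.abs() > threshold
    while mask.any():
        z[mask] = torch.randn(int(mask.sum()))  # resample offending entries
        mask = z.abs() > threshold
    return z
```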
Self-Attention Generative Adversarial Networks
- Computer Science · ICML
- 2019
The proposed SAGAN achieves state-of-the-art results, boosting the best published Inception score from 36.8 to 52.52 and reducing the Fréchet Inception distance from 27.62 to 18.65 on the challenging ImageNet dataset.
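For reference, the two metrics quoted here have standard definitions (the usual formulations, not something specific to this entry): the Inception Score uses a pretrained classifier's predictive distribution p(y|x) over generated samples, and the Fréchet Inception Distance compares Gaussian fits (μ_r, Σ_r) and (μ_g, Σ_g) to Inception features of real and generated images.

```latex
\mathrm{IS} = \exp\Big( \mathbb{E}_{x \sim p_g}\, D_{\mathrm{KL}}\big( p(y \mid x) \,\Vert\, p(y) \big) \Big)
\qquad
\mathrm{FID} = \lVert \mu_r - \mu_g \rVert_2^2 + \operatorname{Tr}\big( \Sigma_r + \Sigma_g - 2\,(\Sigma_r \Sigma_g)^{1/2} \big)
```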
Progressive Growing of GANs for Improved Quality, Stability, and Variation
- Computer Science · ICLR
- 2018
A new training methodology for generative adversarial networks is described: starting from a low resolution and adding new layers that model increasingly fine details as training progresses, allowing for images of unprecedented quality.
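The key mechanism is a fade-in: while a new higher-resolution block is being added, its output is blended with a plain upsampling of the previous stage, with the blend weight ramped from 0 to 1. A rough PyTorch sketch, where `new_block` and the `to_rgb` modules are hypothetical placeholders:

```python
import torch
import torch.nn.functional as F

def faded_output(x, old_to_rgb, new_block, new_to_rgb, alpha):
    # Fade-in during progressive growing: blend the new high-resolution
    # branch with a plain upsampling of the previous stage's output.
    # `new_block` is assumed to upsample x internally (e.g. upsample + conv).
    coarse = F.interpolate(old_to_rgb(x), scale_factor=2, mode='nearest')
    fine = new_to_rgb(new_block(x))
    return alpha * fine + (1.0 - alpha) * coarse  # alpha ramps 0 -> 1
```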
Image-to-Image Translation with Conditional Adversarial Networks
- Computer Science · 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2017
Conditional adversarial networks are investigated as a general-purpose solution to image-to-image translation problems and it is demonstrated that this approach is effective at synthesizing photos from label maps, reconstructing objects from edge maps, and colorizing images, among other tasks.
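Concretely, the pix2pix objective pairs a conditional adversarial loss with an L1 reconstruction term, with x the input image, y the target, and z noise:

```latex
\mathcal{L}_{cGAN}(G, D) = \mathbb{E}_{x,y}\big[\log D(x, y)\big] + \mathbb{E}_{x,z}\big[\log\big(1 - D(x, G(x, z))\big)\big]
\qquad
G^{*} = \arg\min_G \max_D\; \mathcal{L}_{cGAN}(G, D) + \lambda\, \mathbb{E}_{x,y,z}\big[\lVert y - G(x, z)\rVert_1\big]
```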
High-Resolution Image Synthesis and Semantic Manipulation with Conditional GANs
- Computer Science · 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition
- 2018
A new method for synthesizing high-resolution photo-realistic images from semantic label maps using conditional generative adversarial networks (conditional GANs) is presented, which significantly outperforms existing methods, advancing both the quality and the resolution of deep image synthesis and editing.
Energy-based Generative Adversarial Network
- Computer Science · ArXiv
- 2016
We introduce the "Energy-based Generative Adversarial Network" model (EBGAN) which views the discriminator as an energy function that attributes low energies to the regions near the data manifold and…