Corpus ID: 209318141

Evaluating Lossy Compression Rates of Deep Generative Models

@inproceedings{Huang2020EvaluatingLC,
  title={Evaluating Lossy Compression Rates of Deep Generative Models},
  author={Sicong Huang and Alireza Makhzani and Yanshuai Cao and Roger B. Grosse},
  booktitle={ICML},
  year={2020}
}
The field of deep generative modeling has succeeded in producing astonishingly realistic-seeming images and audio, but quantitative evaluation remains a challenge. Log-likelihood is an appealing metric due to its grounding in statistics and information theory, but it can be challenging to estimate for implicit generative models, and scalar-valued metrics give an incomplete picture of a model's quality. In this work, we propose to use rate distortion (RD) curves to evaluate and compare deep generative models.
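As a point of reference for the quantities the abstract alludes to, here is a minimal sketch of the standard variational rate-distortion formulation for a latent-variable generative model p(z)p(x|z); the encoder q(z|x), the trade-off weight β, and the distortion measure d are generic notation used for illustration, not necessarily the estimator developed in the paper itself.

\[
R(D) \;=\; \min_{q(z \mid x)} \; I(x; z)
\quad \text{subject to} \quad \mathbb{E}\!\left[ d\big(x, \hat{x}\big) \right] \le D,
\]
with the usual variational upper bounds
\[
\text{rate: } \mathcal{R} = \mathbb{E}_{x}\, \mathrm{KL}\!\left( q(z \mid x) \,\|\, p(z) \right),
\qquad
\text{distortion: } \mathcal{D} = -\, \mathbb{E}_{x}\, \mathbb{E}_{q(z \mid x)}\!\left[ \log p(x \mid z) \right].
\]
A curve is traced out by minimizing \(\mathcal{D} + \beta \mathcal{R}\) over \(q\) for a sweep of \(\beta > 0\); at \(\beta = 1\) this objective is the negative ELBO, so a single log-likelihood-style bound corresponds to one point on the curve, whereas the full RD curve reports the entire trade-off.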
5 Citations
  • Denoising Diffusion Probabilistic Models (47 citations)
  • All in the Exponential Family: Bregman Duality in Thermodynamic Variational Inference (3 citations)
  • Likelihood Ratio Exponential Families (1 citation)
  • Annealed Flow Transport Monte Carlo

References

Showing 1-10 of 53 references
  • On the Quantitative Analysis of Decoder-Based Generative Models (175 citations)
  • Assessing Generative Models via Precision and Recall (133 citations)
  • Practical Lossless Compression with Latent Variables using Bits Back Coding (33 citations)
  • An empirical study on evaluation metrics of generative adversarial networks (103 citations)
  • Large Scale GAN Training for High Fidelity Natural Image Synthesis (1,674 citations)
  • Improved Techniques for Training GANs (4,284 citations; Highly Influential)
  • A note on the evaluation of generative models (695 citations; Highly Influential)
  • Are GANs Created Equal? A Large-Scale Study (534 citations)
  • Variational image compression with a scale hyperprior (313 citations)