Corpus ID: 160010126

Compression with Flows via Local Bits-Back Coding

@article{Ho2019CompressionWF,
  title={Compression with Flows via Local Bits-Back Coding},
  author={Jonathan Ho and Evan Lohn and Pieter Abbeel},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.08500}
}
Likelihood-based generative models are the backbones of lossless compression, due to the guaranteed existence of codes with lengths close to negative log likelihood. However, computationally efficient coding algorithms of this kind are known for autoregressive models and variational autoencoders, but not for general types of flow models. To fill in this gap, we introduce local bits-back coding, a new compression technique compatible with flow models. We present efficient algorithms that instantiate our technique for many popular types of flows, and we demonstrate that our algorithms closely achieve theoretical codelengths for state-of-the-art flow models on high-dimensional data.
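The opening claim rests on a standard identity. As a minimal sketch in our own notation (not the paper's), a flow $z = f(x)$ with prior density $p(z)$ yields exact log-likelihoods via the change-of-variables formula, and an ideal entropy coder spends essentially that many bits on data discretized to bins of volume $\delta$:

```latex
% Exact likelihood of an invertible flow z = f(x):
\log p(x) = \log p\bigl(f(x)\bigr) + \log\left|\det \frac{\partial f}{\partial x}\right|
% Ideal codelength for x discretized to bins of volume \delta,
% since the bin mass is approximately p(x)\,\delta:
N(x) \approx -\log_2\bigl(p(x)\,\delta\bigr) = -\log_2 p(x) - \log_2 \delta \ \text{bits}
```

The difficulty the paper addresses is computational: naively discretizing $z$ does not preserve the Jacobian term, so realizing this codelength efficiently for flows requires a dedicated scheme.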

Citations

Insights from Generative Modeling for Neural Video Compression
TLDR
This work presents recent neural video codecs as instances of a generalized stochastic temporal autoregressive transform, and proposes several architectures that yield state-of-the-art video compression performance on full-resolution video and discusses their tradeoffs and ablations.
Discrete Denoising Flows
TLDR
This paper introduces a new discrete flow-based model for categorical random variables: Discrete Denoising Flows (DDFs), which can be locally trained without introducing gradient bias.
HiLLoC: Lossless Image Compression with Hierarchical Latent Variable Models
TLDR
Fully convolutional VAE models trained on 32x32 ImageNet generalize well, not just to 64x64 but also to far larger photographs, with no changes to the model, achieving state-of-the-art compression of full-size ImageNet images.
OSOA: One-Shot Online Adaptation of Deep Generative Models for Lossless Compression
TLDR
This work proposes a novel setting that starts from a pretrained deep generative model and compresses the data batches while adapting the model with a dynamical system for only one epoch; it formalises this setting as One-Shot Online Adaptation (OSOA) of deep generative models for lossless compression.
iVPF: Numerical Invertible Volume Preserving Flow for Efficient Lossless Compression
TLDR
This paper proposes the Numerical Invertible Volume Preserving Flow (iVPF), derived from general volume-preserving flows; it shows that an error-free bijective mapping is possible and presents a lossless compression algorithm based on iVPF.
iFlow: Numerically Invertible Flows for Efficient Lossless Compression via a Uniform Coder
TLDR
This paper introduces iFlow, a new method for efficient lossless compression with normalizing flows that achieves state-of-the-art compression ratios and is 5× faster than other high-performance schemes.
Variational Diffusion Models
TLDR
This paper introduces a family of diffusion-based generative models that obtain state-of-the-art likelihoods on standard image density estimation benchmarks, shows how to use the model as part of a bits-back compression scheme, and demonstrates lossless compression rates close to the theoretical optimum.
An Introduction to Neural Data Compression
TLDR
This introduction hopes to fill in the necessary background by reviewing basic coding topics such as entropy coding and rate-distortion theory, covering related machine learning ideas such as bits-back coding and perceptual metrics, and providing a guide through representative works in the literature so far.
Lossless Image Compression Using a Multi-scale Progressive Statistical Model
TLDR
This paper develops a flexible mechanism in which the processing order of the pixels can be easily adjusted; the resulting method outperforms state-of-the-art lossless image compression methods on two large benchmark datasets by a significant margin.

References

Showing 1-10 of 48 references
Practical Lossless Compression with Latent Variables using Bits Back Coding
TLDR
This paper presents Bits Back with ANS (BB-ANS), a scheme for performing lossless compression with latent variable models at a near-optimal rate, and concludes that, with a sufficiently high-quality generative model, the scheme could achieve substantial improvements in compression rate with acceptable running time. (A toy sketch of the bits-back bookkeeping appears after this reference list.)
Variational image compression with a scale hyperprior
TLDR
This paper demonstrates that the model leads to state-of-the-art image compression when visual quality is measured with the popular MS-SSIM index, and that it yields rate-distortion performance surpassing published ANN-based methods under a more traditional metric based on squared error (PSNR).
Bit-Swap: Recursive Bits-Back Coding for Lossless Compression with Hierarchical Latent Variables
TLDR
This paper proposes Bit-Swap, a new compression scheme that generalizes BB-ANS to hierarchical latent variable models with Markov chain structure, achieving strictly better theoretical codelengths and empirically superior lossless compression rates compared with existing techniques.
Lossy Image Compression with Compressive Autoencoders
TLDR
This paper shows that minimal changes to the loss are sufficient to train deep autoencoders that are competitive with JPEG 2000, outperform recently proposed RNN-based approaches, and, thanks to a sub-pixel architecture, are computationally efficient enough for high-resolution images.
Glow: Generative Flow with Invertible 1x1 Convolutions
TLDR
This paper proposes Glow, a simple type of generative flow using an invertible 1x1 convolution, demonstrating that a generative model optimized towards the plain log-likelihood objective is capable of efficient, realistic-looking synthesis and manipulation of large images. (A numerical sketch of the invertible 1x1 convolution appears after this reference list.)
Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design
TLDR
This paper proposes Flow++, a new flow-based model that is the state-of-the-art non-autoregressive model for unconditional density estimation on standard image benchmarks, and begins to close the significant performance gap that has so far existed between autoregressive and flow-based models.
Variational learning and bits-back coding: an information-theoretic view to Bayesian learning
TLDR
This paper uses the problem of variational Bayesian learning of hierarchical latent variable models to demonstrate the benefits of the two views; the code-length interpretation provides new perspectives on many parts of the problem, such as model comparison and pruning, and helps explain many phenomena that occur during learning.
End-to-end Optimized Image Compression
TLDR
Across an independent set of test images, the optimized method generally exhibits better rate-distortion performance than the standard JPEG and JPEG 2000 compression methods, and a dramatic improvement in visual quality is observed, supported by objective quality estimates using MS-SSIM.
Practical Full Resolution Learned Lossless Image Compression
TLDR
This paper proposes L3C, the first practical learned lossless image compression system, which outperforms the popular engineered codecs PNG, WebP, and JPEG 2000; it also finds that learning the auxiliary representation is crucial and significantly outperforms predefined auxiliary representations such as an RGB pyramid.
Generating Long Sequences with Sparse Transformers
TLDR
This paper introduces sparse factorizations of the attention matrix that reduce the quadratic cost of full attention to $O(n \sqrt{n})$, generates unconditional samples demonstrating global coherence and great diversity, and shows that it is possible in principle to use self-attention to model sequences of length one million or more.
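
Several of the references above (BB-ANS, Bit-Swap) and the paper itself hinge on the same bits-back bookkeeping, so a toy sketch may help. The model below is a made-up discrete latent variable model with illustrative probability tables and function names (a real implementation would drive an ANS coder rather than summing log-probabilities): the decode-then-encode order refunds $\log_2 q(z|x)$ bits, so the expected net rate is exactly the negative ELBO, which upper-bounds the ideal rate $-\log_2 p(x)$.

```python
import math

# Toy discrete latent variable model: z in {0, 1}, x in {0, 1, 2}.
# Prior p(z), likelihood p(x|z), approximate posterior q(z|x).
p_z = {0: 0.5, 1: 0.5}
p_x_given_z = {0: {0: 0.7, 1: 0.2, 2: 0.1},
               1: {0: 0.1, 1: 0.3, 2: 0.6}}
q_z_given_x = {0: {0: 0.9, 1: 0.1},   # q(z | x=0)
               1: {0: 0.5, 1: 0.5},   # q(z | x=1)
               2: {0: 0.2, 1: 0.8}}   # q(z | x=2)

def bits_back_rate(x):
    """Expected net codelength (bits) of bits-back coding for one symbol.

    BB-ANS order: decode z ~ q(z|x) from the existing bitstream
    (refunding log2 q(z|x) bits), then encode x with p(x|z) and
    z with p(z). The expectation over q is the negative ELBO in bits.
    """
    return sum(q * (-math.log2(p_x_given_z[z][x])
                    - math.log2(p_z[z])
                    + math.log2(q))
               for z, q in q_z_given_x[x].items())

def ideal_rate(x):
    """Ideal codelength -log2 p(x): the lower bound on the net rate."""
    p_x = sum(p_z[z] * p_x_given_z[z][x] for z in p_z)
    return -math.log2(p_x)

for x in (0, 1, 2):
    print(f"x={x}: bits-back {bits_back_rate(x):.3f} >= ideal {ideal_rate(x):.3f} bits")
```

The gap between the two printed rates for each symbol is the KL divergence from $q(z|x)$ to the true posterior, which vanishes when $q$ is exact; Bit-Swap generalizes the same decode-then-encode pattern to hierarchical latents.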
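
The Glow reference above mentions invertible 1x1 convolutions; here is a rough numpy sketch in our own illustrative code (not Glow's implementation, which also offers an LU-parameterized weight for a cheaper log-determinant). A 1x1 convolution applies one shared CxC channel-mixing matrix at every spatial position, so it is trivially invertible and contributes $H \cdot W \cdot \log|\det W|$ to the log-likelihood:

```python
import numpy as np

rng = np.random.default_rng(0)
C, H, W = 3, 4, 4

# Random orthogonal initialization (a rotation), as in Glow, so the
# initial log-determinant contribution is zero.
Wmat = np.linalg.qr(rng.normal(size=(C, C)))[0]

def conv1x1_forward(x, Wmat):
    """x: (C, H, W) -> z: (C, H, W); per-pixel channel mixing z = W x."""
    z = np.einsum('ij,jhw->ihw', Wmat, x)
    logdet = H * W * np.log(abs(np.linalg.det(Wmat)))  # Jacobian term
    return z, logdet

def conv1x1_inverse(z, Wmat):
    """Exact inverse: apply W^{-1} at every pixel."""
    return np.einsum('ij,jhw->ihw', np.linalg.inv(Wmat), z)

x = rng.normal(size=(C, H, W))
z, logdet = conv1x1_forward(x, Wmat)
x_rec = conv1x1_inverse(z, Wmat)
print("max reconstruction error:", np.abs(x - x_rec).max())  # ~1e-15
print("log|det| contribution:", logdet)                      # ~0 here
```

The log-determinant is exactly what a flow-based codec must account for; schemes like iVPF and iFlow above restrict or discretize such maps so that the inverse is bit-exact.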