Corpus ID: 220633468

Channel-wise Autoregressive Entropy Models for Learned Image Compression

@article{Minnen2020ChannelwiseAE,
  title={Channel-wise Autoregressive Entropy Models for Learned Image Compression},
  author={David Minnen and Saurabh Singh},
  journal={ArXiv},
  year={2020},
  volume={abs/2007.08739}
}
  • David Minnen, Saurabh Singh
  • Published 2020
  • Engineering, Computer Science, Mathematics
  • ArXiv
  • In learning-based approaches to image compression, codecs are developed by optimizing a computational model to minimize a rate-distortion objective. Currently, the most effective learned image codecs take the form of an entropy-constrained autoencoder with an entropy model that uses both forward and backward adaptation. Forward adaptation makes use of side information and can be efficiently integrated into a deep neural network. In contrast, backward adaptation typically makes predictions based…
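The abstract above only sketches the setup, so here is a minimal, self-contained Python/NumPy sketch of the ideas it names: a rate-distortion loss of the form R + λ·D, a factorized Gaussian entropy model whose parameters would normally come from side information (forward adaptation), and the channel-wise conditioning suggested by the paper's title, where the latent channels are split into slices that are modeled in sequence. All shapes, the slice count, the Gaussian model, the mean-based conditioning, and the λ value are illustrative assumptions, not the authors' architecture.

```python
# Toy sketch (not the authors' code) of a rate-distortion objective with a
# channel-wise autoregressive entropy model. Everything here is an assumption
# made for illustration: the Gaussian entropy model, slice count, and lambda.
import numpy as np

rng = np.random.default_rng(0)

def gaussian_bits(y, mu, sigma):
    """Estimated code length (bits) of y under a factorized Gaussian model."""
    sigma = np.maximum(sigma, 1e-6)
    nll_nats = 0.5 * ((y - mu) / sigma) ** 2 + np.log(sigma * np.sqrt(2.0 * np.pi))
    return float(nll_nats.sum() / np.log(2.0))

# Toy latent tensor: C channels over an H x W grid (assumed sizes).
C, H, W = 8, 4, 4
y = rng.normal(size=(C, H, W))

# Forward adaptation would predict (mu, sigma) for every element from decoded
# side information, so all channels could be processed in parallel. Below, a
# channel-wise autoregression instead splits the channels into slices and lets
# each slice's parameters also depend on previously decoded slices. The
# "conditioning" here is just the mean of earlier slices, standing in for a
# learned network.
num_slices = 4
slices = np.array_split(y, num_slices, axis=0)
total_bits = 0.0
decoded = []
for s in slices:
    ctx = np.concatenate(decoded, axis=0).mean() if decoded else 0.0
    mu = np.full_like(s, ctx)      # assumed: conditioning shifts the mean
    sigma = np.ones_like(s)
    total_bits += gaussian_bits(s, mu, sigma)
    decoded.append(s)

# Rate-distortion objective the abstract refers to: L = R + lambda * D,
# with a stand-in reconstruction used only to make the sketch runnable.
x_hat = y + rng.normal(scale=0.1, size=y.shape)
distortion = float(np.mean((y - x_hat) ** 2))
lam = 0.01                          # assumed rate-distortion trade-off weight
loss = total_bits / y.size + lam * distortion
print(f"bits/element: {total_bits / y.size:.3f}  loss: {loss:.4f}")
```

The appeal of ordering the autoregression over channel slices rather than over spatial positions is that each slice can still be processed with fully parallel spatial computation; only the small number of slice-to-slice steps are serial.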
    Citations
    • Nonlinear Transform Coding
    • Lossy Compression with Distortion Constrained Optimization. Cited 1 time.

    References

    Publications referenced by this paper (showing 1-10 of 36 references):
    • Multi-scale and Context-adaptive Entropy Model for Image Compression (J. Zhou; Engineering, Computer Science; 2019). Cited 9 times.
    • Variational image compression with a scale hyperprior. Cited 211 times.
    • Conditional Probability Models for Deep Image Compression. Cited 143 times.
    • Improved Lossy Image Compression with Priming and Spatially Adaptive Bit Rates for Recurrent Networks. Cited 139 times.
    • Joint Autoregressive and Hierarchical Priors for Learned Image Compression. Cited 134 times.
    • Full Resolution Image Compression with Recurrent Neural Networks. Cited 334 times.
    • End-to-end Optimized Image Compression. Cited 370 times.
    • Lossy Image Compression with Compressive Autoencoders. Cited 340 times.
    • Context-adaptive Entropy Model for End-to-end Optimized Image Compression. Cited 56 times.