Corpus ID: 59158824

Backprop with Approximate Activations for Memory-efficient Network Training

@inproceedings{Chakrabarti2019BackpropWA,
  title={Backprop with Approximate Activations for Memory-efficient Network Training},
  author={Ayan Chakrabarti and Benjamin Moseley},
  booktitle={NeurIPS},
  year={2019}
}
  • Computer Science, Mathematics
  • Training convolutional neural network models is memory intensive, since back-propagation requires storing the activations of all intermediate layers. This presents a practical concern when seeking to deploy very deep architectures in production, especially when models need to be frequently re-trained on updated datasets. In this paper, we propose a new implementation for back-propagation that significantly reduces memory usage by enabling the use of approximations with negligible computational cost…
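    The idea the abstract describes (compute the forward pass exactly, but keep only a low-precision approximation of each activation for the backward pass) can be illustrated with a short sketch. The PyTorch code below is not the authors' implementation; the name ApproxSaveReLU and the simple 8-bit max-scaled quantization are assumptions chosen for illustration only.

    import torch

    # Minimal sketch, assuming PyTorch: a ReLU whose forward output is exact
    # but which saves only an 8-bit quantized copy of its activation for the
    # backward pass. The class name and quantization scheme are illustrative
    # assumptions, not the paper's actual method.
    class ApproxSaveReLU(torch.autograd.Function):
        @staticmethod
        def forward(ctx, x):
            y = x.clamp(min=0)                             # exact forward output
            scale = y.abs().max().clamp(min=1e-8) / 127.0  # per-tensor scale
            q = torch.round(y / scale).to(torch.int8)      # 8-bit approximation
            ctx.save_for_backward(q)                       # store int8, not float32
            ctx.scale = scale
            return y

        @staticmethod
        def backward(ctx, grad_out):
            (q,) = ctx.saved_tensors
            y_approx = q.to(grad_out.dtype) * ctx.scale    # dequantize on demand
            # The ReLU gradient mask is computed from the approximate
            # activation; values quantized to zero lose their gradient, which
            # is the approximation error this scheme trades for memory.
            return grad_out * (y_approx > 0).to(grad_out.dtype)

    x = torch.randn(4, 8, requires_grad=True)
    ApproxSaveReLU.apply(x).sum().backward()   # gradients flow from int8 saves

    The memory saving comes from holding a 1-byte tensor instead of a 4-byte one between the forward and backward passes; per the abstract, such approximations carry negligible computational cost.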
    Citations

    • Prediction Confidence based Low Complexity Gradient Computation for Accelerating DNN Training
    • Don’t Waste Your Bits! Squeeze Activations and Gradients for Deep Neural Networks via TinyScript
    • Memory Optimization for Deep Networks

    References

    Publications referenced by this paper (showing 1-10 of 21 references):
    • Training Deep Nets with Sublinear Memory Cost
    • The Reversible Residual Network: Backpropagation Without Storing Activations
    • Scalable Methods for 8-bit Training of Neural Networks
    • Mixed Precision Training
    • Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding
    • Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
    • Densely Connected Convolutional Networks
    • Memory-Efficient Backpropagation Through Time
    • Deep Learning with Limited Numerical Precision