Corpus ID: 19177691

Reversible Architectures for Arbitrarily Deep Residual Neural Networks

@inproceedings{Chang2018ReversibleAF,
  title={Reversible Architectures for Arbitrarily Deep Residual Neural Networks},
  author={Bo Chang and Lili Meng and Eldad Haber and Lars Ruthotto and David Begert and Elliot Holtham},
  booktitle={AAAI},
  year={2018}
}
  • Bo Chang, Lili Meng, Eldad Haber, Lars Ruthotto, David Begert, Elliot Holtham
  • Published in AAAI 2018
  • Computer Science, Mathematics
  • Recently, deep residual networks have been successfully applied in many computer vision and natural language processing tasks, pushing the state-of-the-art performance with deeper and wider architectures. In this work, we interpret deep residual networks as ordinary differential equations (ODEs), which have long been studied in mathematics and physics with rich theoretical and empirical success. From this interpretation, we develop a theoretical framework on stability and reversibility of deep…
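
    To make the ODE reading concrete: a residual block computes x_{t+1} = x_t + h * f(x_t), which is exactly one forward Euler step of the ODE dx/dt = f(x). Below is a minimal sketch (illustrative only, not the authors' code; the toy transformation f, its weights W and b, and the step size h are assumptions). It also shows a two-state leapfrog-style update, in the spirit of the reversible architectures the paper derives, which can be inverted algebraically so that intermediate activations can be recomputed rather than stored.

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(4, 4))  # hypothetical weights of the residual branch
    b = np.zeros(4)

    def f(x):
        # Toy residual branch standing in for F(x; theta).
        return np.tanh(W @ x + b)

    def residual_step(x, h=0.1):
        # One residual block == one forward Euler step of dx/dt = f(x).
        return x + h * f(x)

    def leapfrog_step(y, z, h=0.1):
        # Reversible two-state (leapfrog/Verlet-style) update:
        # each half of the state is advanced using only the other half.
        y_next = y + h * f(z)
        z_next = z + h * f(y_next)
        return y_next, z_next

    def leapfrog_inverse(y_next, z_next, h=0.1):
        # Exact algebraic inverse of leapfrog_step: the backward pass can
        # recompute activations instead of storing them.
        z = z_next - h * f(y_next)
        y = y_next - h * f(z)
        return y, z

    y0, z0 = rng.normal(size=4), rng.normal(size=4)
    y1, z1 = leapfrog_step(y0, z0)
    y0_back, z0_back = leapfrog_inverse(y1, z1)
    assert np.allclose(y0, y0_back) and np.allclose(z0, z0_back)

    The paper's Hamiltonian-inspired networks use distinct transformations for the two half-steps and constrain them for stability; a single shared f is used here only so the exact-inversion check stays short.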


    Citations

    Publications citing this paper (showing 1-10 of 80 citations):

    • Layer-Parallel Training of Deep Residual Neural Networks (cites methods and background)
    • Fully Hyperbolic Convolutional Neural Networks (cites background and methods)
    • Beyond Finite Layer Neural Networks: Bridging Deep Architectures and Numerical Differential Equations (cites methods and background; highly influenced)
    • Deep Limits of Residual Neural Networks (cites background)
    • State-Space Representations of Deep Neural Networks
    • Identity Connections in Residual Nets Improve Noise Stability (cites background)
    • Gradients explode - Deep Networks are shallow - ResNet explained (cites methods and background; highly influenced)
    • Multi-level Residual Networks from Dynamical Systems View (cites background)
    • Robust learning with implicit residual networks (cites methods and background)


    Citation Statistics

    • 7 highly influenced citations

    • Averaged 21 citations per year from 2017 through 2019

    • 90% increase in citations per year in 2019 over 2018

    References

    Publications referenced by this paper (showing 1-10 of 50 references):

    • Stable Architectures for Deep Neural Networks
    • Wide Residual Networks
    • Deep Residual Learning for Image Recognition (highly influential)
    • Aggregated Residual Transformations for Deep Neural Networks