Corpus ID: 10413462

Deep Information Propagation

@article{Schoenholz2016DeepIP,
  title={Deep Information Propagation},
  author={Samuel S. Schoenholz and Justin Gilmer and Surya Ganguli and Jascha Sohl-Dickstein},
  journal={ArXiv},
  year={2016},
  volume={abs/1611.01232}
}
  • Samuel S. Schoenholz, Justin Gilmer, Surya Ganguli, Jascha Sohl-Dickstein
  • Published in ICLR 2017 (arXiv preprint, 2016)
  • Mathematics, Computer Science
  • We study the behavior of untrained neural networks whose weights and biases are randomly distributed using mean field theory. We show the existence of depth scales that naturally limit the maximum depth of signal propagation through these random networks. Our main practical result is to show that random networks may be trained precisely when information can travel through them. Thus, the depth scales that we identify provide bounds on how deep a network may be trained for a specific choice of hyperparameters.
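
The depth scales referenced in the abstract follow from mean-field recursions that are simple to simulate numerically. Below is a minimal sketch (an illustration under stated assumptions, not the authors' released code): it assumes tanh activations and Monte Carlo estimates of the Gaussian expectations, iterates the variance map q' = sigma_w^2 E_z[phi(sqrt(q) z)^2] + sigma_b^2 to its fixed point q*, estimates chi_1 = sigma_w^2 E_z[phi'(sqrt(q*) z)^2], and reports the ordered-phase depth scale xi_c = -1 / ln(chi_1); the critical point chi_1 = 1 is the edge of chaos, where the trainable depth diverges.

```python
import numpy as np

# Minimal mean-field sketch (illustrative; not the paper's code).
# Assumptions: tanh activations, Monte Carlo estimates of the Gaussian
# expectations, i.i.d. weights W ~ N(0, sigma_w^2 / N) and biases
# b ~ N(0, sigma_b^2) in a wide fully connected network.

def q_map(q, sigma_w2, sigma_b2, n_samples=200_000, seed=0):
    """One step of the length map: q' = sigma_w^2 E_z[tanh(sqrt(q) z)^2] + sigma_b^2."""
    z = np.random.default_rng(seed).standard_normal(n_samples)
    return sigma_w2 * np.mean(np.tanh(np.sqrt(q) * z) ** 2) + sigma_b2

def fixed_point_q(sigma_w2, sigma_b2, q0=1.0, n_iter=100):
    """Iterate the length map to its fixed point q*."""
    q = q0
    for _ in range(n_iter):
        q = q_map(q, sigma_w2, sigma_b2)
    return q

def chi_1(sigma_w2, sigma_b2, n_samples=200_000, seed=1):
    """chi_1 = sigma_w^2 E_z[phi'(sqrt(q*) z)^2] with phi = tanh, phi' = 1 - tanh^2.
    chi_1 < 1: ordered phase; chi_1 > 1: chaotic phase; chi_1 = 1: edge of chaos."""
    q_star = fixed_point_q(sigma_w2, sigma_b2)
    z = np.random.default_rng(seed).standard_normal(n_samples)
    dphi = 1.0 - np.tanh(np.sqrt(q_star) * z) ** 2
    return sigma_w2 * np.mean(dphi ** 2)

if __name__ == "__main__":
    for sigma_w2 in (0.5, 1.0, 3.0):  # spans the ordered and chaotic phases for tanh
        c = chi_1(sigma_w2, sigma_b2=0.05)
        if c < 1.0:
            # Ordered phase: correlation perturbations decay with depth scale -1/ln(chi_1).
            print(f"sigma_w^2={sigma_w2:.2f}  chi_1={c:.3f}  xi_c ~ {-1.0 / np.log(c):.1f} layers")
        else:
            print(f"sigma_w^2={sigma_w2:.2f}  chi_1={c:.3f}  chaotic phase")
```

Sweeping sigma_w^2 toward the critical value drives chi_1 toward 1, and the printed depth scale grows without bound. That is the practical content of the paper's trainability bound: networks much deeper than these scales are predicted to be untrainable at that initialization.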


    Citations

    Publications citing this paper.
    Showing 8 of 89 citations.

    A Signal Propagation Perspective for Pruning Neural Networks at Initialization (cites methods & background; highly influenced)

    A Mean Field Theory of Batch Normalization (cites background & methods)

    A Mean Field Theory of Quantized Deep Networks: The Quantization-Depth Trade-Off (cites background, results & methods; highly influenced)

    Characterizing Well-Behaved vs. Pathological Deep Neural Networks (cites background; highly influenced)

    Deep Learning Theory Review: An Optimal Control and Dynamical Systems Perspective (cites background & methods; highly influenced)

    Mean field theory for deep dropout networks: digging up gradient backpropagation deeply (cites background, methods & results; highly influenced)

    Critical initialisation for deep signal propagation in noisy rectifier neural networks (cites background & methods; highly influenced)

    Information Geometry of Orthogonal Initializations and Training (cites background & methods; highly influenced)


    Citation Statistics

    • 21 highly influenced citations

    • Averaged 25 citations per year from 2017 through 2019

    • 153% increase in citations per year in 2019 over 2018

    References

    Publications referenced by this paper.
    Showing 1 of 16 references.

    Deep Residual Learning for Image Recognition
