Large Deviation Analysis of Function Sensitivity in Random Deep Neural Networks

@article{Li2019LargeDA,
  title={Large Deviation Analysis of Function Sensitivity in Random Deep Neural Networks},
  author={B. Li and D. Saad},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.05769}
}
  • B. Li, D. Saad
  • Published 2019
  • Computer Science, Mathematics, Physics
  • ArXiv
  • Mean field theory has been successfully used to analyze deep neural networks (DNN) in the infinite-size limit. Given the finite size of realistic DNNs, we utilize large deviation theory and path integral analysis to study the deviation of functions represented by DNNs from their typical mean field solutions. The parameter perturbations investigated include weight sparsification (dilution) and binarization, which are commonly used in model simplification, for both ReLU and sign activation…
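
The sensitivity question the abstract raises can be illustrated numerically. The sketch below is a minimal Monte Carlo illustration, not the paper's large-deviation or path-integral calculation: it builds a random fully connected network, perturbs its weights by dilution or binarization, and measures how much the represented function changes via the mean cosine overlap between original and perturbed outputs on random inputs. All settings here (width 200, depth 10, a 10% dilution rate, the sqrt(2) ReLU gain, the overlap measure) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def forward(x, weights, act, gain=1.0):
    """Propagate inputs through a random fully connected net, layer by layer."""
    h = x
    for W in weights:
        # 1/sqrt(N) scaling keeps pre-activations O(1) as the width N grows.
        h = act(gain * (W @ h) / np.sqrt(W.shape[1]))
    return h

def dilute(weights, p, rng):
    """Weight sparsification: zero each weight independently with probability p."""
    return [W * (rng.random(W.shape) >= p) for W in weights]

def binarize(weights):
    """Weight binarization: keep only the sign of each weight."""
    return [np.sign(W) for W in weights]

def overlap(a, b):
    """Mean cosine similarity between matched output columns (1 = function unchanged)."""
    num = np.sum(a * b, axis=0)
    den = np.linalg.norm(a, axis=0) * np.linalg.norm(b, axis=0)
    return float(np.mean(num / den))

rng = np.random.default_rng(0)
N, L, n_test = 200, 10, 500                      # width, depth, number of test inputs
weights = [rng.standard_normal((N, N)) for _ in range(L)]
x = rng.standard_normal((N, n_test))

acts = {
    "ReLU": (lambda h: np.maximum(h, 0.0), np.sqrt(2.0)),  # sqrt(2) gain: critical ReLU scaling
    "sign": (np.sign, 1.0),
}

for name, (act, gain) in acts.items():
    y = forward(x, weights, act, gain)
    y_dil = forward(x, dilute(weights, 0.1, rng), act, gain)
    y_bin = forward(x, binarize(weights), act, gain)
    print(f"{name}: overlap after 10% dilution = {overlap(y, y_dil):.3f}, "
          f"after binarization = {overlap(y, y_bin):.3f}")
```

Under these assumed settings, the ReLU network would be expected to retain a higher output overlap than the sign network, consistent with the robustness comparison the abstract describes.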
4 Citations
  • Mean-field inference methods for neural networks
  • The Space of Functions Computed By Deep Layered Machines
  • Learning credit assignment
