Corpus ID: 220302263

On Dropout, Overfitting, and Interaction Effects in Deep Neural Networks

  • Benjamin J. Lengerich, E. Xing, R. Caruana
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv
We examine Dropout through the perspective of interactions: learned effects that combine multiple input variables. Given $N$ variables, there are $O(N^2)$ possible pairwise interactions, $O(N^3)$ possible 3-way interactions, etc. We show that Dropout implicitly sets a learning rate for interaction effects that decays exponentially with the size of the interaction, corresponding to a regularizer that balances against the hypothesis space, which grows exponentially with the number of variables in the…
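The exponential decay the abstract describes can be illustrated with a small Monte Carlo sketch (my illustration, not code from the paper): under dropout rate $p$, an interaction among $k$ inputs only contributes when all $k$ inputs survive the mask, which happens with probability $(1-p)^k$, so the effective learning rate for $k$-way interactions shrinks exponentially in $k$.

```python
import random

def interaction_survival_rate(k, p, trials=100_000, seed=0):
    """Monte Carlo estimate of P(all k inputs are kept) under dropout rate p.

    An interaction term over k inputs receives gradient only on steps where
    none of its k inputs is dropped; analytically this is (1 - p) ** k.
    """
    rng = random.Random(seed)
    kept = 0
    for _ in range(trials):
        # Each input is independently dropped with probability p.
        if all(rng.random() > p for _ in range(k)):
            kept += 1
    return kept / trials

if __name__ == "__main__":
    p = 0.5
    for k in range(1, 5):
        est = interaction_survival_rate(k, p)
        print(f"{k}-way interaction: kept ~{est:.3f} vs (1-p)^k = {(1 - p) ** k:.3f}")
```

With $p = 0.5$, a main effect ($k = 1$) is updated on about half the steps, a pairwise interaction on about a quarter, and so on, matching the exponentially decaying effective learning rate the paper analyzes.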
