
The Implicit and Explicit Regularization Effects of Dropout

@article{Wei2020TheIA,
  title={The Implicit and Explicit Regularization Effects of Dropout},
  author={Colin Wei and Sham M. Kakade and Tengyu Ma},
  journal={ArXiv},
  year={2020},
  volume={abs/2002.12915}
}
Dropout is a widely used regularization technique, often required to obtain state-of-the-art performance for a number of architectures. This work demonstrates that dropout introduces two distinct but entangled regularization effects: an explicit effect (also studied in prior work), which arises because dropout modifies the expected training objective, and, perhaps surprisingly, an additional implicit effect from the stochasticity in the dropout training update. This implicit regularization effect is analogous…
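To make the two effects concrete, here is a minimal NumPy sketch (an illustration, not code from the paper; the linear model, input dropout, and the choices d = 5 and p = 0.5 are assumptions for the toy example). For a linear model with inverted dropout on its inputs, the expectation of the dropout loss over masks has a closed form: the original squared loss plus a data-dependent L2-style penalty, which plays the role of the explicit effect (in the spirit of the analysis in "Dropout Training as Adaptive Regularization", listed in the references). The mask-to-mask fluctuation of the sampled gradient around its mean is the stochasticity behind the implicit effect.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: linear model y_hat = w @ x with inverted dropout on the input.
d = 5          # input dimension (arbitrary choice for the illustration)
p = 0.5        # drop probability (arbitrary choice)
x = rng.normal(size=d)
y = 1.0
w = rng.normal(size=d)

def dropout_loss(w, x, y, mask, p):
    """Squared loss with an inverted-dropout mask applied to the input."""
    z = mask * x / (1.0 - p)
    return (y - w @ z) ** 2

def dropout_grad(w, x, y, mask, p):
    """Gradient of the sampled dropout loss with respect to w."""
    z = mask * x / (1.0 - p)
    return -2.0 * (y - w @ z) * z

def expected_loss(w, x, y, p):
    """Expectation of the dropout loss over masks, in closed form:
    the original squared loss plus a data-dependent L2-style penalty
    (the explicit regularization effect for this linear model)."""
    base = (y - w @ x) ** 2
    penalty = (p / (1.0 - p)) * np.sum(w**2 * x**2)
    return base + penalty

# Sample masks: each coordinate is kept with probability 1 - p.
masks = rng.random((50_000, d)) > p

# Explicit effect: the Monte Carlo average of the sampled dropout loss
# should match the closed-form expected objective.
mc_loss = np.mean([dropout_loss(w, x, y, m, p) for m in masks])
print(f"Monte Carlo dropout loss: {mc_loss:.4f}")
print(f"Closed-form expectation : {expected_loss(w, x, y, p):.4f}")

# Implicit effect: each sampled mask yields a stochastic gradient whose
# mean is the gradient of the expected objective; the per-mask fluctuation
# around that mean is the update noise driving the implicit effect.
grads = np.stack([dropout_grad(w, x, y, m, p) for m in masks])
analytic_grad = -2.0 * (y - w @ x) * x + 2.0 * (p / (1.0 - p)) * w * x**2
print("Mean sampled gradient  :", grads.mean(axis=0).round(3))
print("Gradient of expectation:", analytic_grad.round(3))
print("Per-mask noise std     :", grads.std(axis=0).round(3))
```

Running the sketch, the Monte Carlo averages of the loss and gradient should match their closed forms, while the nonzero per-mask standard deviation is exactly the stochasticity that the paper isolates as the additional implicit regularizer.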
Citations

Shape Matters: Understanding the Implicit Bias of the Noise Covariance
Explicit Regularisation in Gaussian Noise Injections
Understanding the Role of Training Regimes in Continual Learning
Dropout as an Implicit Gating Mechanism For Continual Learning
On Mixup Regularization
How Does Mixup Help With Robustness and Generalization?

References

Showing 1-10 of 73 references
Dropout with Expectation-Linear Regularization
Understanding Dropout
Surprising Properties of Dropout in Deep Networks
Dropout Training as Adaptive Regularization
Dropout: Explicit Forms and Capacity Control
On Dropout and Nuclear Norm Regularization
Fast Dropout Training
On the Implicit Bias of Dropout
On the Inductive Bias of Dropout
Altitude Training: Strong Bounds for Single-Layer Dropout