Regularization of Neural Networks using DropConnect

@inproceedings{Wan2013RegularizationON,
  title={Regularization of Neural Networks using DropConnect},
  author={Li Wan and Matthew D. Zeiler and Sixin Zhang and Yann LeCun and Rob Fergus},
  booktitle={ICML},
  year={2013}
}
We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations is set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus…
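The distinction the abstract draws can be made concrete with a minimal NumPy sketch of one fully-connected layer. This is not the authors' implementation; the function names and the inverted-scaling convention (dividing by the keep probability so the expected output matches the undropped layer) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, W, p=0.5):
    """Dropout: zero a random subset of the layer's activations.

    p is the probability of *keeping* each unit.
    """
    mask = rng.random(W.shape[0]) < p   # one Bernoulli draw per output unit
    return (W @ x) * mask / p           # inverted scaling preserves expectation

def dropconnect_forward(x, W, p=0.5):
    """DropConnect: zero a random subset of the layer's weights instead.

    Each individual connection W[i, j] is kept independently with probability p,
    so every unit receives input through a random subset of its weights.
    """
    mask = rng.random(W.shape) < p      # one Bernoulli draw per weight
    return ((W * mask) @ x) / p
```

With p = 1 no entries are dropped and both functions reduce to the plain affine map `W @ x`; at inference time the mask is typically removed entirely (the paper itself derives a more careful Gaussian approximation for DropConnect inference).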

9 Figures & Tables


Statistics

[Citations per Year chart, 2013–2018]

Semantic Scholar estimates that this publication has 1,145 citations based on the available data.
