Regularization of Neural Networks using DropConnect

@inproceedings{wan2013dropconnect,
  title={Regularization of Neural Networks using DropConnect},
  author={Li Wan and Matthew D. Zeiler and Sixin Zhang and Yann LeCun and Rob Fergus},
  booktitle={Proceedings of the 30th International Conference on Machine Learning (ICML)},
  year={2013}
}
We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations is set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus…
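The distinction the abstract draws — masking activations versus masking individual weights — can be sketched in a few lines of NumPy. This is a minimal illustration of the forward pass only, not the paper's implementation; the layer sizes and keep probability `p` are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5  # keep probability (illustrative choice)

x = rng.standard_normal(8)        # input to a fully-connected layer
W = rng.standard_normal((4, 8))   # weight matrix (4 output units)
b = np.zeros(4)

# Dropout: one Bernoulli mask entry per *unit* — whole activations are zeroed.
unit_mask = rng.random(4) < p
dropout_out = unit_mask * (W @ x + b)

# DropConnect: one Bernoulli mask entry per *weight* — individual
# connections are zeroed, so every unit can still receive some input.
weight_mask = rng.random(W.shape) < p
dropconnect_out = (weight_mask * W) @ x + b
```

Under Dropout an entire output unit is silenced at once, whereas under DropConnect each of the 4×8 connections is dropped independently, which is why DropConnect is described as a generalization.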

9 Figures & Tables




Semantic Scholar estimates that this publication has 1,145 citations based on the available data.
