On the Selection of Initialization and Activation Function for Deep Neural Networks

@article{Hayou2018OnTS,
  title={On the Selection of Initialization and Activation Function for Deep Neural Networks},
  author={Soufiane Hayou and Arnaud Doucet and Judith Rousseau},
  journal={ArXiv},
  year={2018},
  volume={abs/1805.08266}
}
The weight initialization and the activation function of deep neural networks have a crucial impact on the performance of the training procedure. An inappropriate selection can lead to the loss of information of the input during forward propagation and the exponential vanishing/exploding of gradients during back-propagation. Understanding the theoretical properties of untrained random networks is key to identifying which deep networks may be trained successfully as recently demonstrated by…
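The vanishing/exploding behavior the abstract describes can be observed directly by pushing a random input through an untrained deep network at different weight-initialization scales. The sketch below is illustrative only, not the paper's analysis: it uses ReLU layers and the well-known `sqrt(2/n)` ("He") scaling as the variance-preserving reference point, with smaller and larger scales chosen arbitrarily to exhibit vanishing and explosion.

```python
import numpy as np

def forward_norm(depth, width, weight_std, seed=0):
    """Propagate a random input through `depth` untrained ReLU layers
    and return the Euclidean norm of the final activations."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(width)
    for _ in range(depth):
        # Fresh i.i.d. Gaussian weights, N(0, weight_std^2), each layer.
        W = rng.normal(0.0, weight_std, size=(width, width))
        x = np.maximum(W @ x, 0.0)  # ReLU activation
    return np.linalg.norm(x)

width, depth = 256, 50
# sqrt(2/n) roughly preserves signal variance through ReLU layers;
# smaller scales drive the signal to zero, larger ones blow it up
# exponentially in depth.
for std in (0.5 / np.sqrt(width), np.sqrt(2.0 / width), 2.0 / np.sqrt(width)):
    print(f"std = {std:.4f}  ->  ||x_L|| = {forward_norm(depth, width, std):.3e}")
```

Running this shows the final-layer norm collapsing toward zero at the small scale and growing by many orders of magnitude at the large one, while the `sqrt(2/n)` scale keeps it of moderate size; the paper's contribution is a principled characterization of such critical initializations for various activation functions.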