Cosine Normalization: Using Cosine Similarity Instead of Dot Product in Neural Networks

@inproceedings{Luo2018CosineNU,
  title={Cosine Normalization: Using Cosine Similarity Instead of Dot Product in Neural Networks},
  author={Chunjie Luo and Jianfeng Zhan and Xiaohe Xue and Lei Wang and Rui Ren and Qiang Yang},
  booktitle={ICANN},
  year={2018}
}
Traditionally, multi-layer neural networks use the dot product between the output vector of the previous layer and the incoming weight vector as the input to the activation function. The result of the dot product is unbounded, which increases the risk of large variance. Large variance of a neuron makes the model sensitive to changes in the input distribution, resulting in poor generalization, and aggravates internal covariate shift, which slows down training. To bound the dot product and decrease the variance, the paper proposes cosine normalization, which uses cosine similarity instead of the dot product in neural networks.
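The core idea from the title and abstract can be sketched as follows: replace each neuron's raw dot product w·x with the cosine similarity (w·x)/(|w||x|), which bounds the pre-activation to [-1, 1]. This is a minimal illustrative sketch, not the paper's reference implementation; the function name, layer shape conventions, and the `eps` stabilizer are assumptions.

```python
import numpy as np

def cosine_norm_layer(x, W, eps=1e-8):
    """Cosine-normalized linear layer (sketch).

    x: (batch, in_features) activations from the previous layer
    W: (out_features, in_features) incoming weight vectors
    Returns cosine similarity between each input and each weight
    vector, bounded to [-1, 1], instead of the unbounded dot product.
    """
    dots = x @ W.T                                        # (batch, out)
    w_norms = np.linalg.norm(W, axis=1)                   # (out,)
    x_norms = np.linalg.norm(x, axis=1, keepdims=True)    # (batch, 1)
    # eps guards against division by zero for all-zero vectors (assumption)
    return dots / (x_norms * w_norms + eps)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
W = rng.normal(size=(16, 8))
out = cosine_norm_layer(x, W)
assert out.shape == (4, 16)
assert np.all(np.abs(out) <= 1.0)  # bounded, unlike a raw dot product
```

Because the output is bounded, the variance of the pre-activation cannot grow without limit, which is the mechanism the abstract connects to better generalization and reduced internal covariate shift.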
