
Soft-Root-Sign Activation Function

@article{Zhou2020SoftRootSignAF,
  title={Soft-Root-Sign Activation Function},
  author={Y. Zhou and Dandan Li and Shuwei Huo and S. Kung},
  journal={ArXiv},
  year={2020},
  volume={abs/2003.00547}
}
The choice of activation function in deep networks has a significant effect on training dynamics and task performance. At present, the most effective and widely used activation function is ReLU. However, because of its non-zero mean, missing negative part, and unbounded output, ReLU is at a potential disadvantage during optimization. To this end, we introduce a novel activation function designed to overcome these three challenges. The proposed nonlinearity, namely "Soft-Root-Sign" (SRS), is…
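The abstract snippet above is truncated and does not show the closed form of SRS. Below is a minimal sketch, assuming the definition given in the full paper, SRS(x) = x / (x/α + e^(−x/β)) with trainable parameters α and β; the initial values and the PyTorch packaging are illustrative choices, not the authors' reference implementation.

# Minimal sketch of the Soft-Root-Sign (SRS) activation,
# assuming SRS(x) = x / (x/alpha + exp(-x/beta)) with trainable alpha, beta.
# Initial values below are illustrative placeholders.
import torch
import torch.nn as nn


class SRS(nn.Module):
    """Soft-Root-Sign activation with trainable alpha and beta."""

    def __init__(self, alpha: float = 2.0, beta: float = 3.0):
        super().__init__()
        # Learnable scalars, shared across all channels in this simple sketch.
        self.alpha = nn.Parameter(torch.tensor(alpha))
        self.beta = nn.Parameter(torch.tensor(beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # SRS(x) = x / (x/alpha + e^{-x/beta})
        return x / (x / self.alpha + torch.exp(-x / self.beta))


if __name__ == "__main__":
    act = SRS()
    x = torch.linspace(-5.0, 5.0, steps=11)
    # Unlike ReLU, the output is bounded above and stays non-zero for negative inputs.
    print(act(x))

Because alpha and beta are nn.Parameter tensors, they receive gradients and are updated together with the rest of the network during training.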
