Corpus ID: 211677960

Soft-Root-Sign Activation Function

  @article{zhou2020softrootsign,
    title={Soft-Root-Sign Activation Function},
    author={Y. Zhou and Dandan Li and Shuwei Huo and S. Kung},
    journal={ArXiv},
    year={2020}
  }

  • Published 2020 · Computer Science · ArXiv
  • The choice of activation function in deep networks has a significant effect on training dynamics and task performance. At present, the most effective and widely used activation function is ReLU. However, because of its non-zero mean, missing negative part, and unbounded output, ReLU is at a potential disadvantage during optimization. To address this, we introduce a novel activation function designed to overcome these three challenges. The proposed nonlinearity, namely "Soft-Root-Sign" (SRS), is…
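The truncated abstract stops before stating the function itself. As a sketch, the SRS nonlinearity takes the form SRS(x) = x / (x/α + e^(−x/β)), where α bounds the positive output and β shapes the negative region; in the paper both parameters are trainable, so the fixed default values below are illustrative assumptions, not the authors' training setup:

```python
import numpy as np

def srs(x, alpha=2.0, beta=3.0):
    """Soft-Root-Sign activation: x / (x/alpha + exp(-x/beta)).

    For large positive x the exponential term vanishes and the output
    saturates at alpha (bounded output); near zero it behaves roughly
    linearly; moderate negative inputs pass through as negative values
    (addressing ReLU's missing negative part) before decaying back
    toward zero, which pulls the mean response closer to zero.
    NOTE: alpha/beta defaults here are assumed for illustration.
    """
    return x / (x / alpha + np.exp(-x / beta))

# Sample the activation over a small input range.
xs = np.linspace(-10.0, 10.0, 5)
print(srs(xs))
```

The denominator stays strictly positive for α, β > 0 (the exponential dominates the linear term on the negative side), so the function is well defined everywhere.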
    3 Citations


    References

    • Mish: A Self Regularized Non-Monotonic Neural Activation Function (96 citations, Highly Influential)
    • Searching for Activation Functions (728 citations, Highly Influential)
    • Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs) (2,609 citations, Highly Influential)
    • The Quest for the Golden Activation Function (16 citations)
    • Mish: A Self Regularized Non-Monotonic Activation Function (33 citations)
    • FReLU: Flexible Rectified Linear Units for Improving Convolutional Neural Networks. Suo Qiu, B. Cai. 2018 24th International Conference on Pattern Recognition (ICPR), 2018. (17 citations)
    • Improving Deep Neural Network with Multiple Parametric Exponential Linear Units (34 citations)
    • Empirical Evaluation of Rectified Activations in Convolutional Network (1,193 citations, Highly Influential)
    • Improving Deep Learning by Inverse Square Root Linear Units (ISRLUs) (18 citations)