Rectifier (neural networks)
Known as: ReLU, Rectifier (disambiguation), Softplus
In the context of artificial neural networks, the rectifier is an activation function defined as f(x) = max(0, x), where x is the input to a neuron. This is also known as a ramp function.
Source: Wikipedia
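The rectifier above, and the smooth softplus variant listed among its aliases, can be sketched in a few lines of plain Python (function names are illustrative, not from any particular library):

```python
import math

def relu(x):
    # Rectifier: f(x) = max(0, x)
    return max(0.0, x)

def softplus(x):
    # Smooth approximation of the rectifier: f(x) = ln(1 + e^x)
    # log1p is used for numerical accuracy when e^x is small.
    return math.log1p(math.exp(x))

print(relu(-2.0), relu(3.0))   # 0.0 3.0
print(softplus(0.0))           # ln(2) ≈ 0.693
```

Note that softplus is everywhere differentiable, while the rectifier has a kink at x = 0; both are zero (or near zero) for large negative inputs.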
Related topics (18 relations): Activation function, Artificial neural network, Backpropagation, Computer vision
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited · 2019
U-GAT-IT: Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation
Junho Kim, Minjae Kim, Hyeonwoo Kang, Kwanghee Lee
International Conference on Learning… · 2019 · Corpus ID: 198895601
We propose a novel method for unsupervised image-to-image translation, which incorporates a new attention module and a new…
Highly Cited · 2018
Bi-Real Net: Enhancing the Performance of 1-bit CNNs With Improved Representational Capability and Advanced Training Algorithm
Zechun Liu, Baoyuan Wu, Wenhan Luo, Xin Yang, W. Liu, K. Cheng
European Conference on Computer Vision · 2018 · Corpus ID: 51892264
In this work, we study the 1-bit convolutional neural networks (CNNs), of which both the weights and activations are binary…
Highly Cited · 2018
Towards Fast Computation of Certified Robustness for ReLU Networks
Tsui-Wei Weng, Huan Zhang, +5 authors, L. Daniel
International Conference on Machine Learning · 2018 · Corpus ID: 13750928
Verifying the robustness property of a general Rectified Linear Unit (ReLU) network is an NP-complete problem [Katz, Barrett…
Highly Cited · 2017
The Expressive Power of Neural Networks: A View from the Width
Zhou Lu, Hongming Pu, Feicheng Wang, Zhiqiang Hu, Liwei Wang
Neural Information Processing Systems · 2017 · Corpus ID: 3235741
The expressive power of neural networks is important for understanding deep learning. Most existing works consider this problem…
Highly Cited · 2017
Swish: a Self-Gated Activation Function
Prajit Ramachandran, Barret Zoph, Quoc V. Le
2017 · Corpus ID: 196158220
The choice of activation functions in deep networks has a significant effect on the training dynamics and task performance…
Highly Cited · 2016
Why Deep Neural Networks for Function Approximation?
Shiyu Liang, R. Srikant
International Conference on Learning… · 2016 · Corpus ID: 7242855
Recently there has been much interest in understanding why deep neural networks are preferred to shallow networks. We show that…
Highly Cited · 2016
Understanding Deep Neural Networks with Rectified Linear Units
R. Arora, A. Basu, Poorya Mianjy, Anirbit Mukherjee
Electron. Colloquium Comput. Complex. · 2016 · Corpus ID: 3482308
In this paper we investigate the family of functions representable by deep neural networks (DNN) with rectified linear units…
Highly Cited · 2016
Convolutional Rectifier Networks as Generalized Tensor Decompositions
Nadav Cohen, A. Shashua
International Conference on Machine Learning · 2016 · Corpus ID: 7369238
Convolutional rectifier networks, i.e. convolutional neural networks with rectified linear activation and max or average pooling…
Highly Cited · 1988
Satellite radar interferometry: Two-dimensional phase unwrapping
R. Goldstein, H. Zebker, C. Werner
1988 · Corpus ID: 56427653
Interferometric synthetic aperture radar observations provide a means for obtaining high-resolution digital topographic maps from…
Highly Cited · 1963
SERIES RESISTANCE EFFECTS ON SOLAR CELL MEASUREMENTS
M. Wolf, H. Rauschenbach
1963 · Corpus ID: 111030769