
Rectifier (neural networks)

Known as: ReLU, Rectifier, Softplus
In the context of artificial neural networks, the rectifier is an activation function defined as f(x) = max(0, x), where x is the input to a neuron. This is also known…
Wikipedia
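The rectifier above, and the softplus function listed among its aliases, can be sketched in a few lines of plain Python. The function names and example inputs here are illustrative, not from the page:

```python
import math

def relu(x):
    # Rectifier: f(x) = max(0, x), applied to a single pre-activation value
    return max(0.0, x)

def softplus(x):
    # Softplus, a smooth approximation of the rectifier: ln(1 + e^x)
    return math.log1p(math.exp(x))

# Apply elementwise to a few example pre-activations
inputs = [-2.0, -0.5, 0.0, 1.5]
print([relu(v) for v in inputs])  # [0.0, 0.0, 0.0, 1.5]
```

Negative inputs are clipped to zero while positive inputs pass through unchanged; softplus behaves similarly but is differentiable everywhere, with softplus(0) = ln 2.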

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2019
We propose a novel method for unsupervised image-to-image translation, which incorporates a new attention module and a new… 
Highly Cited
2018
In this work, we study the 1-bit convolutional neural networks (CNNs), of which both the weights and activations are binary… 
Highly Cited
2018
Verifying the robustness property of a general Rectified Linear Unit (ReLU) network is an NP-complete problem [Katz, Barrett… 
Highly Cited
2017
The expressive power of neural networks is important for understanding deep learning. Most existing works consider this problem… 
Highly Cited
2017
The choice of activation functions in deep networks has a significant effect on the training dynamics and task performance… 
Highly Cited
2016
Recently there has been much interest in understanding why deep neural networks are preferred to shallow networks. We show that… 
Highly Cited
2016
In this paper we investigate the family of functions representable by deep neural networks (DNN) with rectified linear units… 
Highly Cited
2016
Convolutional rectifier networks, i.e. convolutional neural networks with rectified linear activations and max or average pooling…
Highly Cited
1988
Interferometric synthetic aperture radar observations provide a means for obtaining high-resolution digital topographic maps from… 
Highly Cited
1963