Can ReLU handle a negative input?

I'm training a neural network on data that comes in as negative & positive values.

Is there any way to feed the data into a ReLU network without converting it all to positive values and adding a separate input that indicates whether the data is negative or positive?

The problem I see is that a negative input at the input layer means that, unless the weights have been initialised to be negative, the ReLU node is never activated and stays dead forever.
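
For example, here is a minimal NumPy sketch of the scenario I mean (the values are just made up): with an all-negative input and positive weights, the ReLU's pre-activation is negative, so both its output and its gradient are zero for that sample.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

x = np.array([-1.2, -0.7, -3.0])   # an all-negative input sample
w = np.array([0.4, 0.9, 0.1])      # weights initialised to positive values
b = 0.0

z = w @ x + b                      # pre-activation is -1.41, i.e. negative
print(relu(z))                     # 0.0 -> the unit does not fire
print(1.0 if z > 0 else 0.0)       # 0.0 -> gradient through the ReLU is also zero
```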


I'm not really 100% sure what you're asking, as there are many activation functions and you can easily code your own. If you don't want to code your own, maybe try some alternatives:

Leaky ReLU

Parametric ReLU

Basically, take a look here: [image comparing the activation functions]
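
As a rough sketch of the difference (my own NumPy code, not from the original answer): Leaky ReLU and Parametric ReLU keep a small slope for negative inputs instead of clipping them to zero, so the unit never goes completely dead.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, slope=0.01):
    # Negative inputs are scaled by a small fixed slope instead of clipped to 0.
    return np.where(x > 0, x, slope * x)

def parametric_relu(x, alpha):
    # Same idea, but alpha is a learnable parameter (typically one per channel).
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))                          # [ 0.     0.     0.     1.5 ] -> negatives are lost
print(leaky_relu(x))                    # [-0.02  -0.005  0.     1.5 ] -> negatives still flow
print(parametric_relu(x, alpha=0.25))   # [-0.5   -0.125  0.     1.5 ]
```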


If you really want to use an activation function at the input layer, I would suggest either using another activation function such as ELU, or transforming your data to a range like [0, 1]. If the ReLU is in some hidden layer, it should only become dead temporarily.
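
For illustration, a minimal NumPy sketch of both options (my own code; the min-max scaling and the ELU formula are standard, but the function names are mine):

```python
import numpy as np

def min_max_scale(x):
    # Map each feature column into [0, 1]; assumes no column is constant.
    x_min, x_max = x.min(axis=0), x.max(axis=0)
    return (x - x_min) / (x_max - x_min)

def elu(x, alpha=1.0):
    # ELU is the identity for positive inputs and a smooth negative
    # saturation otherwise, so negative inputs still produce a gradient.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

data = np.array([[-3.0, 10.0],
                 [ 0.0, 20.0],
                 [ 3.0, 40.0]])
print(min_max_scale(data))                     # every column now lies in [0, 1]
print(elu(np.array([-2.0, -0.1, 0.0, 2.0])))   # [-0.865 -0.095  0.     2.   ]
```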

Suppose you have a ReLU in the last hidden layer of a feed-forward network. With backpropagation it should be possible for the outputs of the previous hidden layers to change in such a way that, eventually, the input to the ReLU becomes positive again. The ReLU would then no longer be dead. That said, I may be missing something here.

Anyway, you should definitely give ELU a try! I have had better results with it than with ReLU.
