As far as I understand, the sigmoid function is used to map the outputs of a neural network to values between 0 and 1. Why does using the rectified linear unit (ReLU) as the activation function in deep neural networks work faster? Can you please explain the mathematical concept behind it?
The `max` in ReLU just compares two numbers, while `exp` does a number of different computations. – Tim Jan 02 '19 at 14:28
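To illustrate the comment, here is a minimal sketch (not from the original post) comparing the per-element cost of ReLU's `max` against the sigmoid's `exp`, using NumPy and `timeit`; the array size and repetition count are arbitrary choices for the comparison:

```python
import timeit
import numpy as np

x = np.random.randn(1_000_000)

def relu(x):
    # ReLU only compares each element against 0.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid needs an exponential, an addition, and a division per element.
    return 1.0 / (1.0 + np.exp(-x))

print("ReLU:   ", timeit.timeit(lambda: relu(x), number=100))
print("Sigmoid:", timeit.timeit(lambda: sigmoid(x), number=100))
```

On typical hardware the ReLU timing comes out noticeably lower, since an elementwise comparison is much cheaper than evaluating the exponential.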