
As far as I understand from watching 3Blue1Brown's video on neural networks, all neurons operate on numbers ranging from 0 to 1. Since a weighted sum can grow larger than that, a sigmoid function is used to squash it back into that interval. How does a ReLU work, then, if it doesn't force numbers into that interval? Or is it only used in networks that operate on numbers that can be greater than 1?
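To make the comparison concrete, here is a small sketch of what I mean (assuming NumPy; the sample weighted sums are just made-up values):

```python
import numpy as np

# Hypothetical weighted sums coming out of a layer (illustrative values only).
z = np.array([-3.0, -0.5, 0.0, 2.0, 7.5])

# Sigmoid squashes every real number into the interval (0, 1).
sigmoid = 1.0 / (1.0 + np.exp(-z))

# ReLU clips negatives to 0 and leaves positive values unchanged,
# so its outputs are not bounded above by 1.
relu = np.maximum(0.0, z)

print(sigmoid)  # all values strictly between 0 and 1
print(relu)     # [0.  0.  0.  2.  7.5]
```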

asked by Yeepsta
    Welcome to Cross Validated! What makes you think that neurons only operate on values between zero and one? (I haven’t seen 3B1B’s video in a long time, so it might be that his entire discussion concerned sigmoid activation functions, which do squash into $(0,1)$.) – Dave Jul 07 '22 at 12:47

0 Answers