
If a gradient points towards a max or a min, what stops gradient descent from maximizing error instead of minimizing it?

Is it the nature of the update step that makes this process one way?

1 Answer


In gradient descent, we follow the negative gradient direction. The gradient points in the direction of steepest increase of the objective function, so stepping against it decreases the objective (for a sufficiently small step size). This is exactly what makes the process one-way: the update rule x ← x − η∇f(x) subtracts the gradient rather than adding it. If you added the gradient instead, you would get gradient ascent, which maximizes the objective.
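A toy sketch of this idea, using f(x) = x² (whose gradient is 2x) and an illustrative learning rate of 0.1:

```python
def gradient_descent(x, lr=0.1, steps=100):
    """Minimize f(x) = x^2 starting from x."""
    for _ in range(steps):
        grad = 2 * x       # gradient of f(x) = x^2
        x = x - lr * grad  # subtract the gradient: move downhill
    return x

print(gradient_descent(5.0))  # converges toward the minimizer x = 0
```

Changing `x - lr * grad` to `x + lr * grad` turns this into gradient ascent, and the iterate moves away from the minimum instead; the sign in the update step is what fixes the direction.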

Haitao Du