
This might be a newbie question, but it is from a newbie.

If a function has multiple local minima and gradient descent converges to different ones depending on where it starts, which local minimum should we pick? Do we keep computing local minima until we find the lowest one? And how do we know whether the set of local minima is even finite?
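To make the brute-force version of this concrete: the sketch below (everything in it is made up for illustration) runs plain gradient descent from many random starting points and keeps the lowest local minimum found. Is this multi-start approach what one is supposed to do?

    import numpy as np

    def f(x):
        # Toy non-convex objective with several local minima.
        return np.sin(3 * x) + 0.1 * x ** 2

    def grad_f(x):
        # Analytic gradient of f.
        return 3 * np.cos(3 * x) + 0.2 * x

    def gradient_descent(x0, lr=0.01, steps=2000):
        # Plain gradient descent: ends up in whichever local minimum
        # the starting point x0 happens to lead to.
        x = x0
        for _ in range(steps):
            x -= lr * grad_f(x)
        return x

    rng = np.random.default_rng(0)
    starts = rng.uniform(-5.0, 5.0, size=20)         # random restarts
    minima = [gradient_descent(x0) for x0 in starts]
    best = min(minima, key=f)                        # keep the lowest one found
    print(f"best x = {best:.3f}, f(x) = {f(best):.3f}")

(I also note that the set of local minima need not be finite: f(x) = sin(x) already has infinitely many.)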

Rahul
  • In some sense, most of the theory of optimization is a response to the issues indicated here. Do you have a specific problem in mind? – whuber Sep 24 '22 at 19:19
  • It depends a bit on what you're trying to achieve, but often you probably want to go for the global minimum. In that case, gradient descent alone is likely not sufficient; you should have a look at global optimisation. – Jonas Sep 24 '22 at 22:39
  • Is using stochastic gradient descent a solution to this? I was reading a book by Géron and some documentation suggesting SGD as a way around the local-minimum issue, but I'm still trying to understand why taking mini-batches actually solves it (a sketch follows below these comments). – Rahul Mar 09 '23 at 17:13
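Regarding the last comment: as far as I understand, mini-batches do not remove local minima from the loss surface; rather, the mini-batch gradient is an unbiased but noisy estimate of the full-batch gradient, and that noise is what can perturb the iterate out of narrow basins. The noise shrinks as the batch grows, which the minimal sketch below illustrates (all data and names are made up, assuming only NumPy):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    X = rng.normal(size=(n, 5))             # toy regression data
    y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=n)
    w = np.zeros(5)                         # point at which gradients are compared

    def grad(idx):
        # Gradient of the mean squared error over the rows in idx.
        Xb, yb = X[idx], y[idx]
        return 2.0 * Xb.T @ (Xb @ w - yb) / len(idx)

    full = grad(np.arange(n))               # full-batch gradient
    for b in (10, 100, 1000):
        devs = [np.linalg.norm(grad(rng.choice(n, size=b, replace=False)) - full)
                for _ in range(200)]
        print(f"batch size {b:4d}: mean deviation from full gradient = {np.mean(devs):.3f}")

The deviation falls roughly as 1 / sqrt(batch size), so smaller batches take noisier steps, and it is that randomness, not the averaging itself, that can help the iterate escape poor local minima.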

0 Answers