
The following is quoted from this post: When is logistic regression solved in closed form?

"In OLS, you have $$ \sum_i (y_i - x_i' \beta)^2 \to \min_\beta, $$ which has the first order conditions $$ -2 \sum_i (y_i - x_i'\beta) x_i = 0 $$"

I'm not sure what this syntax and terminology mean. I think the right arrow means that we want to minimize over $\beta$? And what is the purpose of setting the "first order condition" equal to 0? Sorry for the silly questions.

xojfqa
  • The top syntax is idiosyncratic but its meaning is apparent. The purpose of the first order condition is that it gives a very simple way to find the solution. https://stats.stackexchange.com/questions/1829 explains this generally. "First order" is a concept of differential calculus, which is well worth learning. – whuber Apr 01 '22 at 12:57
  • Hi @xojfqa. You'll see first order conditions (FOCs) used frequently in economics and econometrics. Check out some principles of econometrics texts, and you'll see this type of derivation. But to answer your immediate question, the FOC = 0 reflects the fact that solving optimally leaves nothing left over. – EB3112 Apr 01 '22 at 13:24
  • @EB3112 But is it guaranteed that $-2 \sum_i (y_i - x_i'\beta) x_i = 0$ holds only when we "have nothing left over"? Couldn't we be at a local minimum of the cost function, so the slope is zero despite it being possible to go lower? – xojfqa Apr 01 '22 at 20:30
  • @whuber Is it correct to say that the first order condition here is that the slope is 0? If so, doesn't that assume there is a single global minimum and there are no "valleys" where the slope is 0 despite there being a lower cost value to achieve? – xojfqa Apr 01 '22 at 20:31
  • It assumes nothing: it's just a criterion that must be satisfied at a global minimum. But because the objective function is convex (its Hessian is positive semidefinite), any point where this condition holds is necessarily a global minimum. There may be multiple such points, though. – whuber Apr 01 '22 at 21:45
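The point made in the comments above can be checked numerically: setting the first order condition $-2 \sum_i (y_i - x_i'\beta) x_i = 0$ to zero is equivalent to solving the normal equations $X'X\beta = X'y$, and any solution of that system attains the OLS minimum. Here is a minimal sketch in Python/NumPy (the data and variable names are my own illustrative choices, not from the post):

```python
import numpy as np

# Simulated regression data (arbitrary choices for illustration).
rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.5, -2.0, 0.5])
y = X @ beta_true + rng.normal(size=n)

# The first order condition sum_i (y_i - x_i' beta) x_i = 0
# rearranges to the normal equations X'X beta = X'y.
beta_foc = np.linalg.solve(X.T @ X, X.T @ y)

# Compare against a generic least-squares solver.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose(beta_foc, beta_lstsq))  # the two solutions agree
```

Because the objective $\sum_i (y_i - x_i'\beta)^2$ is convex in $\beta$, the stationary point found by solving the normal equations cannot be a merely local "valley": it is a global minimizer, which is why the direct solve and the least-squares solver agree.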

0 Answers