In the Wikipedia article Ordinary Least Squares there is an example of finding the estimators $\beta_j$ for a linear model of the form:
$$y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \ldots$$
On the calculus side, an objective function such as the sum of squared residuals (or errors) is minimized in order to find the estimators.
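For concreteness, the objective function being minimized can be written out (this is the standard sum-of-squared-residuals form, matching the model above):

$$S(\beta) = \sum_{i=1}^{n} \left( y_i - \beta_0 - \beta_1 x_{i1} - \beta_2 x_{i2} - \ldots \right)^2 = \lVert \mathbf{y} - X\beta \rVert^2$$

The calculus approach then sets each partial derivative $\partial S / \partial \beta_j$ to zero.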
However, nothing like partial derivatives or an objective function appears in the matrix formulation of this problem linked above, nor in a more detailed version here.
Just for completeness, the matrix equation is $X\beta = \mathbf{y}$, from which solving for the $\beta_j$'s seems fairly straightforward.
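To make the comparison concrete, here is a small sketch (with made-up synthetic data) contrasting the two routes: solving the normal equations $(X^{\mathsf T}X)\beta = X^{\mathsf T}\mathbf{y}$ in matrix form, versus minimizing the sum of squared residuals numerically by gradient descent. The data, step size, and iteration count are illustrative assumptions, not part of either derivation.

```python
import numpy as np

# Hypothetical synthetic data: y is generated from a known line plus noise,
# so the fitted coefficients can be sanity-checked against (2.0, 3.0).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=20)
y = 2.0 + 3.0 * x + rng.normal(0.0, 0.5, size=20)

# Design matrix X: a column of ones (for beta_0) next to the regressor x.
X = np.column_stack([np.ones_like(x), x])

# Matrix route: solve the normal equations (X^T X) beta = X^T y directly.
beta_matrix = np.linalg.solve(X.T @ X, X.T @ y)

# Calculus route: minimize S(b) = ||y - Xb||^2 by gradient descent,
# using the gradient -2 X^T (y - Xb).
beta_calc = np.zeros(2)
lr = 5e-4  # step size chosen small enough for this particular X (an assumption)
for _ in range(50_000):
    beta_calc -= lr * (-2.0 * X.T @ (y - X @ beta_calc))

print(beta_matrix)  # close to the true (2.0, 3.0)
print(beta_calc)    # agrees with beta_matrix to many decimals
```

Both routes land on the same coefficients, which is the point of the question: the matrix solution implicitly encodes the minimization that the calculus version performs explicitly.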
Questions
- Why does the calculus version need extra machinery, i.e.:
  - an objective function,
  - taking the partial derivatives?
- Can these two "processes" be found somewhere in the matrix solution of the equation? (This does not seem to be the case.)