
The matrix/vector formulation of OLS is given by

$$\vec \beta = \left( X^T X \right)^{-1} X^T \vec y$$

which is very nice as a matrix equation.
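As a concreteness check, the closed-form expression above can be evaluated directly in NumPy and compared against a least-squares solver (a minimal sketch; the design matrix and response here are simulated for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 2))                          # design matrix
y = X @ np.array([2.0, -1.0]) + rng.normal(scale=0.1, size=20)

# Closed-form OLS: beta = (X^T X)^{-1} X^T y
beta = np.linalg.inv(X.T @ X) @ X.T @ y

# Cross-check against NumPy's least-squares solver
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(beta, beta_lstsq))  # True
```

(In practice one would use `solve` or `lstsq` rather than forming the explicit inverse, but the inverse makes the correspondence with the formula transparent.)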

But what is the equivalent scalar equation for a single entry of $\vec \beta$ in the above equality?

Matrix multiplication is straightforward enough to write as a sum, but I am somewhat at a loss when I get to the matrix inversion. The way I learned matrix inversion was to set up a particular augmented matrix and put it into reduced row echelon form. I don't know how to convert that procedure into an explicit expression. Maybe there isn't one, but I thought I would ask.

Galen
  • A "series expression" of what type, in what variables? For instance, see https://en.wikipedia.org/wiki/Invertible_matrix#Derivative_of_the_matrix_inverse for a power series expansion. As far as explicit formulas go, it comes down to determinants, often called Cramer's Rule. – whuber Aug 08 '23 at 17:20
  • @whuber Your comment has prompted me to realize I don't care about having a series per se, but rather an explicit formula. Something where I can write a given component of $\vec \beta$ as an explicit formula of the entries in $X$ and $\vec y$. – Galen Aug 08 '23 at 17:23
  • 1
    That's exactly what Cramer's rule does: see the beginning of the article I linked to. For some commentary on this general question of explicit regression formulas, please see my post at https://stats.stackexchange.com/a/197788/919. Another approach is to obtain individual coefficients through a sequence of simple formulas. See https://stats.stackexchange.com/a/46508/919 for an account of this (which is equivalent to Gram-Schmidt orthogonalization). – whuber Aug 08 '23 at 17:25
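The Cramer's-rule approach mentioned in the comments can be illustrated numerically: a single coefficient $\beta_j$ solves the normal equations $(X^T X)\vec\beta = X^T\vec y$, so $\beta_j = \det(A_j)/\det(A)$, where $A = X^T X$ and $A_j$ is $A$ with its $j$-th column replaced by $X^T\vec y$. A minimal sketch (simulated data, names chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = rng.normal(size=50)

A = X.T @ X          # normal-equations matrix
b = X.T @ y          # right-hand side

# Cramer's rule: beta_j = det(A_j) / det(A), where A_j is A with
# its j-th column replaced by b.
beta_cramer = np.empty(3)
for j in range(3):
    Aj = A.copy()
    Aj[:, j] = b
    beta_cramer[j] = np.linalg.det(Aj) / np.linalg.det(A)

# Compare against solving the normal equations directly
beta_solve = np.linalg.solve(A, b)
print(np.allclose(beta_cramer, beta_solve))  # True
```

Expanding those determinants in the entries of $X$ and $\vec y$ gives the fully explicit scalar formula asked about, though it grows rapidly with the number of predictors.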
