11

Is there a speedier way to calculate standard errors for linear regression problems than by inverting $X'X$? Here I assume we have the regression:

$$y=X\beta+\varepsilon,$$

where $X$ is an $n\times k$ matrix and $y$ is an $n\times 1$ vector.

For finding the least squares solution it is impractical to form $X'X$; you can apply a QR or SVD decomposition to the matrix $X$ directly, or alternatively use gradient methods. But what about standard errors? We really only need the diagonal of $(X'X)^{-1}$ (and, naturally, the LS solution to estimate the standard error of $\varepsilon$). Are there any specific methods for standard error calculation?
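For concreteness, the standard errors in question are the usual OLS ones,

$$\widehat{\operatorname{se}}(\hat\beta_j)=\hat\sigma\,\sqrt{\left[(X'X)^{-1}\right]_{jj}},\qquad \hat\sigma^2=\frac{\|y-X\hat\beta\|^2}{n-k},$$

so everything reduces to the diagonal of $(X'X)^{-1}$ and the residual sum of squares.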

mpiktas
  • 265
  • 1
  • 8

1 Answer

5

Let's suppose that you solved your least squares problem using the singular value decomposition (SVD) of $X$, given by

$$X = U\Sigma V',$$

where $U$ and $V$ are unitary, and $\Sigma$ is diagonal.

Then

$$X'X = V \Sigma^2 V'.$$

$(X'X)^{-1}$ exists iff $X$ is full rank (or has strictly positive singular values), in which case

$$(X'X)^{-1} = V \Sigma^{-2} V'.$$

(See an answer I gave to a related question on Math.SE.)

If you already have $\Sigma$ and $V$, calculating $(X'X)^{-1}$ requires inverting and squaring a diagonal matrix ($n$ operations for an $n \times n$ matrix), scaling the columns (or rows) of a matrix ($n^2$ operations), and a single matrix multiply (unfortunately $\mathcal{O}(n^{3})$). This method will be well-behaved numerically.
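A minimal sketch of that computation with NumPy, assuming a thin SVD from `numpy.linalg.svd` (the test matrix and sizes are purely illustrative):

```python
import numpy as np

# Illustrative sketch: form (X'X)^{-1} from a thin SVD of X.
# Assumes X has full column rank (all singular values strictly positive).
n, k = 200, 5
rng = np.random.default_rng(0)
X = rng.standard_normal((n, k))

U, s, Vt = np.linalg.svd(X, full_matrices=False)  # X = U @ diag(s) @ Vt

# Scale the rows of V' by 1/sigma_i^2, then do one matrix multiply:
XtX_inv = Vt.T @ (Vt / s[:, None] ** 2)

# Sanity check against the direct inverse.
assert np.allclose(XtX_inv, np.linalg.inv(X.T @ X))
```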

There are fast methods for obtaining the diagonal elements of the inverse of a sparse matrix (see work by Yousef Saad's group and by Lin Lin et al.). However, in your case, $X'X$ is probably not sparse (even if $X$ is), and even if it were, it would likely be ill-conditioned enough that these fast methods would yield inaccurate results.

Geoff Oxberry
  • 30,394
  • 9
  • 64
  • 127
  • +1, I forgot about that nice SVD property. If no other answers come, I'll accept this one, since it is pretty close to the answer I wanted to get (and certainly orders of magnitude better than the one I expected to get :)) – mpiktas Feb 20 '12 at 19:39
  • I should add that if you only want to calculate the diagonal elements of $(X'X)^{-1}$, then the process is $\mathcal{O}(n^2)$ if you already have the SVD of $X$. I can write out a summation formula in my answer if you like. – Geoff Oxberry Feb 22 '12 at 01:21
  • There isn't a reference; it's just expanding out a formula for the diagonal elements of $(X'X)^{-1}$ in terms of element-wise summations. – Geoff Oxberry Feb 22 '12 at 04:04
  • I can work it out no problem. The row sums of V divided by square of diagonal of $\Sigma$. Neat. – mpiktas Feb 22 '12 at 06:42
  • Ignore my last comment; there is an error in it. I did get the correct formula, though (see the sketch after these comments). – mpiktas Feb 22 '12 at 07:55
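For reference, writing out the diagonal-only formula discussed in the comments gives $[(X'X)^{-1}]_{jj} = \sum_i V_{ji}^2/\sigma_i^2$, which follows directly from $(X'X)^{-1} = V\Sigma^{-2}V'$. A minimal NumPy sketch (the test data and variable names are illustrative), including the standard errors asked about in the question:

```python
import numpy as np

# Illustrative test problem.
n, k = 200, 5
rng = np.random.default_rng(0)
X = rng.standard_normal((n, k))
y = rng.standard_normal(n)

U, s, Vt = np.linalg.svd(X, full_matrices=False)  # X = U @ diag(s) @ Vt

# O(k^2) diagonal of (X'X)^{-1}: row sums of V squared, weighted by 1/sigma_i^2.
diag_XtX_inv = (Vt.T ** 2 / s ** 2).sum(axis=1)

# Least squares solution and OLS standard errors.
beta_hat = Vt.T @ ((U.T @ y) / s)
sigma2_hat = np.sum((y - X @ beta_hat) ** 2) / (n - k)
se = np.sqrt(sigma2_hat * diag_XtX_inv)

# Sanity check against the direct inverse.
assert np.allclose(diag_XtX_inv, np.diag(np.linalg.inv(X.T @ X)))
```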