A common problem in statistics is computing the inverse square root of a symmetric positive definite matrix. What would be the most efficient way of computing it?
I came across some literature (which I haven't read yet) and some incidental R code, which I reproduce here for convenience:
# function to compute the inverse square root of a matrix
fnMatSqrtInverse = function(mA) {
  ei = eigen(mA)
  d = ei$values
  d = (d + abs(d)) / 2
  d2 = 1 / sqrt(d)
  d2[d == 0] = 0
  return(ei$vectors %*% diag(d2) %*% t(ei$vectors))
}
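As a quick sanity check (my own toy example, not from the code's source), the result should square to the inverse of the input:

```r
# Small symmetric positive definite test matrix
mA = matrix(c(4, 1, 1, 3), nrow = 2)
mS = fnMatSqrtInverse(mA)
# mS %*% mS should agree with solve(mA), the inverse of mA,
# up to floating-point roundoff
max(abs(mS %*% mS - solve(mA)))
```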
I am not entirely sure I understand the line d = (d + abs(d)) / 2. Is there a more efficient way of computing the inverse matrix square root? The R eigen function calls LAPACK.
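Experimenting with that expression on a small vector (values chosen by me for illustration), it appears to clamp negative entries to zero while leaving nonnegative ones unchanged:

```r
# Hypothetical eigenvalues, including a tiny negative one from roundoff
d = c(2.5, 1e-16, -3e-17)
# (d + abs(d)) / 2: negative entries become 0, nonnegative entries pass through
(d + abs(d)) / 2
```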
Do you need all the entries of the matrix $A^{-1/2}$, or is it sufficient to be able to multiply $A^{-1/2}$ by an arbitrary vector $x$?
– Daniel Shapero Dec 21 '13 at 22:06

That line could be replaced with d[d < 0] = 0, which is more expressive. – Federico Poloni Apr 14 '14 at 07:25