
I am following Computational Statistics by Givens and Hoeting, and after looking at the supplementary code, I find that:

$$l''(\beta) = -Z^TWZ$$ where $Z$ is the matrix whose columns are the covariates, and $W$ is the diagonal matrix whose $i$th diagonal entry is $\pi_i(1-\pi_i)$.
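This is not the book's code, but a minimal R sketch of that formula under my own assumptions (z is an n-by-2 design matrix whose first column is all ones, and pi_hat is a placeholder for the fitted probabilities $\pi_i$):

set.seed(1)
n <- 10
z <- cbind(1, rnorm(n))          # n x 2 design matrix: intercept column plus one covariate
pi_hat <- runif(n)               # placeholder fitted probabilities, standing in for pi_i
w <- pi_hat * (1 - pi_hat)       # diagonal entries of W
W <- diag(w)                     # n x n diagonal weight matrix
-t(z) %*% W %*% z                # the matrix form -Z'WZ, a 2 x 2 matrix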

The code is provided as follows:

hessian = matrix(0,2,2)                        # 2 x 2 Hessian (intercept and slope)
hessian[1,1] = -sum(w)                         # w: the weights pi_i(1 - pi_i)
hessian[1,2] = hessian[2,1] = -t(w)%*%z[,2]    # z[,2]: the covariate column of Z
hessian[2,2] = -t(w)%*%(z[,2]^2)
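As a quick numerical check (my own, not part of the book's code), running the lines above with the z and w from the sketch after the formula reproduces the matrix form:

all.equal(hessian, -t(z) %*% diag(as.vector(w)) %*% z)   # should return TRUE (up to floating-point tolerance)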

Reference to the author's text for the code: Computational Statistics. I cannot understand where the author gets the expressions for hessian[1,2] and hessian[2,2].

  • Evidently the first covariate is the intercept term where the column of $Z$ is all ones. – whuber Nov 19 '22 at 17:16
  • @whuber Sure, I have already understood that part. However, I was interested in knowing how the author decided on the equations for the Hessian. I had assumed that hessian[1,1] and hessian[2,2] were going to be equal. I do not understand why, in hessian[2,2], the $Z$ is squared. – Emil11 Nov 19 '22 at 18:56
  • I think perhaps you haven't understood that, because it explains everything. In effect, z[, 1] disappears because it always equals $1.$ The rest follows from the rules of matrix multiplication. I'm making some guesses here because the appearance of t(w) for an allegedly diagonal matrix is strange due to the superfluous call to t. I am supposing w is an $n\times 1$ column matrix of the diagonal entries, not a square diagonal matrix. – whuber Nov 19 '22 at 19:46
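For completeness, here is the entrywise expansion described in the last comment (my own working, taking the first column of $Z$ to be all ones and writing $w_i = \pi_i(1-\pi_i)$):

$$-Z^TWZ = -\begin{pmatrix} \sum_i w_i & \sum_i w_i z_{i2} \\ \sum_i w_i z_{i2} & \sum_i w_i z_{i2}^2 \end{pmatrix},$$

which matches hessian[1,1], hessian[1,2] = hessian[2,1], and hessian[2,2] in the code above.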

0 Answers