
Assume we have a true model of

$$Y=X\beta+\varepsilon,$$

where $Y$ is some outcome, $X$ is a $1\times p$ vector of covariates with a (non-diagonal) variance-covariance matrix $\Omega$, $\beta$ is a $p\times 1$ vector of parameters, and $\varepsilon$ is a residual independent of $X$. We further assume that the outcome $Y$ and each covariate $X_i$ have mean zero and variance one.

I have the results of $p$ simple regressions of the form

$$Y=X_i b_i + e_i$$

where $X_i$ is the $i$-th element of $X$ and $b_i$ is the associated parameter of the simple regression model. (Note that under the standardization above, the OLS estimate of $b_i$, which we define as $\hat{b}_i$, is just the sample covariance of $X_i$ and $Y$.)

I am interested in

$$Cov(\hat{b}_i, \hat{b}_j)$$

for any $i$ and $j$. Note that this is different from the covariance matrix of $\hat{\beta}$ (i.e., the estimates of a multiple regression model controlling for all covariates simultaneously).
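For concreteness, here is a small Monte Carlo sketch of the quantity I mean, under an assumed $p=3$, a hypothetical $\Omega$ and $\beta$, and standard normal errors (I do not rescale $Y$ to variance one here, which doesn't affect which quantity is being estimated):

```python
import numpy as np

rng = np.random.default_rng(0)

p, n, reps = 3, 500, 2000
beta = np.array([1.0, 0.5, -0.5])  # hypothetical true coefficients

# Assumed non-diagonal covariance for X (unit diagonal, per the standardization)
Omega = np.array([[1.0, 0.3, 0.2],
                  [0.3, 1.0, 0.4],
                  [0.2, 0.4, 1.0]])
L = np.linalg.cholesky(Omega)

bhats = np.empty((reps, p))
for r in range(reps):
    X = rng.standard_normal((n, p)) @ L.T   # covariates with Cov(X) = Omega
    Y = X @ beta + rng.standard_normal(n)   # true model Y = X beta + eps
    # Simple-regression slope of Y on each X_i separately:
    # b_i = sample Cov(X_i, Y) / sample Var(X_i)
    bhats[r] = (X * Y[:, None]).mean(axis=0) / (X ** 2).mean(axis=0)

# Empirical Cov(b_hat_i, b_hat_j) across simulation replications
C = np.cov(bhats, rowvar=False)
print(C)
```

Each simple slope converges to $(\Omega\beta)_i$ (since $\operatorname{Var}(X_i)=1$), and the matrix `C` is the object I would like a closed-form expression for.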

Is there an easy way to derive this?

Patrick
