
A classical result for least squares regression in statistics states that if $(Y, X)$ follows a model where $$\mathbb{E}[Y\mid X=x] = \alpha + \beta x,$$ we can estimate $\beta$ from a random sample explicitly as the second component of $\hat{\beta} = (\textbf{X}^t\textbf{X})^{-1}\textbf{X}^t\textbf{Y}$, where $\textbf{X}$ is the design matrix whose columns are a column of ones and the observations $X_i$.
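As a minimal numerical sketch of this formula (the simulated data and variable names here are purely illustrative): build the design matrix with a column of ones and a column of $X_i$, then solve the normal equations. With noiseless data the estimates recover $(\alpha, \beta)$ exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 + 3.0 * x  # noiseless data with alpha = 2, beta = 3

# Design matrix: column of ones and the X_i
X = np.column_stack([np.ones_like(x), x])

# Solve the normal equations (X^t X) beta_hat = X^t Y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
# beta_hat[0] is alpha_hat, beta_hat[1] is beta_hat
```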

My question is the following: Let $(Y, X, Z)$ follow a model $$\mathbb{E}[Y\mid Z=z, X=x] = (\alpha x) + (\beta x) z.$$

Consider a random sample from $(Y, X, Z)$. Can we estimate $\beta$ and give an explicit expression for the estimator $\hat{\beta}$?

  • It is challenging to search this site for a formula -- but in this case, because the formula appears in over ten thousand posts, it's not hard to find! This restricted search worked for me. You might also be interested in https://stats.stackexchange.com/questions/148638. – whuber Jan 31 '24 at 14:47
  • I don't think it's fair to call this problem a duplicate. I think OP is probably a little unsure how to work the two predictors $X, Z$ into this model. What OP needs to realise is that $Y=\alpha X+\beta XZ$ is still linear in terms of the parameters $\theta=(\alpha, \beta)$. The formula $\hat{\theta}=(\mathbf{X}^T\mathbf{X})^{-1}\mathbf{X}^T\mathbf{Y}$ can still be used, just with $x_i$ and $x_iz_i$ as the columns of $\mathbf{X}$. – cambridgecircus Jan 31 '24 at 15:01
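The approach suggested in the last comment can be sketched numerically (a minimal illustration with simulated data; the coefficient values are arbitrary): the design matrix has columns $x_i$ and $x_i z_i$, and no intercept column, since the model has no free intercept term.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200)
z = rng.normal(size=200)
y = 1.5 * x + (-0.5) * x * z  # noiseless data with alpha = 1.5, beta = -0.5

# Design matrix: columns x_i and x_i * z_i (no intercept column)
X = np.column_stack([x, x * z])

# Ordinary least squares via the normal equations
theta_hat = np.linalg.solve(X.T @ X, X.T @ y)
# theta_hat[0] estimates alpha, theta_hat[1] estimates beta
```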

0 Answers