
Here are two captures from different YouTube videos. One derives $Var(\hat{\beta}_1)$ and the other derives $Var(\hat{\beta}_1 \mid X)$, yet they arrive at the same equation, even though the second video further comments that $Var(\hat{\beta}_1 \mid X) \neq Var(\hat{\beta}_1)$.


capture 1 video

capture 2 video

LJNG

1 Answer


In both videos, the speakers caution that $X$ is assumed fixed. When $X$ is assumed fixed, there is no variation in $X$, and so $Var(\hat{\beta}_1 \mid X) = Var(\hat{\beta}_1)$. So in short, the answer to your question is "yes."

When the term "simple linear regression" is used, that assumption is normally invoked (whether implicitly or explicitly). The Wikipedia article on OLS provides a decent summary of this assumption: https://en.wikipedia.org/wiki/Ordinary_least_squares#Assumptions
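A quick Monte Carlo sketch can make the distinction concrete. The design values, error standard deviation, and replication counts below are illustrative assumptions, not from the videos. Holding the same $x$ values fixed across replications estimates $Var(\hat{\beta}_1 \mid X) = \sigma^2 / \sum_i (x_i - \bar{x})^2$; redrawing $X$ each replication estimates the unconditional $Var(\hat{\beta}_1)$, which averages the conditional variance over the distribution of $X$:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                 # error standard deviation (assumed)
beta0, beta1 = 1.0, 3.0     # true coefficients (assumed)
n, reps = 50, 20000

def ols_slope(x, y):
    """OLS slope: sum((x - xbar) * (y - ybar)) / sum((x - xbar)^2)."""
    xc = x - x.mean()
    return (xc * (y - y.mean())).sum() / (xc ** 2).sum()

# Fixed-X design: the same x values are reused in every replication,
# so the empirical variance of the slope estimates Var(beta1_hat | X).
x_fixed = rng.uniform(0, 10, size=n)
slopes_fixed = np.array([
    ols_slope(x_fixed, beta0 + beta1 * x_fixed + rng.normal(0, sigma, n))
    for _ in range(reps)
])
theory = sigma ** 2 / ((x_fixed - x_fixed.mean()) ** 2).sum()
print(slopes_fixed.var(), theory)   # these two should be close

# Random-X design: x is redrawn each replication, so the empirical
# variance estimates the unconditional Var(beta1_hat).
def one_rep():
    x = rng.uniform(0, 10, size=n)
    return ols_slope(x, beta0 + beta1 * x + rng.normal(0, sigma, n))

slopes_random = np.array([one_rep() for _ in range(reps)])
print(slopes_random.var())
```

Under the fixed-$X$ design the simulated variance tracks the textbook formula; under the random-$X$ design it reflects the additional averaging over realizations of $X$, which is the sense in which the two quantities differ.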

bschneidr
  • Thank you for your reply. But the second video also asserts that $Var(\hat{\beta}_1 \mid X) \neq Var(\hat{\beta}_1)$. If $X$ is random, what is $Var(\hat{\beta}_1)$? – LJNG Jun 07 '22 at 01:00