I just wonder what is the meaning or intuition behind $\sqrt{n}$ before $(\hat\beta-\beta)$ when we talk about asymptotic normality. Where does $\sqrt{n}$ come from?
$\sqrt{n}(\hat\beta-\beta)\xrightarrow{ d }N(0,\sigma^2)$
It "comes from" the central limit theorem, see What intuitive explanation is there for the central limit theorem?. It turns out that for many, although clearly not all estimators, scaling the estimation error $\hat\beta-\beta$ by $\sqrt{n}$ yields a nondegenerate asymptotic normal distribution.
See root-n consistent estimator, but root-n doesn't converge? for further discussion.
See Estimation of unit-root AR(1) model with OLS for an example of an estimator that does not converge at the $\sqrt{n}$ rate.
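A rough sketch (again assuming numpy) of that unit-root case: in $y_t = y_{t-1} + e_t$ with $\rho$ estimated by OLS, $\sqrt{n}(\hat\rho-1)$ collapses to zero while $n(\hat\rho-1)$ keeps a stable (non-normal) spread, illustrating an estimator that converges faster than the $\sqrt{n}$ rate.

```python
# Sketch: OLS estimate of rho in a random walk (true rho = 1, no intercept).
# sd(sqrt(n)*(rho_hat-1)) shrinks with n, while sd(n*(rho_hat-1)) stabilizes.
import numpy as np

rng = np.random.default_rng(1)
n_reps = 5_000

for n in (50, 200, 800, 3_200):
    stats_sqrt_n, stats_n = [], []
    for _ in range(n_reps):
        y = np.cumsum(rng.normal(size=n + 1))        # random walk of length n+1
        y_lag, y_cur = y[:-1], y[1:]
        rho_hat = (y_lag @ y_cur) / (y_lag @ y_lag)  # OLS slope without intercept
        stats_sqrt_n.append(np.sqrt(n) * (rho_hat - 1.0))
        stats_n.append(n * (rho_hat - 1.0))
    print(f"n={n:>5}  sd(sqrt(n)*(rho_hat-1))={np.std(stats_sqrt_n):.4f}  "
          f"sd(n*(rho_hat-1))={np.std(stats_n):.4f}")
```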