
For the linear regression $y_t = \beta x_t + e_t$ with the assumptions $E(e_t)=0$, $E(e_t^2) = \sigma^2$, and $E(e_t e_s)= 0$ for $s\neq t$:

a) Suppose $x_t = \frac{1}{t}$ for all $t$. Is $\hat{\beta}$ unbiased and consistent or not?

b) Suppose $x_t = \frac{1}{2}$ for all $t$. Show whether $\hat{\beta}$ is unbiased or not, and whether $\hat{\beta}$ is consistent for $\beta$ or not.

For point a) I state that the estimator is unbiased because $$E(\hat{\beta}) = \beta + E\left(\frac{\sum_{t=1}^{T} \frac{1}{t}e_t}{\sum_{t=1}^{T} \frac{1}{t^2}}\right) = \beta + \frac{\sum_{t=1}^{T} \frac{1}{t}E(e_t)}{\sum_{t=1}^{T} \frac{1}{t^2}} = \beta.$$ Then I checked whether the variance converges to $0$ to verify consistency: \begin{align*} Var(\hat{\beta}) &= Var\left(\frac{\sum_{t=1}^{T} \frac{1}{t}e_t}{\sum_{t=1}^{T} \frac{1}{t^2}}\right) \\ &= \frac{1}{\left(\sum_{t=1}^{T} \frac{1}{t^2}\right)^2} \cdot \sum_{t=1}^{T} \frac{1}{t^2} Var(e_t) \\ &= \frac{\sigma^2}{\sum_{t=1}^{T} \frac{1}{t^2}}. \end{align*}

Since $\sum_{t=1}^T \frac{1}{t^2} < \infty$, the denominator stays finite as $T \to \infty$, so the variance does not converge to $0$, and therefore $\hat{\beta}$ is not consistent. Is my attempt correct?
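As a quick numerical sanity check (a simulation sketch, not part of the proof; the values $\beta = 2$, $\sigma = 1$, and normal errors are arbitrary choices of mine), the Monte Carlo variance of $\hat{\beta}$ indeed plateaus near $6\sigma^2/\pi^2 \approx 0.608$ instead of shrinking as $T$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, beta, reps = 1.0, 2.0, 5000  # arbitrary illustrative values

for T in (10, 100, 1000):
    t = np.arange(1, T + 1)
    x = 1.0 / t                                  # regressor x_t = 1/t
    e = rng.normal(0.0, sigma, size=(reps, T))   # i.i.d. errors (one row per replication)
    y = beta * x + e
    # OLS through the origin: beta_hat = sum(x_t y_t) / sum(x_t^2)
    beta_hat = (y * x).sum(axis=1) / (x ** 2).sum()
    # mean stays near beta (unbiased); variance stabilizes near 6*sigma^2/pi^2
    print(T, beta_hat.mean(), beta_hat.var())
```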

For point b) I state that the estimator is unbiased for the same reason as in point a). How can I show whether $\hat{\beta}$ is consistent or not in this case?
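As an exploration (again only a simulation sketch with my own arbitrary choices $\beta = 2$, $\sigma = 1$, and normal errors, not a proof), the sampling variance of $\hat{\beta}$ in this case seems to shrink like $4\sigma^2/T$, but I do not know how to turn this into a formal argument:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, beta, reps = 1.0, 2.0, 5000  # arbitrary illustrative values

for T in (10, 100, 1000):
    x = np.full(T, 0.5)                          # regressor x_t = 1/2 for all t
    e = rng.normal(0.0, sigma, size=(reps, T))   # i.i.d. errors (one row per replication)
    y = beta * x + e
    beta_hat = (y * x).sum(axis=1) / (x ** 2).sum()
    # variance appears to behave like sigma^2 / (T/4) = 4*sigma^2/T
    print(T, beta_hat.mean(), beta_hat.var())
```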

Any help would be really appreciated

WNZ
  • What, exactly, is your definition of a "consistent" estimator? The setting of (a) does not conform with most of the definitions I have seen: there is no sequence of independent samples of a distribution in evidence. As far as (b) goes, that looks easier than (a) (but beware the typographical errors in your account of (a) -- maybe they are more than typographical?). What is the obstacle to performing the same kind of analysis? – whuber Jul 05 '23 at 19:31
  • for consistency I mean $\hat{B}$ goes in probability to $\beta$. I do not know how to prove this. I tried to demonstrate what happens when T goes to infinity for variance showing mean-square error convergence because I think should also imply convergence in probability, maybe this is not the correct approach – WNZ Jul 06 '23 at 07:18
  • Consider comparing $(1/1)e_1,$ with a variance of $\sigma^2,$ to $\sum_{t=2}^T (1/t)e_t,$ with a variance not exceeding $(\zeta(2)-1)\sigma^2\approx 0.645\sigma^2.$ What does this reveal about the distribution of $\hat\beta$? – whuber Jul 06 '23 at 13:20

1 Answer


Since there are situations where an estimator satisfies the criterion for asymptotic consistency, but still has a non-zero variance at the limit (look around this site to find relevant discussions, e.g. Asymptotic consistency with non-zero asymptotic variance - what does it represent?),

one would want to examine directly the probability at the heart of the asymptotic consistency criterion.

For $\hat \beta_T$ to be consistent we would want

$$\Pr\left(|\hat \beta_T - \beta| > \epsilon\right) \to 0, \;\;\; \forall \epsilon >0.$$

Now,

$$\Pr\left(|\hat \beta_T - \beta| > \epsilon\right) = \Pr\left(\frac{\left|\sum_{t=1}^{T} \frac{1}{t}e_t\right|}{\sum_{t=1}^{T} \frac{1}{t^2}} > \epsilon\right)$$

and because, indeed, the denominator converges (to $\pi^2/6$) as $T\to \infty$, the issue is what happens with $$\left|\sum_{t=1}^{T} \frac{1}{t}e_t \right|\;\;\; {\rm as} \;\;\; T\to \infty.$$

We would want it to go to zero in probability. Since $$\left|\sum_{t=1}^{T} \frac{1}{t}e_t \right| \to 0 \;\; \text{in probability} \iff \sum_{t=1}^{T} \frac{1}{t}e_t \to 0 \;\; \text{in probability},$$

we can drop the absolute value and study the sum itself. Now, assuming that the $e_t$ random variables are independent, and since $1/t$ is deterministic and $E(e_t) = 0$, we have

$$E\left(\sum_{t=1}^{T} \frac{1}{t}e_t\right) = 0,\;\;\; {\rm Var}\left(\sum_{t=1}^{T} \frac{1}{t}e_t\right) = {\rm Var}(e_t)\sum_{t=1}^{T} \frac{1}{t^2} \to \frac{\pi^2}{6}{\rm Var}(e_t).$$

So $\sum_{t=1}^{T} (e_t/t)$ converges (in mean square) to a non-degenerate random variable with positive variance, which means its absolute value does not go to zero in probability. Hence the probability criterion for asymptotic consistency of $\hat \beta_T$ is not satisfied.
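A small simulation illustrates this (a sketch under the additional assumption of i.i.d. standard normal errors, which the question does not require): the variance of the partial sums $S_T = \sum_{t \le T} e_t/t$ stabilizes near $\pi^2/6 \approx 1.645$ rather than going to zero, so $S_T$ settles into a non-degenerate limiting distribution.

```python
import numpy as np

rng = np.random.default_rng(2)
reps = 20000  # Monte Carlo replications

for T in (10, 100, 10000):
    e = rng.normal(0.0, 1.0, size=(reps, T))         # i.i.d. N(0,1) errors
    S = (e / np.arange(1, T + 1)).sum(axis=1)        # S_T = sum_{t<=T} e_t / t
    # variance approaches pi^2/6 ≈ 1.645 instead of 0
    print(T, S.var())
```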