
Suppose we have the following linear regression model:

$$y = \beta_0 + \beta_1x_1 + \beta_2x_2 + \beta_3x_3 + u$$

and we want to estimate the ratio of the slope coefficients:

$$\theta = \frac{\beta_1}{\beta_2}$$

Would the following estimator be biased for $\theta$?

$$\hat{\theta} = \frac{\hat\beta_1}{\hat\beta_2}$$

We know that if the usual assumptions for a linear regression model are satisfied, then the slope estimators are unbiased. So if we take the ratio of two unbiased estimators, is the resulting estimator also unbiased?
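A quick Monte Carlo sketch can probe this directly (all parameter values below are made up for illustration): simulate the model many times, compute $\hat\beta_1/\hat\beta_2$ in each replication, and compare the average with the true $\theta$.

```python
# Monte Carlo sketch: does the mean of beta1_hat / beta2_hat equal beta1 / beta2?
# All parameter choices here are illustrative, not from the question.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 20, 20_000
beta = np.array([1.0, 1.0, 2.0, -1.0])  # beta0, beta1, beta2, beta3 (hypothetical)
theta = beta[1] / beta[2]               # true ratio of slope coefficients

ratios = np.empty(reps)
for r in range(reps):
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
    y = X @ beta + rng.normal(size=n)            # errors satisfy the usual assumptions
    b = np.linalg.lstsq(X, y, rcond=None)[0]     # OLS estimates
    ratios[r] = b[1] / b[2]

print("true theta:", theta, "  simulated mean of ratio:", ratios.mean())
```

With the denominator coefficient chosen well away from zero, the simulated mean sits near, but not exactly at, $\theta$; pushing $\beta_2$ toward zero makes both the bias and the variance of the ratio blow up.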

  • No, see https://en.wikipedia.org/wiki/Jensen%27s_inequality, which implies that $E(1/X)\neq 1/E(X)$. – Christoph Hanck Jan 31 '19 at 16:36
  • @Xi'an How are the slope estimates not independent? I can't see how they would be positively or negatively correlated with each other. – Robin Liao Jan 31 '19 at 23:44
  • It is well-known (you'll also find plenty of hits on this site) that $Var(\hat\beta)=\sigma^2(X'X)^{-1}$, which, in general, is not a diagonal matrix. – Christoph Hanck Feb 01 '19 at 05:36
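The covariance the last comment refers to is easy to inspect numerically. This sketch (with made-up regressors) builds $\sigma^2(X'X)^{-1}$ for one design matrix and prints the off-diagonal entry coupling $\hat\beta_1$ and $\hat\beta_2$:

```python
# Sketch with hypothetical data: the OLS covariance matrix sigma^2 (X'X)^{-1}
# generally has nonzero off-diagonal entries, so the slope estimators covary.
import numpy as np

rng = np.random.default_rng(1)
n = 50
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)  # correlated regressors make this vivid
x3 = rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2, x3])

sigma2 = 1.0  # assumed error variance
cov_beta = sigma2 * np.linalg.inv(X.T @ X)
print("Cov(beta1_hat, beta2_hat):", cov_beta[1, 2])  # nonzero in general
```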

1 Answer


No, it will not be unbiased (unless the estimator in the denominator has zero variance), and it does not help if the numerator and denominator are independent. In general, if $\hat{\theta}$ is an unbiased estimator of $\theta$ and $g$ is some nonlinear function, it is only in rare cases that $g(\hat{\theta})$ is an unbiased estimator of $g(\theta)$.
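The Jensen-style point from the comments can be checked numerically. Here is a minimal sketch using an assumed $X \sim \mathrm{Uniform}(1,3)$, for which $E(1/X) = \ln(3)/2 \approx 0.549$ while $1/E(X) = 0.5$:

```python
# Numerical check (assumed distribution): for positive X, Jensen's inequality
# gives E(1/X) > 1/E(X), since 1/x is convex on the positive reals.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(1.0, 3.0, size=1_000_000)  # positive, bounded away from 0

print("E(1/X)  :", np.mean(1.0 / x))   # close to ln(3)/2 ~ 0.549
print("1/E(X)  :", 1.0 / np.mean(x))   # close to 0.5
```

The same convexity effect is why dividing by $\hat\beta_2$ injects bias even when $\hat\beta_2$ itself is unbiased.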

There is more information in this related post: Test Statistic for a ratio of regression coefficients?.