
The original problem is: given $Y_i = \beta X_i + \epsilon_i$, $i = 1, 2, \dots, n$, where the $X_i \sim N(\mu, \tau^2)$ are iid, the $\epsilon_i \sim N(0, \sigma^2)$ are iid, and the $X_i$ and $\epsilon_i$ are independent, what are the expectation and variance of $\frac{\sum Y_i}{\sum X_i}$?

$\sum Y_i \sim N(n\beta \mu, n(\beta^2 \tau^2 + \sigma^2))$

$\sum X_i \sim N(n\mu, n\tau^2)$

$\sum \epsilon_i \sim N(0, n\sigma^2)$
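(For reference, the variance in the first line follows from the independence of $X_i$ and $\epsilon_i$: $$ Var(Y_i) = Var(\beta X_i + \epsilon_i) = \beta^2 Var(X_i) + Var(\epsilon_i) = \beta^2 \tau^2 + \sigma^2, $$ and summing $n$ iid terms multiplies both the mean and the variance by $n$.)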

Since $E(\epsilon_i) = 0$ and the $\epsilon_i$ are independent of the $X_i$, I first found $$ E\left(\frac{\sum Y_i}{\sum X_i}\right) = E\left(\frac{\sum (\beta X_i + \epsilon_i)}{\sum X_i}\right) = E\left(\frac{\beta \sum X_i + \sum \epsilon_i}{\sum X_i}\right) = E\left(\beta + \frac{\sum \epsilon_i}{\sum X_i}\right) = \beta + E\left(\frac{\sum \epsilon_i}{\sum X_i}\right) = \beta + E\left(\sum \epsilon_i\right)\cdot E\left(\frac{1}{\sum X_i}\right) = \beta. $$

I tried a similar approach for the variance and got $$ Var\left(\frac{\sum Y_i}{\sum X_i}\right) = Var\left(\beta + \frac{\sum \epsilon_i}{\sum X_i}\right) = Var\left(\frac{\sum \epsilon_i}{\sum X_i}\right), $$ but I haven't been able to manipulate it further to find an answer. I tried using the Law of Total Variance to get $$ Var\left(\frac{\sum \epsilon_i}{\sum X_i}\right) = E\left(\frac{1}{\sum X_i}\right)^2 Var\left(\sum \epsilon_i\right) + Var\left(\frac{1}{\sum X_i}\right) E\left(\left(\sum \epsilon_i\right)^2\right). $$

I then tried to find the distribution of $Z = \frac{1}{\sum X_i}$ by transforming the density of $\sum X_i$, but I couldn't find a way to integrate it. Does anyone have advice on where I can go from here?
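In case it helps, here is a quick Monte Carlo sketch I would use to sanity-check the ratio numerically. The parameter values ($\mu = 2$, $\tau = 1$, $\sigma = 1$, $\beta = 3$, $n = 10$) are made up purely for illustration and are not part of the problem:

```python
import numpy as np

# Quick sanity check of sum(Y_i) / sum(X_i) by simulation.
# All parameter values below are made up for illustration only.
mu, tau, sigma, beta, n = 2.0, 1.0, 1.0, 3.0, 10
reps = 100_000

rng = np.random.default_rng(0)
X = rng.normal(mu, tau, size=(reps, n))        # X_i ~ N(mu, tau^2), iid
eps = rng.normal(0.0, sigma, size=(reps, n))   # eps_i ~ N(0, sigma^2), iid
Y = beta * X + eps                             # Y_i = beta * X_i + eps_i

ratio = Y.sum(axis=1) / X.sum(axis=1)          # one draw of sum(Y)/sum(X) per row

# Empirical summaries; these may not settle down as reps grows,
# because sum(X_i) is normal and can come arbitrarily close to zero.
print("empirical mean:", ratio.mean())
print("empirical var :", ratio.var())
```

The number of replications and the parameter values are arbitrary; the point is only to have something concrete to compare a closed-form answer against.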

  • The ratio of two independent (zero-mean) normal random variables is a Cauchy random variable, which doesn't have an expected value at all, and so I would be very careful about your calculations. – Dilip Sarwate Feb 29 '24 at 22:16
  • This may be relevant: https://stats.stackexchange.com/questions/70045/mean-and-variance-of-the-reciprocal-of-a-random-variable – jbowman Feb 29 '24 at 22:23
  • If the original problem is not intended as a trick question, then perhaps it was asking for the variance of the ratio conditional on the $X_i.$ – whuber Feb 29 '24 at 22:37
  • @whuber That might've been it. There's an almost identical problem in Casella and Berger's Statistical Inference (our course book), Chapter 7, Problem 20, except there the $x_i$ are fixed constants, so it wouldn't need to be specified that the expectation is conditional on the $x_i$. If the problem was copied over quickly, perhaps the conditioning was simply left out, or perhaps I just misread that part. – Kevin Feb 29 '24 at 23:00
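(A sketch of the fixed-$x_i$ reading suggested in the two comments above, treating the $x_i$ as known constants as in the Casella and Berger exercise, and not necessarily what the original problem intended: conditional on the $x_i$, $$ \frac{\sum Y_i}{\sum x_i} = \beta + \frac{\sum \epsilon_i}{\sum x_i}, \qquad E\left(\frac{\sum Y_i}{\sum x_i}\,\middle|\,x\right) = \beta, \qquad Var\left(\frac{\sum Y_i}{\sum x_i}\,\middle|\,x\right) = \frac{Var\left(\sum \epsilon_i\right)}{\left(\sum x_i\right)^2} = \frac{n\sigma^2}{\left(\sum x_i\right)^2}. $$ )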

0 Answers