Are there specific conditions under which the following is true? E.g., certain distributions, positive RV?
$$\mathbb{E}\left[\dfrac{1}{\sum X_{i}}\right] = \dfrac{1}{\mathbb{E}\left[\sum X_{i}\right]}$$
I worked through some equations with the original intention of showing that it is in fact true, but ended up convincing myself that it might not be. In case it's useful, here is the working that led me to that conclusion:
Initially, I thought that the linearity of expectation would make this easy to prove. So I wrote down:
$$ \def\Exp{\mathbb{E}} \Exp[A + B] = \Exp[A] + \Exp[B] $$
Then I called your LHS expression $E_1$. So we have:
$$ E_1 = \Exp\left[ \frac{1} {\sum X_i} \right] $$
In order to go further, I felt we need to form the expectation over a distribution, so let's say that we have $\def\X{\mathbf{X}}\X \sim g$, where $\X = \{X_1, X_2, \dots, X_n \}$. So, we have:
$$ E_1 = \Exp_{\X \sim g}\left[ \frac{1}{\sum X_i} \right] $$
We can expand the expectation as an integral (discrete or continuous; let's take the continuous case):
$$ E_1 = \int_\X p(\X) \frac{1}{\sum X_i} \, d\X \\ = \int_\X \frac{p(\X)}{\sum X_i} \, d\X $$
At this point I thought: OK, there seems to be no obvious way of just multiplying the equation by $\sum X_i$ or similar. That doesn't prove we can't do something along those lines, but combined with Michael Chernick's assertions, it at least convinces me that his assertions are not unreasonable.
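A quick numerical check (my addition, not part of the working above) also suggests the equality fails in general. For $n$ i.i.d. $\mathrm{Exp}(1)$ variables the sum $S$ is $\mathrm{Gamma}(n, 1)$, for which $\mathbb{E}[1/S] = 1/(n-1)$ while $1/\mathbb{E}[S] = 1/n$. A minimal Monte Carlo sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 1_000_000

# Draw `trials` independent sums of n i.i.d. Exp(1) variables; each sum is Gamma(n, 1).
s = rng.exponential(scale=1.0, size=(trials, n)).sum(axis=1)

lhs = np.mean(1.0 / s)   # estimates E[1 / sum X_i]; exact value is 1/(n-1) = 0.25
rhs = 1.0 / np.mean(s)   # estimates 1 / E[sum X_i]; exact value is 1/n = 0.20

print(lhs, rhs)  # the two quantities clearly differ
```

With a million samples the estimates land close to the exact values $0.25$ and $0.20$, so the two sides are not equal here.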
First note that $f(x) = 1/x$ is not defined at $x = 0$. So assume, for now, that the $X_i$ are defined on the positive reals. On the positive reals, $f(x) = 1/x$ is a convex function. Thus, using Jensen's inequality,
$$E \left[\dfrac{1}{\sum X_i} \right] \geq \dfrac{1}{E\left[ \sum X_i \right]} = \dfrac{1}{\sum E\left[X_i\right]} \,.$$
Now the equality in Jensen's inequality holds if either $f$ is affine (which in this case $f(x) = 1/x$ is not) or if $\sum X_i$ is a degenerate random variable, i.e., a constant.
Using concavity, a similar argument applies (with the inequality reversed) when the $X_i$ are all defined on the negative reals.
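Both cases of the equality condition can be illustrated numerically (a sketch of my own, assuming NumPy): a degenerate $\sum X_i$ makes the two sides coincide exactly, while any non-degenerate positive choice makes the Jensen inequality strict.

```python
import numpy as np

rng = np.random.default_rng(1)
trials, n = 1_000_000, 3

# Degenerate case: each X_i is the constant 2, so sum X_i = 6 with probability 1.
s_const = np.full(trials, 2.0 * n)
print(np.mean(1.0 / s_const), 1.0 / np.mean(s_const))  # identical: 1/6 on both sides

# Non-degenerate positive case: X_i ~ Uniform(1, 2), so Jensen is strict.
s_unif = rng.uniform(1.0, 2.0, size=(trials, n)).sum(axis=1)
lhs = np.mean(1.0 / s_unif)   # estimates E[1 / sum X_i]
rhs = 1.0 / np.mean(s_unif)   # estimates 1 / E[sum X_i] = 1/4.5
print(lhs, rhs)               # lhs strictly exceeds rhs
```

The strict gap in the uniform case is small but far larger than the Monte Carlo noise at this sample size.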