10

I've encountered the following problem:

  • The probability of success θ for a random variable follows a Beta(5, 3) posterior distribution.
  • The posterior mean is E(θ) = 0.625.
  • The odds of success are defined as θ / (1 - θ).

Simulate a large number of samples from the posterior distribution of θ and use these samples to approximate the posterior mean of the odds of success, E(θ / (1 - θ)).

Initially I thought that the answer should be very close to 0.625 / (1 - 0.625) = 5 / 3 (approx. 1.67). However, after performing this simulation with 1e5 samples, I get an answer of approx. 2.5, which is the correct answer. Why was my initial thought wrong? That is, why is E(θ / (1 - θ)) different from E(θ) / (1 - E(θ))?
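
A minimal sketch of this kind of simulation (Python with NumPy; everything beyond the Beta(5, 3) posterior and the 1e5 sample size is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility (illustrative choice)

# Draw posterior samples of theta from Beta(5, 3)
theta = rng.beta(5, 3, size=100_000)

# Monte Carlo estimate of E[theta / (1 - theta)]
odds_mean = np.mean(theta / (1 - theta))

# The (incorrect) plug-in value E[theta] / (1 - E[theta])
plug_in = theta.mean() / (1 - theta.mean())

print(odds_mean)  # approx. 2.5
print(plug_in)    # approx. 1.67
```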

  • 5
    See Jensen's Inequality. To appreciate its application here, suppose $\theta$ has a $1/3$ chance to be $2$ and a $2/3$ chance to be $-1.$ Then $E[1/\theta]$ is easy to calculate and finite, while $1/E[\theta]$ is undefined because $E[\theta]=0.$ A near-duplicate question was answered at https://stats.stackexchange.com/questions/139503. It applies because $\theta/(1-\theta)=1/(1-\theta)-1$ and that thread addresses the $1/(1-\theta)$ term. – whuber Jan 12 '24 at 13:25

2 Answers

11

Taking an expectation does not commute with all arithmetic operations. While $E(X+Y)=EX+EY$, such a "distributivity" does not hold for other operations. For instance, the expectation of a product of random variables is in general not equal to the product of the expectations, nor would this hold for ratios.

The easiest way to see this is to go back to the definition of the expectation as an integral, and to remember from our calculus classes that integrals don't commute with multiplication and division of integrands, either.
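
To make the point about ratios concrete with a minimal example: if $X$ takes the values $1$ and $3$ with equal probability, then $E[1/X] = \tfrac{1}{2}\bigl(1 + \tfrac{1}{3}\bigr) = \tfrac{2}{3}$, whereas $1/E[X] = \tfrac{1}{2}$.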

Stephan Kolassa
  • 123,354
  • 1
    Indeed! Or in the case of discrete random variables the expectation is a sum, and the ratio of sums is not the same as the sum of ratios. Or as we say less kindly in German "Aus Summen kürzen nur die Dummen" ("only the stupid cancel from sums"). – Christoph Hanck Jan 12 '24 at 13:54
  • 1
    @ChristophHanck: one déformation professionnelle of my pretending to be a statistician is that by now, a sum to me feels like a simple case of an integral... – Stephan Kolassa Jan 12 '24 at 14:23
3

Let's consider a uniformly distributed random variable: $\theta \sim U(-\frac{1}{2}, \frac{1}{2})$. Then evidently,

$$E[\theta] = \int_{-\infty}^{\infty} \theta \, p(\theta) \, d\theta = \int_{-\frac{1}{2}}^{\frac{1}{2}} \theta \, d\theta = 0,$$

and thus,

$$\frac{E[\theta]}{1 - E[\theta]} = 0.$$

On the other hand,

$$E\left[\frac{\theta}{1 - \theta}\right] = \int_{-\infty}^{\infty} \frac{\theta}{1 - \theta} \, p(\theta) d\theta = \int_{-\frac{1}{2}}^{\frac{1}{2}} \frac{\theta}{1 - \theta} \,d\theta = \bigg[ -\theta - \ln |1-\theta| \bigg]_{-\frac{1}{2}}^{\frac{1}{2}} = \ln 3 - 1.$$

And so for this example,

$$\frac{E[\theta]}{1 - E[\theta]} \neq E\left[\frac{\theta}{1 - \theta}\right].$$


A similar derivation for $\theta \sim \text{Beta}(5, 3)$ is left as an exercise for the reader.
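
For readers who want to check that exercise numerically, here is a minimal sketch (assuming Python with SciPy). Writing out the Beta integral gives $E[\theta/(1-\theta)] = B(a+1,\,b-1)/B(a,b)$ for $b > 1$, which for $a = 5$, $b = 3$ equals $5/2$, matching the approx. 2.5 the question's simulation produced:

```python
from scipy.special import beta as B

a, b = 5, 3  # Beta(5, 3) posterior from the question

# E[theta / (1 - theta)] = B(a + 1, b - 1) / B(a, b), which simplifies to a / (b - 1) for b > 1
odds_mean = B(a + 1, b - 1) / B(a, b)

# Plug-in value E[theta] / (1 - E[theta]) = (a / (a + b)) / (b / (a + b)) = a / b
plug_in = (a / (a + b)) / (1 - a / (a + b))

print(odds_mean)  # 2.5
print(plug_in)    # 1.666...
```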

Mateen Ulhaq
  • 149
  • 7