Let $\mu$ be the mean and $\sigma$ the standard deviation of a probability distribution supported on the bounded interval $[a,b]$ (that is, the random variable lies outside $[a,b]$ with probability zero).

Does the following inequality hold generally for any such probability distribution?

$$\sigma^2 \le (\mu-a)(b-\mu)$$

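A quick sanity check shows that the proposed bound would at least be sharp: take $a = 0$, $b = 1$ and let $X$ be Bernoulli with $P(X = 1) = p$. Then $\mu = p$ and

$$\sigma^2 = E[X^2] - \mu^2 = p - p^2 = p(1 - p) = (\mu - a)(b - \mu),$$

so two-point distributions at the endpoints attain equality.
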
Motivation: This inequality holds for the Beta distribution, as can be checked directly from the standard formulae for its mean and variance.
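
To spell that out (a minimal sketch using those moment formulae): for $X \sim \operatorname{Beta}(\alpha, \beta)$ on $[0,1]$, so $a = 0$ and $b = 1$,

$$\mu = \frac{\alpha}{\alpha + \beta}, \qquad \sigma^2 = \frac{\alpha\beta}{(\alpha + \beta)^2 (\alpha + \beta + 1)} = \fr{}{}\frac{\mu(1 - \mu)}{\alpha + \beta + 1} \le \mu(1 - \mu) = (\mu - a)(b - \mu),$$

since $\alpha + \beta + 1 > 1$.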
