5

It is well known that if a random variable $X$ has distribution: $$ \mathrm{P}(X = x) = \begin{cases} \frac{1}{2}, & x=0,\\ \frac{1}{2}, & x=1,\\ 0, & \text{otherwise}, \end{cases} $$ (i.e., it is Bernoulli-distributed with probability of success $\tfrac{1}{2}$), it saturates Chebyshev's inequality for $k=1$: $$ \mathrm{P}(|X - \mathrm{E}[X]| \leq \sqrt{\mathrm{Var}[X]}) = \mathrm{P}(|X - \tfrac{1}{2}| \leq \tfrac{1}{2}) = 1. $$
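As a quick sanity check of this claim, here is a minimal NumPy simulation (a sketch; the seed and sample size are arbitrary choices of mine):

```python
import numpy as np

# Draw X ~ Bernoulli(1/2) and verify the mean, variance, and saturation claim.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=1_000_000).astype(float)

print(x.mean())  # ~0.5  = E[X]
print(x.var())   # ~0.25 = Var[X]
# |X - 1/2| equals 1/2 for every draw, so this probability is exactly 1:
print(np.mean(np.abs(x - 0.5) <= 0.5))
```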

Using Chebyshev's inequality, is it possible to show the following statement?

If $X$ is a random variable with $0 \leq X \leq 1$, $\mathrm{E}[X] =\tfrac{1}{2}$, and $\mathrm{Var}[X] = \tfrac{1}{4}$, then $X$ is Bernoulli-distributed with probability of success $\tfrac{1}{2}$.

Thanks!

Emmy B
  • 93
  • You could say that $\mathrm{P}(|X - \mathrm{E}[X]| \leq \sqrt{\mathrm{Var}[X]}) = 1$ if and only if $X$ is a scaled and shifted Bernoulli variable. – Sextus Empiricus Apr 10 '23 at 16:55
  • Your Bernoulli example has support $\{0,1\}$ rather than $[0,1]$ – Henry Apr 11 '23 at 00:32
  • "If $X$ is a random variable with $0 \leq X \leq 1$" this condition is unnecessary for reversing the logic of the inequality. Your point is about the condition "if $\mathrm{P}(|X - \mathrm{E}[X]| \leq \sqrt{\mathrm{Var}[X]}) = 1$". – Sextus Empiricus Apr 11 '23 at 06:25
  • @SextusEmpiricus While I see your point (to make the statement a perfect "if and only if" one), calling "$0 \leq X \leq 1$" "unnecessary" seems inaccurate. That would mean the condition can be dropped and the conclusion of the problem still holds. What you suggested, however, is to replace the original condition with another one (which is more closely related to Chebyshev's inequality). As I explained in my answer, the original problem in fact has nothing to do with Chebyshev's inequality. – Zhanxiong Apr 11 '23 at 13:29
  • My interpretation is that the OP was just interested in whether Chebyshev's inequality can be used to prove the problem as written, not in proving a perfect "if and only if" proposition. But the setup is indeed somewhat misleading. – Zhanxiong Apr 11 '23 at 13:31
  • This result follows from examining any of the proofs of Popoviciu's Inequality. – whuber Nov 21 '23 at 15:48

2 Answers

11

I don't think Chebyshev's inequality helps in proving this reversed problem. With the given conditions, Chebyshev's inequality only tells you that $P[|X - 1/2| > \epsilon] \leq \frac{1}{4\epsilon^2}$. When $\epsilon \leq 1/2$, this is weaker than the trivial statement that $P[|X - 1/2| > \epsilon] \leq 1$. When $\epsilon > 1/2$, it says that the probability of $X > 1/2 + \epsilon$ or $X < 1/2 - \epsilon$ is less than $\frac{1}{4\epsilon^2} < 1$, but that event is already impossible by the other condition $0 \leq X \leq 1$. Hence Chebyshev's inequality provides no additional insight toward the goal. The statement may instead be proved as follows (the key to this proof is that if $E[Y] = 0$ and $Y$ is nonnegative, then $Y = 0$ with probability $1$):
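To make the comparison concrete, here is a minimal sketch (plain Python, my own illustration) tabulating Chebyshev's bound against what $0 \leq X \leq 1$ already gives:

```python
# Chebyshev's bound on P[|X - 1/2| > eps] versus what 0 <= X <= 1 already gives.
for eps in [0.1, 0.25, 0.5, 0.75, 1.0]:
    chebyshev = 1 / (4 * eps**2)
    # Since 0 <= X <= 1 forces |X - 1/2| <= 1/2, the event is impossible for eps >= 1/2.
    known = 1.0 if eps < 0.5 else 0.0
    print(f"eps = {eps:4}: Chebyshev bound = {chebyshev:5.2f}, known bound = {known}")
```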

The conditions $E[X] = 1/2$ and $\operatorname{Var}(X) = 1/4$ imply that $E[X^2] = 1/4 + (1/2)^2 = 1/2$, and therefore $E[X - X^2] = 0$. By the condition $0 \leq X \leq 1$, the random variable $X - X^2$ is nonnegative, hence $E[X - X^2] = 0$ implies that $X - X^2 = 0$ with probability $1$, i.e., $P[X = 0] + P[X = 1] = 1$. This shows that $X$ must be a binary discrete random variable; if $P[X = 0] = p$, then $P[X = 1] = 1 - p$, and $E[X] = 1 - p = 1/2$ gives $p = 1/2$. In other words, $X \sim \mathrm{Bernoulli}(1/2)$.
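As a complementary numerical illustration (the symmetric Beta family here is my own choice, not part of the proof): $\mathrm{Beta}(\alpha, \alpha)$ has mean $1/2$ and variance $\frac{1}{4(2\alpha + 1)}$, so the variance can only approach the cap $1/4$ as $\alpha \to 0$, where the mass escapes to the endpoints $\{0, 1\}$:

```python
import numpy as np

# Beta(a, a) has mean 1/2; its variance 1/(4(2a + 1)) approaches 1/4 only as
# a -> 0, where the distribution degenerates to point masses at 0 and 1.
rng = np.random.default_rng(0)
for a in [1.0, 0.5, 0.1, 0.01, 0.001]:
    var = 1 / (4 * (2 * a + 1))
    x = rng.beta(a, a, size=100_000)
    near_endpoints = np.mean((x < 0.01) | (x > 0.99))
    print(f"a = {a:6}: Var[X] = {var:.4f}, P(X within 0.01 of an endpoint) ~ {near_endpoints:.3f}")
```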

Zhanxiong
  • 18,524
  • 1
  • 40
  • 73
  • Thank you! I suspected that Chebyshev's inequality would yield nothing stronger than $\mathrm{P}(0\leq X \leq 1) \leq 1$, but was not sure. I appreciate your other approach as well. – Emmy B Apr 10 '23 at 04:00
0

Yes, the logic of Chebyshev's inequality can be reversed. You could say that $$\mathrm{P}(|X - \mathrm{E}[X]| \leq \sqrt{\mathrm{Var}[X]}) = 1$$

if and only if $X$ is a Bernoulli variable with parameter $p = 0.5$ shifted and scaled to match the specific mean and variance.


"If $X$ is a random variable with $0 \leq X \leq 1$" this condition is unnecessary for reversing the logic of the inequality. The mean and variance, along with the condition that the Chebyshev's inequality is an equality is enough.


Proof: consider the quantile function of $Y = \frac{X-\mu}{\sigma}$. It must be constrained between $-1$ and $+1$, have mean $0$, and at the same time its square has to integrate to $1$.
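Spelled out (a completion of this sketch): the equality condition says $|Y| \leq 1$ almost surely, while $\mathrm{E}[Y] = 0$ and $\mathrm{E}[Y^2] = 1$. Then $$\mathrm{E}[1 - Y^2] = 0 \quad \text{with} \quad 1 - Y^2 \geq 0 \ \text{a.s.} \implies Y^2 = 1 \ \text{a.s.},$$ so $Y \in \{-1, +1\}$ almost surely, and $\mathrm{E}[Y] = 0$ forces $\mathrm{P}(Y = -1) = \mathrm{P}(Y = +1) = \tfrac{1}{2}$. That is, $X = \mu + \sigma Y$ is exactly the shifted and scaled Bernoulli($\tfrac{1}{2}$) variable claimed above.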