3

Let $X$ be a symmetric random variable with bounded moments and standard deviation $\sigma$. I want to lower-bound $\mathbb E[|X|]$ in terms of $\sigma$. Here is the formal conjecture; I wonder if this is true or could be refuted:

There exists a global constant $C$ such that for every symmetric r.v. $X$ with bounded moments and standard deviation $\sigma$, it holds that $\mathbb E[|X|] \geq C \sigma$.

I suspect that this is a straightforward result, but I could not find anything about it nor could I prove it myself. Any ideas?

Edit: symmetry is meant about zero, i.e., the density satisfies $f(x)=f(-x)$ for all $x$.


Some examples:

  1. If $X$ has Rademacher distribution (-1 w.p. 0.5 and 1 w.p. 0.5), then $\mathbb E[|X|]=1$ and $Var(X)=1=\sigma^2$; therefore, $\mathbb E[|X|] = 1\cdot \sigma$ (the above holds with $C=1$).
  2. For $X\sim Uniform(-b,b)$ for some $b>0$, $Var(X)=\frac{b^2}{3}=\sigma^2$ and $\mathbb E[|X|]=\frac{b}{2}$; thus, the above holds for $C=\frac{\sqrt{3}}{2}\approx 0.866$.
  3. For $X\sim N(0,\sigma^2)$, $|X|$ is half-normal and $\mathbb E[|X|]=\sigma \sqrt{\frac{2}{\pi}}$; hence, the above holds for $C=\sqrt{\frac{2}{\pi}}\approx 0.797$.
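The three ratios above can be checked numerically; here is a quick Monte Carlo sketch in Python (the helper `abs_mean_to_sd` is just for illustration):

```python
import numpy as np

def abs_mean_to_sd(x):
    """Monte Carlo estimate of E[|X|] / sigma from a sample."""
    return np.mean(np.abs(x)) / np.std(x)

rng = np.random.default_rng(0)
n = 1_000_000

r_rademacher = abs_mean_to_sd(rng.choice([-1.0, 1.0], size=n))
r_uniform = abs_mean_to_sd(rng.uniform(-1.0, 1.0, size=n))
r_normal = abs_mean_to_sd(rng.standard_normal(n))

print(r_rademacher)  # ~ 1
print(r_uniform)     # ~ sqrt(3)/2 ~ 0.866
print(r_normal)      # ~ sqrt(2/pi) ~ 0.797
```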
AvidLearner
  • Could you please elaborate on your meaning of "symmetric"? For instance, many (if not most) people consider all Normal distributions symmetric, but because all your examples are symmetric about $0,$ I wonder whether that's part of your definition, too. The resolution of your question for distributions symmetric about $0$ follows easily from the power-mean inequality. – whuber Jun 01 '23 at 19:37
  • Regardless of whether $X$ is symmetric or not, you can always simply take $C = E[|X|]/\sigma > 0$ to make your conjecture hold (or $C = 0.5E[|X|]/\sigma$ if you want a strict inequality). So this "inequality" is actually quite trivial. Your examples also show that the way you determined $C$ follows this route. – Zhanxiong Jun 01 '23 at 19:46
  • @whuber Thanks, I should have said that specifically. I assume that $f(x)=f(-x)$ for all $x$. Could you sketch the proof? I can't see how it follows from the power-mean inequality. – AvidLearner Jun 01 '23 at 19:54
  • @Zhanxiong But who promises us that $C$ won't be arbitrarily small for some distributions? Note the quantifier order: I want a value $C$ that satisfies the above for all symmetric r.v. – AvidLearner Jun 01 '23 at 19:57
  • @Zhanxiong Those examples demonstrate that if the statement holds, then the value of the global constant $C$ is upper-bounded by $C \leq \sqrt{2/\pi} \approx 0.797$. – AvidLearner Jun 01 '23 at 20:03
  • @AvidLearner OK, thanks for the clarification. That makes it a more interesting question. – Zhanxiong Jun 01 '23 at 20:16
  • The power-mean inequality shows you have the inequality in the wrong direction! – whuber Jun 01 '23 at 21:19
  • @whuber The other side is straightforward to obtain with Jensen's inequality and $C=1$. I wondered whether it could be sandwiched à la the Khintchine inequality. Apparently the answer is no. – AvidLearner Jun 02 '23 at 04:06

1 Answer

7

You can actually develop your first example a little more to refute this conjecture: put some mass at $0$.

Let $X$ be the random variable such that $P[X = 0] = 1 - p$, $P[X = -1] = P[X = 1] = \frac{1}{2}p$, where $p \in (0, 1)$. It is then easy to verify that $E[|X|] = p$ and, since $E[X] = 0$, $\sigma = \sqrt{E[X^2]} = \sqrt{p}$, whence $\frac{E[|X|]}{\sigma} = \sqrt{p}$. So if your conjecture held, the global constant $C$ would have to be bounded above by $\sqrt{p}$, which can be made arbitrarily small. Hence no positive constant $C$ works; only $C = 0$ remains.
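A quick numerical illustration of this counterexample (a sketch in Python; `ratio` is an illustrative helper using the closed-form moments above):

```python
import math

def ratio(p):
    """E[|X|] / sigma for the three-point distribution
    P[X=0] = 1-p, P[X=-1] = P[X=1] = p/2."""
    e_abs = p               # E[|X|] = 0*(1-p) + 1*p
    sigma = math.sqrt(p)    # E[X] = 0, so sigma = sqrt(E[X^2]) = sqrt(p)
    return e_abs / sigma

for p in [0.5, 0.1, 1e-2, 1e-4, 1e-6]:
    print(p, ratio(p))  # ratio equals sqrt(p), which tends to 0 as p -> 0
```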

Zhanxiong