6

For any nonnegative random variable $X$ independent of $U$, where $U \sim \operatorname{Uniform}(-t,t)$ and $t>0$,

$$P(X+U\ge t)\le\frac{E(X)}{2t}.$$

Any hints to prove this inequality?

  • What have you tried so far, and where are you stuck? – Arya McCarthy Apr 09 '22 at 01:37
  • I started from $E(X)=\int_0^{\infty}P(X>x)\,dx \ge \int_0^{2t}P(X>x)\,dx$ and tried to put $P(X+U+t\ge 2t)$ in the integral. Not sure if I'm on the right track. I feel like the independence is also useful, but I don't know how to use it. – user10386405 Apr 09 '22 at 01:39

2 Answers

7

Yes, your comment is on the right track. This looks like it's going to be Markov's Inequality: $$P(Z\geq z)\leq \frac{E[Z]}{z}$$ for non-negative $Z$. As you have noted, $X+U+t$ is non-negative, so it's a candidate for $Z$, and $X+U+t\geq 2t$ when $X+U\geq t$. So, consider the mean of $Z$.
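
As a sanity check of the claim you are trying to prove, here is a minimal Monte Carlo sketch; the choice $X \sim \operatorname{Exponential}(1)$ and the particular values of $t$ are purely illustrative assumptions, not part of the question:

```python
# Monte Carlo sanity check of the claimed bound P(X + U >= t) <= E[X] / (2t).
# Illustrative assumption: X ~ Exponential(1); any non-negative X independent of U works.
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x = rng.exponential(1.0, size=n)          # E[X] = 1

for t in (0.5, 1.0, 2.0):
    u = rng.uniform(-t, t, size=n)        # U ~ Uniform(-t, t), independent of X
    lhs = np.mean(x + u >= t)             # simulated P(X + U >= t)
    rhs = x.mean() / (2 * t)              # claimed bound E[X] / (2t)
    print(f"t={t}: P(X+U>=t) ~ {lhs:.4f} <= bound {rhs:.4f}")
```

The simulated probability stays below the bound at every $t$, as the claim asserts.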

Thomas Lumley
  • 38,062
3

It's convenient to define $U$ once and for all to be a uniform variable on the interval $[-1,1]$ and simply multiply it by $t$ to produce the $U$ used in the question.

Two useful, easily proven, but not widely known facts about random variables $X$ in general are

  1. No matter what the distribution function (CDF) $F_X$ of $X$ might be, the random variable $X + tU$ is absolutely continuous with a density function $$f_{X+tU}(y) = \frac{1}{2t}\left(F_X(y+t) - F_X(y-t)\right).$$ There are many ways to prove this, as explained at https://stats.stackexchange.com/a/43075/919 (which concerns sums of the closely related uniform variable supported on $[0,1]$). (A quick numerical check of both facts appears right after this list.)

  2. The expectation of any non-negative random variable $X$ with CDF $F_X$ equals $$E[X] = \int_0^\infty \left[1 - F_X(x)\right]\,\mathrm{d} x.$$ This is repeatedly demonstrated in many threads here on CV; one is Expectation of a function of a random variable from CDF. They all amount to performing an integration by parts.
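
Both facts are easy to check numerically. Here is a minimal sketch in which $X \sim \operatorname{Exponential}(1)$ and $t = 0.7$ are illustrative assumptions only:

```python
# Numerical sanity check of facts (1) and (2); not part of the proof.
# Illustrative assumptions: X ~ Exponential(1), t = 0.7.
import numpy as np
from scipy import integrate, stats

rng = np.random.default_rng(1)
t, n = 0.7, 10**6
x = rng.exponential(1.0, size=n)
u = rng.uniform(-1.0, 1.0, size=n)   # U ~ Uniform(-1, 1), as defined above
y = x + t * u                        # the variable X + tU

F = stats.expon.cdf                  # F_X for the exponential example

def density(s):
    """Fact (1): claimed density of X + tU."""
    return (F(s + t) - F(s - t)) / (2 * t)

# Fact (1): P(a < X + tU <= b) should match the integral of the claimed density.
a, b = 0.5, 1.5
print(np.mean((y > a) & (y <= b)))          # Monte Carlo estimate
print(integrate.quad(density, a, b)[0])     # integral of the formula

# Fact (2): E[X] should equal the integral of the survival function 1 - F_X.
print(x.mean(), integrate.quad(lambda s: 1.0 - F(s), 0.0, np.inf)[0])
```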

With these facts in mind, evaluate the probability in the question as

$$\begin{aligned} \Pr(X + tU \gt t) &= \int_t^\infty f_{X+tU}(y)\,\mathrm{d}y \\ &= \frac{1}{2t}\int_t^\infty \left(F_X(y+t) - F_X(y-t)\right)\,\mathrm{d}y\\ &= \frac{1}{2t}\int_t^\infty \left( \left[1 - F_X(y-t)\right] - \left[1 - F_X(y+t)\right]\right)\,\mathrm{d}y\\ &= \frac{1}{2t}\left(\int_0^\infty \left[1 - F_X(x)\right]\,\mathrm{d}x - \int_{2t}^\infty \left[1 - F_X(x)\right]\,\mathrm{d}x\right)\\ &= \frac{1}{2t}\left(E[X] - \int_{2t}^\infty \left[1 - F_X(x)\right]\,\mathrm{d}x\right)\\ &\le \frac{1}{2t} E[X]. \end{aligned}$$

The justifications of these steps are (1) definition of density, (2) fact $(1)$ above, (3) algebra, (4) linearity of integration followed by changes of variables, (5) fact $(2)$ above, and (6) since $1-F_X(x)$ is a probability for all $x,$ it is never negative, whence its integral is non-negative.

Finally, $\Pr(X + tU \ge t) = \Pr(X + tU \gt t)$ because (as previously noted) the variable $X + tU$ is absolutely continuous, QED.
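
These steps can also be double-checked by simulation. Here is a minimal sketch of the identity in the second-to-last line, $\Pr(X+tU>t)=\frac{1}{2t}\left(E[X]-\int_{2t}^\infty\left[1-F_X(x)\right]\,\mathrm{d}x\right)$, under the illustrative assumption $X\sim\operatorname{Exponential}(1)$, for which the right-hand side is $(1-e^{-2t})/(2t)$:

```python
# A minimal check of the identity derived above; not part of the proof.
# Illustrative assumption: X ~ Exponential(1), so E[X] = 1 and the tail
# integral of 1 - F_X from 2t is exp(-2t), giving Pr(X + tU > t) = (1 - exp(-2t)) / (2t).
import numpy as np

rng = np.random.default_rng(2)
n = 10**6
x = rng.exponential(1.0, size=n)
u = rng.uniform(-1.0, 1.0, size=n)       # U ~ Uniform(-1, 1), scaled by t below

for t in (0.5, 1.0, 2.0):
    mc = np.mean(x + t * u > t)               # simulated Pr(X + tU > t)
    exact = (1 - np.exp(-2 * t)) / (2 * t)    # value from the identity
    print(f"t={t}: simulated {mc:.4f}, identity {exact:.4f}")
```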


One thing I like about this way of proceeding is the insight it provides into the tightness of the inequality: the amount by which the probability (on the left-hand side) falls short of $E[X]/(2t)$ (on the right-hand side) is exactly $1/(2t)$ times the integral of $1-F_X$ from $2t$ on up. Thus, for instance, when $X$ is bounded and $2t$ exceeds this bound, that integral is zero and the inequality becomes an equality.
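
Here is that bounded case in a quick simulation (a minimal sketch; the choices $X\sim\operatorname{Uniform}(0,1)$ and $t=1$, so that $2t=2$ exceeds the bound, are illustrative only):

```python
# A minimal sketch of the equality case: X bounded and 2t beyond the bound.
# Illustrative assumptions: X ~ Uniform(0, 1), t = 1, so E[X]/(2t) = 0.25.
import numpy as np

rng = np.random.default_rng(3)
n = 10**6
t = 1.0
x = rng.uniform(0.0, 1.0, size=n)
u = rng.uniform(-1.0, 1.0, size=n)

prob = np.mean(x + t * u >= t)     # simulated Pr(X + tU >= t)
bound = x.mean() / (2 * t)         # E[X] / (2t)
print(prob, bound)                 # both should be close to 0.25
```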

whuber
  • 322,774