
For a nonnegative random variable $X$, how can one prove that $E(X^n)^{\frac1n}$ is non-decreasing in $n$?

User1865345

1 Answer


Write $p$ in place of $n$ to emphasize it can be any positive real number, rather than just an integer as suggested by "$n$".

Let's go through some standard preliminary transformations to simplify subsequent calculations. It makes no difference to the result to rescale $X$. The result is trivial if $X$ is almost everywhere zero, so assume $\mathbb{E}(X)$ is nonzero, whence $\mathbb{E}(X^p)$ also is nonzero for all $p$. Now fix $p$ and divide $X$ by $\mathbb{E}(X^p)^{1/p}$ so that $$\mathbb{E}(X^p) = 1\tag{1},$$ with no loss of generality.

Here's how the reasoning might proceed when you're figuring it out for the first time and trying not to work too hard. I will leave detailed justifications of each step to you.

The expression $\mathbb{E}(X^p)^{1/p}$ is nondecreasing if and only if its logarithm is nondecreasing. That log is differentiable and therefore is nondecreasing if and only if its derivative is non-negative. Exploiting $(1)$ we may compute (by differentiating within the expectation) this derivative as

$$\frac{d}{dp}\log\left( \mathbb{E}(X^p)^{1/p} \right) = -\frac{1}{p^2}\log\mathbb{E}(X^p) + \frac{1}{p}\,\frac{\mathbb{E}(X^p \log X)}{\mathbb{E}(X^p)} = \frac{1}{p^2}\mathbb{E}\left(X^p \log(X^p)\right).$$
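If you want to check a computation like this numerically before trusting it, here is a minimal sketch (the discrete distribution is an arbitrary choice of mine, purely for illustration) comparing a central finite difference of $\frac{1}{p}\log\mathbb{E}(X^p)$ against the analytic derivative $-\frac{1}{p^2}\log\mathbb{E}(X^p) + \frac{1}{p}\,\mathbb{E}(X^p\log X)/\mathbb{E}(X^p)$:

```python
import math

# Hypothetical discrete distribution for X (values and probabilities
# are arbitrary choices for illustration).
xs = [0.5, 1.0, 2.0, 3.0]
ws = [0.1, 0.4, 0.3, 0.2]

def moment(p):
    """E(X^p) for the discrete distribution above."""
    return sum(w * x**p for x, w in zip(xs, ws))

def log_norm(p):
    """log of E(X^p)^(1/p), i.e. (1/p) log E(X^p)."""
    return math.log(moment(p)) / p

p = 2.0
h = 1e-6
# Central finite difference of the log-norm at p.
numeric = (log_norm(p + h) - log_norm(p - h)) / (2 * h)

# Analytic derivative by the product and chain rules:
# d/dp [(1/p) log E(X^p)] = -(1/p^2) log E(X^p) + (1/p) E(X^p log X) / E(X^p)
m = moment(p)
analytic = (-math.log(m) / p**2
            + sum(w * x**p * math.log(x) for x, w in zip(xs, ws)) / (p * m))
```

The two values agree to within finite-difference error, and (as the theorem predicts) the derivative is positive for this non-constant $X$.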

Writing $Y=X^p$, and noting that $\mathbb{E}(Y) = \mathbb{E}(X^p) = 1$ by $(1)$, the right hand side is non-negative if and only if $$\mathbb{E}(Y\log(Y)) \ge 0.$$ But this is an immediate consequence of Jensen's Inequality applied to the function $f(y) = y\log(y)$ (continuous on the nonnegative reals and differentiable on the positive reals): differentiating twice shows $$f^{\prime\prime}(y) = \frac{1}{y}\gt 0$$ for $y\gt 0$, whence $f$ is a convex function on the non-negative reals, yielding

$$\mathbb{E}(Y \log Y) = \mathbb{E}(f(Y)) \ge f\left(\mathbb{E}(Y)\right) = f(1) = 0,$$

QED.
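As an empirical sanity check of the conclusion, the sketch below (the lognormal sample and the grid of exponents are arbitrary choices of mine, not part of the proof) computes the empirical $p$-norm over increasing $p$; since the empirical measure is itself a probability distribution, the theorem says these values must be nondecreasing:

```python
import math
import random

random.seed(0)
# Hypothetical sample standing in for a nonnegative X
# (the lognormal choice is arbitrary, purely for illustration).
sample = [math.exp(random.gauss(0, 1)) for _ in range(10_000)]

def p_norm(p):
    """Empirical E(X^p)^(1/p) for the sample above."""
    return (sum(x**p for x in sample) / len(sample)) ** (1 / p)

grid = [0.5, 1, 1.5, 2, 3, 4]
norms = [p_norm(p) for p in grid]
```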


Edit

Edward Nelson provides a wonderfully succinct demonstration. As a matter of (standard) notation, define $||x||_p = \mathbb{E}(|x|^p)^{1/p}$ for $1 \le p \lt \infty$ (and $||x||_\infty = \sup |x|$). Upon observing that the function $f(x) = |x|^p$ is convex, he applies Jensen's Inequality to conclude

$$|\mathbb{E}(x)|^p \le \mathbb{E}(|x|^p).$$

Here is the rest of the demonstration in his own words:

Applied to $|x|$ this gives $$||x||_1 \le ||x||_p,$$ and applied to $|x|^r$, where $1 \le r \lt \infty$, this gives $$||x||_r \le ||x||_{rp},$$ so that $||x||_p$ is an increasing function of $p$ for $1 \le p \le \infty$.
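Nelson's step can also be illustrated numerically. In this sketch (the values, probabilities, and grid are arbitrary choices of mine for illustration), any pair $1 \le r \lt q$ factors as $q = rp$ with $p = q/r \gt 1$, so his inequality $||x||_r \le ||x||_{rp}$ forces the computed norms to be nondecreasing across the grid:

```python
# A small discrete distribution for x, which may take negative values
# since Nelson's norms use |x| (values/probabilities are arbitrary).
vals = [-1.5, 0.5, 2.0, 4.0]
probs = [0.2, 0.3, 0.3, 0.2]

def norm(p):
    """||x||_p = E(|x|^p)^(1/p)."""
    return sum(w * abs(v) ** p for v, w in zip(vals, probs)) ** (1 / p)

# Each consecutive pair (r, q) in the grid satisfies q = r * (q/r) with
# q/r > 1, so ||x||_r <= ||x||_q by Nelson's argument.
grid = [1, 1.5, 2, 3, 5, 10]
norms = [norm(p) for p in grid]
```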

Reference

Edward Nelson, Radically Elementary Probability Theory. Princeton University Press (1987): p. 5.

whuber