16

The $r$-th moment of a random variable $X$ is finite if $$ \mathbb E(|X|^r)< \infty $$

I am trying to show that for any positive integer $s<r$, the $s$-th moment $\mathbb E[|X|^s]$ is also finite.

cardinal
  • 26,862
nona
  • 161
  • Is this homework? If so, what have you tried so far? Also, I've tried to make your question more readable, please let me know if I've made a mistake. – Gschneider Apr 14 '12 at 15:07
  • I read Billingsley's textbook and searched the internet, but found no exact proof. What I found is just a clue that maybe Jensen's inequality can be used. – nona Apr 14 '12 at 15:09
  • 1
    Consider rewriting $|X^r|$ as $|X^s \cdot X^{r-s}|$ and see if that gets you anywhere. – Gschneider Apr 14 '12 at 15:16
  • 3
    There is a difference between a moment existing and being finite. In particular, a moment can exist, but be infinite. The terminology you're being introduced to is a bit imprecise. In any event, this is a standard result about $L_p$ spaces; it is not true that "no exact proof exists". :) – cardinal Apr 14 '12 at 15:16

2 Answers

28

$0<s<r \Longrightarrow \forall X \, |X|^s \le \max(1, |X|^r) $
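A quick numerical sanity check of this pointwise bound (a sketch in Python; the grid of values and the exponent pairs below are arbitrary choices for illustration):

```python
# Pointwise check of |x|^s <= max(1, |x|^r) for 0 < s < r,
# over an arbitrary grid of x values and exponent pairs.
xs = [i / 100 for i in range(-1000, 1001)]  # grid from -10 to 10
for s, r in [(1, 2), (2, 3), (1, 5), (3, 7)]:
    assert all(abs(x) ** s <= max(1.0, abs(x) ** r) for x in xs)
print("pointwise bound holds on the grid")
```

Taking expectations on both sides then gives $\mathbb{E}|X|^s \leq \mathbb{E}[\max(1, |X|^r)] \leq 1 + \mathbb{E}|X|^r < \infty$.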

StasK
  • 31,547
  • 2
  • 92
  • 179
  • Fine. You can also prove it with the help of Jensen's inequality. – Stéphane Laurent Apr 14 '12 at 16:39
  • 16
    (+1) I like this because it relies on only the most basic properties of expectation, namely monotonicity. In case one is worried about what to do with the right-hand side, they can note that $\max(1,|X|^r) \leq 1 + |X|^r$. If one prefers an application of Jensen, they can write $|X|^r = (|X|^s)^{r/s}$ and note that $r/s \geq 1$. – cardinal Apr 14 '12 at 16:54
  • 1
    @cardinal: (+1) I prefer your inequality as it directly involves $|X|^r$... – Xi'an Apr 15 '12 at 07:21
4

You can prove it with the help of the monotonicity property of integration, i.e., for functions $f$ and $g$ such that $f \geqslant g$, we have $\mathbb{E}[f(X)] \geqslant \mathbb{E}[g(X)]$.

Proof:

Consider the function $g:\mathbb{R} \to \mathbb{R}$ given by $g(x) = |x|^r + 1$ and the function $h: \mathbb{R} \to \mathbb{R}$ given by $h(x) = |x|^s$, and let $F$ be the cdf of $X$. Notice that, for every $r \geqslant s$, $g(x) > h(x)$: if $|x| \leqslant 1$ then $|x|^s \leqslant 1 < |x|^r + 1$, and if $|x| > 1$ then $|x|^s \leqslant |x|^r < |x|^r + 1$. Then, by monotonicity of integrals, we have

$$ \int_{\mathbb{R}} |x|^r + 1 \, dF = \int_{\mathbb{R}} |x|^r\, dF + \underbrace{\int_{\mathbb{R}} 1 \, dF}_{=1} > \int_{\mathbb{R}} |x|^s \, dF $$

Since $\int_{\mathbb{R}} |x|^r\, dF = \mathbb{E}|X|^{r}<\infty$, we get

$$ \infty > \mathbb{E}|X|^{r} + 1 > \int_{\mathbb{R}} |x|^s \, dF = \mathbb{E}|X|^{s} $$
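To illustrate this chain of inequalities concretely, here is a small check on a hypothetical discrete distribution (the support points, probabilities, and exponents below are made up for illustration):

```python
# Discrete X with made-up support and probabilities (they sum to 1).
support = [0.5, 2.0, 3.0]
probs = [0.5, 0.3, 0.2]

def abs_moment(k):
    """E|X|^k for the discrete distribution above."""
    return sum(p * abs(x) ** k for x, p in zip(support, probs))

r, s = 4, 2  # any integers with s < r
assert abs_moment(s) <= abs_moment(r) + 1
print(f"E|X|^{s} = {abs_moment(s):.3f} <= E|X|^{r} + 1 = {abs_moment(r) + 1:.3f}")
```

Here $\mathbb{E}|X|^2 = 3.125$ and $\mathbb{E}|X|^4 + 1 \approx 22.031$, so the bound holds with room to spare.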


There are some comments suggesting Jensen's inequality, but using it here is circular. From Casella & Berger:

Theorem 4.7.7 (Jensen's Inequality) For any random variable $X$, if $g(x)$ is a convex function, then

$$ \mathbb{E}[g(X)] \geqslant g(\mathbb{E}[X]) $$

This means that, to use this inequality, we must know that $\mathbb{E}[X] < \infty$.

In our case, applying the convex function $\varphi: \mathbb{R} \to \mathbb{R}$, $\varphi(x) = x^{\frac{r}{s}}$, to the random variable $|X|^s$ presupposes that $\mathbb{E}[|X|^s] < \infty$, which is exactly what we want to prove.