3

In calculating the expectation of a discrete random variable $X$, we require not only that $\sum_i x_iP(X=x_i)$ converges, but that it converges absolutely. I understand this requirement as probably stemming from the fact that a rearrangement of a conditionally convergent (i.e. non-absolutely-convergent) countably infinite sum can converge to a different value.
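To make concrete the rearrangement issue I have in mind, here is a quick numerical sketch (my own illustration, not part of any definition): the alternating harmonic series converges to $\ln 2$, but a rearrangement taking one positive term followed by two negative terms converges to $\ln 2 / 2$.

```python
import math

def alternating_harmonic(n_terms):
    # 1 - 1/2 + 1/3 - 1/4 + ...  ->  ln(2)
    return sum((-1) ** (k + 1) / k for k in range(1, n_terms + 1))

def rearranged(n_blocks):
    # Same terms, rearranged as blocks of (one positive, two negative):
    # 1 - 1/2 - 1/4 + 1/3 - 1/6 - 1/8 + ...  ->  ln(2)/2
    total = 0.0
    for i in range(1, n_blocks + 1):
        total += 1 / (2 * i - 1)   # positive terms: 1, 1/3, 1/5, ...
        total -= 1 / (4 * i - 2)   # negative terms: 1/2, 1/6, 1/10, ...
        total -= 1 / (4 * i)       # negative terms: 1/4, 1/8, 1/12, ...
    return total

print(alternating_harmonic(10 ** 6))   # close to ln(2)   ~ 0.6931
print(rearranged(10 ** 6))             # close to ln(2)/2 ~ 0.3466
```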

I was wondering if a similar requirement exists for a purely continuous random variable $X$, when we compute the expectation using the improper Riemann integral $\int_{-\infty}^\infty x f(x)\,dx$.

[I have a similar question for computing expectation using Stieltjes integrals - is there some sort of "absolute convergence" requirement?]

I understand that the most general definition of expectation involves Lebesgue integrals, but I am not very familiar with Lebesgue theory, so to be concrete (if you intend to reply via Lebesgue theory): in the special case of a purely continuous random variable, does the Lebesgue integral, when reduced to a Riemann integral, have any form of "absolute convergence" requirement, or is it automatically satisfied in some sense? What about the case of the Stieltjes integral? How does the "absolute convergence" requirement manifest itself in the discrete case?

Glen_b

4 Answers

5

If $X$ is a random variable taking real values, then you could define an integer random variable by rounding down: let $A = \lfloor X \rfloor$, and another slightly higher by rounding up $B=\lceil X \rceil$.

If any one of them has a finite expectation then they all do, since $E[X]-1 \le E[A] \le E[X] \le E[B] \le E[X]+1$.

$A$ and $B$ are both discrete random variables, for which you say you understand the need for absolute convergence in calculating the expectation. So a similar concept is needed for a continuous random variable to have a well-defined finite expected value.
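As a quick sanity check of the bracketing inequality (my own illustration, using an exponential(1) variable for which $E[X]=1$, $E[\lfloor X \rfloor] = 1/(e-1) \approx 0.582$, and $E[\lceil X \rceil] = E[\lfloor X \rfloor] + 1$):

```python
import math
import random

random.seed(0)
n = 10 ** 5
samples = [random.expovariate(1.0) for _ in range(n)]

# Monte Carlo estimates of E[floor(X)], E[X], E[ceil(X)]
e_floor = sum(math.floor(x) for x in samples) / n
e_x = sum(samples) / n
e_ceil = sum(math.ceil(x) for x in samples) / n

print(e_floor, e_x, e_ceil)
# The ordering E[X]-1 <= E[floor(X)] <= E[X] <= E[ceil(X)] <= E[X]+1
# holds sample-by-sample, hence for the estimates as well.
assert e_x - 1 <= e_floor <= e_x <= e_ceil <= e_x + 1
```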

Henry
2

Essentially the answer is yes, though I am not quite sure about the question: perhaps you are thinking about something like the Cauchy principal value of your integral.

If you can describe $X$ as taking the value $Y$ with probability $p$ where $0\lt p \lt 1$ and the value $Z$ with probability $1-p$, where $E[Y]=-\infty$ and $E[Z]=+\infty$, then $X$ does not have an expected value, since if it did then it could be calculated as $pE[Y]+(1-p)E[Z]$.

A simple case would be to let $Y=X$ if $X \lt 0$ and $Y=0$ otherwise, with $Z=X$ if $X \ge 0$ and $Z=0$ otherwise, so $p=\Pr(X\lt 0)$.
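For a concrete instance of this decomposition (my own addition, using the standard Cauchy density $f(x)=\frac{1}{\pi(1+x^2)}$): the positive part $Z$ has $E[Z]=+\infty$, because the partial integrals $\int_0^H x f(x)\,dx = \frac{\log(1+H^2)}{2\pi}$ grow without bound.

```python
import math

def partial_positive_expectation(H):
    # Closed form of the truncated integral of x * f(x) over [0, H]
    # for the standard Cauchy density f(x) = 1 / (pi * (1 + x^2)).
    return math.log(1 + H ** 2) / (2 * math.pi)

# The partial integrals increase roughly like log(H)/pi, so E[Z] = +inf.
for H in (10, 10 ** 3, 10 ** 6, 10 ** 9):
    print(H, partial_positive_expectation(H))
```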

Henry
1

In short, a sufficient condition for an expectation $E[x]$ to be well defined is for $E[|x|]<\infty$. The main reason you need to be careful is basically because we want to be able to define

$$\lim_{H\to\infty}\int_{-kH}^{H}xf(x)dx$$

in an unambiguous way (i.e. independently of $k$, for all $k>0$). Now we know that $-|x|\leq x \leq |x|\implies -E|x|\leq E(x) \leq E|x| $, so if $E|x|<\infty$ then $E(x)$ is well defined.

A classic statistical example is the Cauchy distribution, where $f(x)=\frac{1}{\pi(1+x^2)}$. Now the anti-derivative of $xf(x)$ is given by $\frac{\log(1+x^2)}{2\pi}$ so the integral for finite $H$ is given as

$$ \int_{-kH}^{H}xf(x)dx=\frac{1}{2\pi}\log\left[\frac{1+H^2}{1+k^2H^2}\right]\to-\frac{1}{\pi}\log(k)$$

This depends on $k$, so we cannot give a definite meaning to the expression $\int_{-\infty}^{\infty}xf(x)\,dx$. Thus we say the expectation does not exist. Note that if we set $k=0$ we get $\infty$, showing that for a Cauchy $E(x|x>0)=\infty$, which implies $E|x|=\infty$.
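The $k$-dependence is easy to verify numerically (my own check of the closed form above): for large $H$, the truncated integral sits near $-\log(k)/\pi$, a different value for each $k>0$.

```python
import math

def truncated_mean(k, H):
    # Closed form of the integral of x * f(x) over [-k*H, H] for the
    # standard Cauchy density: log((1 + H^2) / (1 + k^2 H^2)) / (2 pi).
    return math.log((1 + H ** 2) / (1 + (k * H) ** 2)) / (2 * math.pi)

for k in (0.5, 1.0, 2.0):
    # Each k gives a different limit, -log(k)/pi: no single answer exists.
    print(k, truncated_mean(k, 10 ** 8), -math.log(k) / math.pi)
```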

Now we take a normal random variable and we have $xf(x)=\frac{x}{\sqrt{2\pi}}\exp\left(-\frac{x^2}{2}\right)$ which has anti-derivative $ -\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{x^2}{2}\right)$. Plugging this in, we get $$ \frac{1}{\sqrt{2\pi}}\exp\left(-\frac{k^2H^2}{2}\right) -\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{H^2}{2}\right)\to 0\;\;\;\forall k>0 $$

When $k=0$ the limit is $\frac{1}{\sqrt{2\pi}}$, showing that $E|x|=\sqrt{ \frac{2}{ \pi}}<\infty$.
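The same check for the normal case (again my own sketch, using the closed form just derived) confirms that the truncated integral vanishes for every $k>0$, consistent with $E(x)=0$:

```python
import math

def truncated_normal_mean(k, H):
    # Closed form of the integral of x * phi(x) over [-k*H, H] for the
    # standard normal density phi: the antiderivative of x*phi(x) is -phi(x).
    c = 1 / math.sqrt(2 * math.pi)
    return c * (math.exp(-(k * H) ** 2 / 2) - math.exp(-H ** 2 / 2))

for k in (0.5, 1.0, 2.0):
    # All values are essentially zero already at H = 20, for every k > 0.
    print(k, truncated_normal_mean(k, 20))
```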

  • 1
    I believe this argument, although it's on the right track, is invalid: you must consider the double limit $\lim_{H\to\infty, J\to-\infty}\int_{J}^{H}xf(x)dx$. It is possible for the single limit you consider to exist for all $k\gt 0$ yet for the double limit not to exist. – whuber Aug 23 '13 at 14:55
  • @whuber - My argument is that the single limit must both exist and be independent of $k$ (when $k>0$) for the expectation to be well defined. My Cauchy example is exactly like this - the single limit exists for all $k>0$ but the limit depends on $k$. – probabilityislogic Aug 23 '13 at 15:56
  • Regardless of the success in handling some special cases like the Cauchy, your characterization is insufficient: the single limit can exist and be independent of $k$ even when the integral itself does not converge. – whuber Aug 23 '13 at 15:58
  • @whuber - I am curious: would having absolute convergence guarantee that the double limit "exists" - in the sense that all ways of calculating the integral give the same value? To put it another way, if there is no absolute convergence, is there ambiguity in calculating the double limit? It seems to me that the convention is to compute $\int_{-\infty}^a + \int_a^{\infty} $, and so there would be no ambiguity there. – renrenthehamster Aug 25 '13 at 09:00
  • Oh silly me. If $\int_{-\infty}^a$ and $\int_a^{\infty}$ are finite, then the integral has to converge absolutely. Okay, ignore my comment above. – renrenthehamster Aug 25 '13 at 09:34
0

My textbook uses this proof and I think it's simple!

[image: textbook proof, not transcribed]

alan23273850