1

How would I go about proving the following statement:

Given two random variables, $X$ and $Y$, if $E(h(X)g(Y)) = E(h(X))E(g(Y))$ for all bounded and continuous $h$ and $g$, then $X$ and $Y$ are independent.

I have no idea how to begin the proof. Please provide any pertinent references.

Jim
  • 165
  • 5
  • 2
    The title is a little misleading. Random variables $X$ and $Y$ are more than just uncorrelated -- they enjoy the stronger property that every bounded continuous function $g(X)$ of $X$ and every bounded continuous function $h(Y)$ of $Y$ are also uncorrelated random variables. – Dilip Sarwate Sep 09 '12 at 13:35
  • 2
    Hi, Jim. Welcome to the site! Is this, by chance, a homework exercise? If so, we should mark it with the [tag:homework] tag. They receive somewhat special treatment on this site. – cardinal Sep 09 '12 at 13:45
  • I suggest that you start by the definition of statistical independence: $\Pr(A \cap B) = \Pr(A)\cdot\Pr(B).$ – abaumann Sep 09 '12 at 11:59

2 Answers

5

Looks like a measure-theoretic homework problem. Here are some tips to help you construct your proof.

Work Backwards. Below are suggestions for the last, penultimate, antepenultimate, etc., steps.

The last step of the proof should be $$ P\{X\leq x,Y\leq y\} = P\{X\leq x\}P\{Y\leq y\} \, , $$ for every $x,y\in\mathbb{R}$. This is our goal because it is equivalent, as you may check in any measure theoretic probability textbook, to the usual definition of independence between $X$ and $Y$ in terms of the independence between the sigma-fields generated by $X$ and $Y$.
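(If you want to see why that equivalence holds, here is a sketch, assuming Dynkin's $\pi$-$\lambda$ theorem is available to you: the events $\{X\leq x\}$, $x\in\mathbb{R}$, form a $\pi$-system that generates $\sigma(X)$, and likewise for $Y$; once the product rule holds on these two $\pi$-systems, the $\pi$-$\lambda$ theorem extends it to $$ P(A\cap B) = P(A)P(B) \quad \text{for every } A\in\sigma(X),\ B\in\sigma(Y) \, , $$ which is the definition of independence of the generated sigma-fields.)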

The penultimate step should be the same thing stated in terms of expectations and indicators: $$ \mathrm{E}\left[ I_{(-\infty,x]}(X)I_{(-\infty,y]}(Y)\right] = \mathrm{E}\left[I_{(-\infty,x]}(X)\right] \mathrm{E}\left[I_{(-\infty,y]}(Y)\right] \, . $$ Can you check that this implies the last step?
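(Hint for that check: the expectation of an indicator is the probability of the corresponding event, and the product of the two indicators is the indicator of the intersection: $$ \mathrm{E}\left[ I_{(-\infty,x]}(X)I_{(-\infty,y]}(Y)\right] = \mathrm{E}\left[ I_{\{X\leq x,\, Y\leq y\}}\right] = P\{X\leq x, Y\leq y\} \, , $$ and each factor on the right-hand side reduces to $P\{X\leq x\}$ and $P\{Y\leq y\}$ in the same way.)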

Now, if, for arbitrary ("fixed") $x,y\in\mathbb{R}$, we define $g(t)=I_{(-\infty,x]}(t)$ and $h(t)=I_{(-\infty,y]}(t)$, the penultimate step can be rewritten as $$ \mathrm{E}\left[ g(X)h(Y)\right] = \mathrm{E}\left[g(X)\right] \mathrm{E}\left[h(Y)\right] \, . $$ Unfortunately, we can't say that this holds by hypothesis, because our $g$ and $h$ are not continuous.

But, if you draw the graph of, for example, $g$, you will figure out that $g$ can be approximated (pointwise) by a (monotone) sequence $\{g_n\}_{n=1}^\infty$ of bounded continuous (nonnegative) functions.

[Figure: graph of the indicator $g = I_{(-\infty,x]}$ together with continuous approximating functions $g_n$.]
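For concreteness, one candidate (a sketch, not necessarily the sequence in the figure) is the piecewise linear function $$ g_n(t) = \left\{\begin{array}{ll} 1, & \mbox{if } t\leq x,\\ 1-n(t-x), & \mbox{if } x<t<x+\frac{1}{n},\\ 0, & \mbox{if } t\geq x+\frac{1}{n},\end{array}\right. $$ which is continuous, takes values in $[0,1]$, and converges pointwise to $I_{(-\infty,x]}(t)$ as $n\to\infty$.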

So, building the analogous sequence $\{h_n\}_{n=1}^\infty$, we know, by hypothesis, that $$ \mathrm{E}\left[ g_n(X)h_n(Y)\right] = \mathrm{E}\left[g_n(X)\right] \mathrm{E}\left[h_n(Y)\right] \, , \qquad (*) $$ for every $n\geq 1$.

If we denote the lhs of $(*)$ by $a_n$ and the rhs by $b_n$, we have two sequences of real numbers $\{a_n\}_{n=1}^\infty$ and $\{b_n\}_{n=1}^\infty$ such that $a_n=b_n$ for every $n\geq 1$. What can we say about the limits of the sequences $\{a_n\}_{n=1}^\infty$ and $\{b_n\}_{n=1}^\infty$?

Now, to finish (begin?) the proof: What theorems have you studied that allow you to interchange the order of taking limits with expectation? Will one of them do the job?
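(A sketch of one way that last step can go, assuming you settle on the dominated convergence theorem: the integrands $g_n(X)h_n(Y)$, $g_n(X)$ and $h_n(Y)$ all take values in $[0,1]$, hence are dominated by the constant $1$, so $$ a_n = \mathrm{E}\left[ g_n(X)h_n(Y)\right] \to \mathrm{E}\left[ g(X)h(Y)\right] \, , \qquad b_n = \mathrm{E}\left[g_n(X)\right] \mathrm{E}\left[h_n(Y)\right] \to \mathrm{E}\left[g(X)\right] \mathrm{E}\left[h(Y)\right] \, , $$ and since $a_n=b_n$ for every $n$, the two limits agree, which is exactly the penultimate step.)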

P.S. Of course, this "Work Backwards" idea is a heuristic strategy. At the end you must rewrite the proof in the right order, from beginning to end. Please don't present it in the heuristic order $$ Z \Leftarrow \dots \Leftarrow B \Leftarrow A \, . $$ You have to present it formally, with all the details, as $$ A \Rightarrow B \Rightarrow \dots \Rightarrow Z \, . $$ Your teacher will probably not accept the "reversed" draft.

Zen
  • 24,121
  • 2
    +1 A little detail: your sequence $g_n$ is not monotone. Maybe it needs a little adjusting... Re centering images: please see http://meta.stackexchange.com/questions/25835. (Bottom line: it's probably not worth it.) – whuber Sep 09 '12 at 21:08
  • 1
    (+1) @whuber and Zen: There are two choices one can make in the adjustment: (a) Adjust the picture (perhaps tedious) or (b) adjust the choice of theorem. :-) – cardinal Sep 09 '12 at 21:37
  • 1
    Thank you both, whuber and cardinal. I will take cardinal's "dominated", oops, sorry, I mean, easier fix. – Zen Sep 09 '12 at 21:44
  • @Zen, whuber, and cardinal, thanks for all your help! The exercise is 29.6 in the textbook Probability and Measure, Billingsley, 3rd edition; the only hint given is "characteristic functions". I have no idea how to extend the proof to every bounded continuous $g$ and $h$. Anyway, I got all the tips you provided, thanks again. – Jim Sep 10 '12 at 09:25
0

$$ P(X \leq x) = E(1_{(-\infty, x]}(X)) \leq E(g_{x,\epsilon}(X)) \leq E(1_{(-\infty, x+\epsilon]}(X))=P(X \leq x+\epsilon) \, , $$ where $$ g_{x,\epsilon}(z)=\left\{\begin{array}{ll} 1, & \mbox{if } z\leq x,\\ \mbox{linear}, & \mbox{if } x<z<x+\epsilon, \\ 0, & \mbox{if } z\geq x+\epsilon.\end{array}\right. $$ Similarly, $$ P(Y \leq y) = E(1_{(-\infty, y]}(Y)) \leq E(h_{y,\epsilon}(Y)) \leq E(1_{(-\infty, y+\epsilon]}(Y))=P(Y \leq y+\epsilon) \, , $$ where $$ h_{y,\epsilon}(z)=\left\{\begin{array}{ll} 1, & \mbox{if } z\leq y,\\ \mbox{linear}, & \mbox{if } y<z<y+\epsilon, \\ 0, & \mbox{if } z\geq y+\epsilon,\end{array}\right. $$ and $$ P(X \leq x, Y \leq y) = E(1_{(-\infty, x]}(X)1_{(-\infty, y]}(Y)) \leq E(g_{x,\epsilon}(X)h_{y,\epsilon}(Y)) \leq E(1_{(-\infty, x+\epsilon]}(X)1_{(-\infty, y+\epsilon]}(Y))=P(X \leq x+\epsilon,Y \leq y+\epsilon) \, . $$
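(For concreteness, one explicit way to write the "linear" piece, which is an assumption about the intended interpolation, is $$ g_{x,\epsilon}(z) = \min\left\{1,\, \max\left\{0,\, 1-\frac{z-x}{\epsilon}\right\}\right\} \, , $$ and analogously for $h_{y,\epsilon}$; this makes $g_{x,\epsilon}$ continuous and bounded by $1$, as required by the hypothesis.)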

Letting $\epsilon \to 0$: the functions $g_{x,\epsilon}$ and $h_{y,\epsilon}$ are bounded by $1$ and converge pointwise to $1_{(-\infty, x]}$ and $1_{(-\infty, y]}$ respectively, so by the dominated convergence theorem $$E(g_{x,\epsilon}(X)) \to P(X \leq x), $$ $$E(h_{y,\epsilon}(Y)) \to P(Y \leq y), $$ and, since $g_{x,\epsilon}$ and $h_{y,\epsilon}$ are bounded and continuous, the hypothesis gives $$E(g_{x,\epsilon}(X))E(h_{y,\epsilon}(Y))=E(g_{x,\epsilon}(X)h_{y,\epsilon}(Y)) \to P(X \leq x, Y \leq y). $$ Hence,

$$P(X \leq x, Y \leq y) = P(X \leq x)P(Y \leq y) $$ for all $x, y \in \mathbb{R}$, i.e., $X$ and $Y$ are independent.

Jim
  • 165
  • 5