
Let $Y_1, Y_2, ..., Y_n$ be iid random variables and $B_1, B_2, ..., B_n$ be Borel sets. It follows that

$P(\bigcap_{i=1}^{n} (Y_i \in B_i)) = \prod_{i=1}^{n} P(Y_i \in B_i)$ ... I think?

If so, does the converse hold? My Stochastic Calculus professor says it does (or maybe I misinterpreted him somehow?), but I was under the impression that independence of the $n$ random variables was equivalent to saying that for any indices $i_1, i_2, \dots, i_k$, $P(\bigcap_{j \in \{i_1, \dots, i_k\}} (Y_j \in B_j)) = \prod_{j \in \{i_1, \dots, i_k\}} P(Y_j \in B_j)$.

So, if the RVs are independent, then we can choose $i_j = j$ and $k = n$ to get $P(\bigcap_{i=1}^{n} (Y_i \in B_i)) = \prod_{i=1}^{n} P(Y_i \in B_i)$. But given only $P(\bigcap_{i=1}^{n} (Y_i \in B_i)) = \prod_{i=1}^{n} P(Y_i \in B_i)$, I don't know how to conclude that for any indices $i_1, i_2, \dots, i_k$, $P(\bigcap_{j \in \{i_1, \dots, i_k\}} (Y_j \in B_j)) = \prod_{j \in \{i_1, \dots, i_k\}} P(Y_j \in B_j)$, if that's even the right definition.
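(To convince myself that one instance of the product identity, for a single fixed choice of the $B_i$, can't be enough on its own, I put together the toy check below. It's only a sketch in Python; the probability space and the events $A$, $B$, $C$ are my own made-up example, not from any book.)

```python
from fractions import Fraction

# Uniform probability space with 8 equally likely outcomes.
Omega = set(range(8))

def P(event):
    return Fraction(len(event & Omega), len(Omega))

# Three events of probability 1/2 each, rigged so that the triple
# intersection has probability exactly (1/2)^3 = 1/8.
A = {1, 2, 3, 4}
B = {1, 2, 3, 5}
C = {3, 5, 6, 7}

# With Y_1 = 1_A, Y_2 = 1_B, Y_3 = 1_C and B_1 = B_2 = B_3 = {1}, this is the
# n-fold product identity for ONE particular choice of Borel sets:
assert P(A & B & C) == P(A) * P(B) * P(C)      # 1/8 == 1/8

# ...and yet Y_1 and Y_2 are not independent, so the Y_i are certainly
# not mutually independent:
assert P(A & B) != P(A) * P(B)                 # 3/8 != 1/4
print("triple product holds for one choice of sets, yet pairwise independence fails")
```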

p.17 here seems to suggest otherwise. idk

Help please?

Also, regarding the following answer (originally attached here as two screenshots, not reproduced):

So, that answer uses the $\Omega$ part to establish pairwise independence and ultimately conclude independence. Without that assumption, we cannot conclude independence. Is that right? Why does that not contradict the definition of independence, $P(\bigcap_{i=1}^{n} (Y_i \in B_i)) = \prod_{i=1}^{n} P(Y_i \in B_i)$?

BCLC
  • 2,424

1 Answer


In the definition of mutual independence there are universal quantifiers for the Borel sets $B_i$. For example, if you have three mutually independent random variables, then $$ P(\{X_1\in B_1\}\cap\{X_2\in B_2\}\cap\{X_3\in B_3\}) = P\{X_1\in B_1\}P\{X_2\in B_2\}P\{X_3\in B_3\}\, , $$ for all Borel sets $B_1$, $B_2$, $B_3$. You don't have to impose additionally that, say, $$ P(\{X_1\in B_1\}\cap\{X_3\in B_3\}) = P\{X_1\in B_1\}P\{X_3\in B_3\}\, , $$ because this is equivalent to choosing $B_2=\mathbb{R}$ in the first formula.
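To see the $B_2=\mathbb{R}$ substitution in action, here is a tiny numerical sanity check (my own sketch in Python, with three independent fair coin flips standing in for general random variables, so that the full range $\{0,1\}$ plays the role of $\mathbb{R}$):

```python
from fractions import Fraction
from itertools import product

# Three independent fair coin flips on a finite space; X_i is coordinate i.
Omega = list(product([0, 1], repeat=3))

def P(event):
    return Fraction(len(event), len(Omega))   # uniform probability

B1, B3 = {1}, {0}        # any subsets of the range will do
full = {0, 1}            # the whole range, playing the role of B2 = R

pair = [w for w in Omega if w[0] in B1 and w[2] in B3]
triple = [w for w in Omega if w[0] in B1 and w[1] in full and w[2] in B3]

# {X_2 in R} is the whole sample space, so intersecting with it changes nothing...
assert set(pair) == set(triple)

# ...and P{X_2 in R} = 1, so the 3-factor product collapses to the 2-factor one.
m1 = P([w for w in Omega if w[0] in B1])
m2 = P([w for w in Omega if w[1] in full])
m3 = P([w for w in Omega if w[2] in B3])
assert P(triple) == m1 * m2 * m3 == m1 * m3 == P(pair)
```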

Definition. The random variables $X_1,\dots,X_n$ are mutually independent if and only if $$ P\left( \bigcap_{i=1}^n \{ X_i\in B_i\}\right) = \prod_{i=1}^n P\{ X_i\in B_i\} \, , $$ for all Borel sets $B_1,\dots,B_n$.

Proposition. The random variables $X_1,\dots,X_n$ are mutually independent if and only if for any choice of distinct indices $i_1,\dots,i_k\in\{1,\dots,n\}$ we have $$ P\left(\bigcap_{i\in\{i_1,\dots,i_k\}}\{X_i\in B_i\}\right) = \prod_{i\in\{i_1,\dots,i_k\}} P\{X_i\in B_i\} \, , $$ for all Borel sets $B_{i_1},\dots,B_{i_k}$.

Proof. Sufficiency is immediate: taking $k=n$ and $i_j=j$ gives exactly the condition in the definition. To prove necessity, remembering that $\{X_i\in\mathbb{R}\}=\Omega$, we have $$ \begin{align} P&\left(\bigcap_{i\in\{i_1,\dots,i_k\}}\{X_i\in B_i\}\right) \\ &= P\left(\left(\bigcap_{i\in\{i_1,\dots,i_k\}}\{X_i\in B_i\}\right) \cap \left(\bigcap_{j\in\{1,\dots,n\}\setminus\{i_1,\dots,i_k\}}\{X_j\in \mathbb{R}\}\right)\right) \\ &= \prod_{i\in\{i_1,\dots,i_k\}}P\{X_i\in B_i\} \times \prod_{j\in\{1,\dots,n\}\setminus\{i_1,\dots,i_k\}}P\{X_j\in \mathbb{R}\} \\ &= \prod_{i\in\{i_1,\dots,i_k\}}P\{X_i\in B_i\} \, . \end{align} $$ The second equality uses the definition of mutual independence, with $B_j=\mathbb{R}$ for every index $j$ outside $\{i_1,\dots,i_k\}$.
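For what it's worth, here is an enumerative check of the necessity direction (a Python sketch of my own, on a finite example where three fair coin flips play the role of $X_1,\dots,X_n$ and subsets of $\{0,1\}$ play the role of the Borel sets):

```python
from fractions import Fraction
from itertools import product, combinations

# Three independent fair coin flips; X(i, w) is the i-th coordinate of outcome w.
Omega = list(product([0, 1], repeat=3))
prob = {w: Fraction(1, 8) for w in Omega}

def P(event):
    return sum((prob[w] for w in event), Fraction(0))

def X(i, w):
    return w[i]

def subsets(values):
    vals = list(values)
    return [set(c) for r in range(len(vals) + 1) for c in combinations(vals, r)]

# For every choice of distinct indices i_1 < ... < i_k and every choice of
# subsets B_i of the (finite) ranges, the k-fold product formula must hold.
n = 3
ranges = [sorted({X(i, w) for w in Omega}) for i in range(n)]
for k in range(1, n + 1):
    for idx in combinations(range(n), k):
        for Bs in product(*(subsets(ranges[i]) for i in idx)):
            joint = P([w for w in Omega
                       if all(X(i, w) in B for i, B in zip(idx, Bs))])
            marginals = Fraction(1)
            for i, B in zip(idx, Bs):
                marginals *= P([w for w in Omega if X(i, w) in B])
            assert joint == marginals

print("every sub-collection product identity holds, as the proposition predicts")
```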

Zen
  • 24,121
  • I sort of get what you're saying, but isn't it that $P(A\cap B\cap C) = P(A)P(B)P(C)$ doesn't imply pairwise independence? – BCLC Oct 06 '14 at 02:42
  • Yes, but you don't have the quantifiers "for every..." doing the trick in this case. For example, suppose I tell you that $P(A\cap B\cap C)=P(A)P(B)P(C)$ holds for every event $B$; may we say that $A$ and $C$ are independent? Yes, because the former equality must hold for $B=\Omega$. – Zen Oct 06 '14 at 03:16
  • Do you need a more detailed proof of the proposition in the answer? – Zen Oct 06 '14 at 03:18
  • Yes please? I guess my StoCal prof didn't write down "for every...", so I was confused. I think he said it but didn't write it down. – BCLC Oct 06 '14 at 07:08
  • Zen, there's this exercise in my book that sort of seems to contradict this. Help please? http://math.stackexchange.com/questions/956869/mutual-independence-definition-clarificaiton#comment1969237_957085 – BCLC Oct 06 '14 at 07:11
  • Is it clearer now? I believe this is what your Stochastic Calculus Professor had in mind. – Zen Oct 06 '14 at 12:35
  • Check the proof of the proposition carefully. It should answer the part of your question that says "but given... I don't know how to conclude...". – Zen Oct 06 '14 at 12:37
  • Your real difficulty may be more related to Logic, particularly the use of Universal Instantiation: if you assume at some point that $\forall x\in A,\, Q(x)$, then later in your argument, if you have a particular instance $x^*\in A$, you're allowed to conclude that $Q(x^*)$. – Zen Oct 06 '14 at 13:07
  • About the exercise you mentioned, can you give us the name of the book, page, and number of the exercise? – Zen Oct 06 '14 at 17:05
  • Zen, it's Probability with Martingales, Exercise 4.1. Thanks so much :) – BCLC Oct 06 '14 at 18:01
  • Logic was the problem. This answers it: 'Yes, but you don't have the quantifiers "for every..." doing the trick in this case. For example, if I tell you that P(A∩B∩C)=P(A)P(B)P(C) holds for every event B. May we say that A and C are independent? Yes, because the former equality must hold for B=Ω.' Thanks hahahaha – BCLC Aug 09 '15 at 13:13