
In Casella-Berger, *Statistical Inference*, page 234, Example 5.5.8, they define a sequence of uniform random variables $X_1, X_2, \cdots, X_n, \cdots$ such that $X_i \sim U(0,1)$, with $s \in [0,1]$ and:

$X_1(s) = s + I_{[0,1]}(s)$, $X_2(s) = s + I_{[0,\frac{1}{2}]}(s)$, $X_3(s) = s + I_{[\frac{1}{2},1]}(s)$

$X_4(s) = s + I_{[0,\frac{1}{3}]}(s)$, $X_5(s) = s + I_{[\frac{1}{3},\frac{2}{3}]}(s)$, $X_6(s) = s + I_{[\frac{2}{3},1]}(s)$

and so on ...

Then $X(s) = s$

Why does $X_n \to X$ in probability?
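Not from the book, but as a sanity check on the construction: a minimal Python sketch of this "typewriter" sequence, assuming the grouping the example suggests (groups of $1, 2, 3, \dots$ subintervals sweeping across $[0,1]$). The helper names `interval` and `X` and the 1-based indexing are my own.

```python
def interval(n):
    """(a_n, b_n) for the n-th variable (1-indexed) in the 'typewriter'
    sequence: groups of 1, 2, 3, ... subintervals [j/i, (j+1)/i]
    sweeping across [0, 1]."""
    i, start = 1, 1          # group size and smallest index in the group
    while n >= start + i:
        start += i
        i += 1
    j = n - start            # position within group i, 0 <= j <= i - 1
    return j / i, (j + 1) / i

def X(n, s):
    """X_n(s) = s + I_[a_n, b_n](s)."""
    a, b = interval(n)
    return s + 1.0 if a <= s <= b else s

# The first six intervals match the example:
# [0,1]; [0,1/2], [1/2,1]; [0,1/3], [1/3,2/3], [2/3,1].
print([interval(n) for n in range(1, 7)])
```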

t-student

2 Answers


They explain why in their example, but you can also see it by applying the definition of convergence in probability together with the fact that $S$ is uniformly distributed on $[0,1]$. Writing $[\alpha_n, \beta_n]$ for the interval in the indicator of $X_n$, we have that

$$ \Pr\left( |X_n(S) - X(S)| < \epsilon \right) \ge \Pr\left( S \notin \left[\alpha_n, \beta_n \right] \right) = 1 - \left( \beta_n - \alpha_n \right) \to 1$$

by construction, since the interval $[\alpha_n, \beta_n]$ shrinks to a null set in the limit. Hence we have convergence in probability. Note, however, that no matter how small the intervals become, every fixed $s$ falls into them infinitely often, so $X_n(s) = s + 1$ infinitely often and the sequence does not converge almost surely. It is precisely this difference that the example aims to demonstrate.
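Both halves of this claim can be illustrated numerically. Below is a hedged sketch (the helper `interval` and the grouping convention are my assumptions about the example's construction): the interval lengths $\beta_n - \alpha_n$ shrink to zero, yet a fixed point $s$ keeps landing inside some interval once per sweep, forever.

```python
def interval(n):
    """(a_n, b_n): the n-th indicator interval in the 'typewriter' sweep
    (groups of 1, 2, 3, ... subintervals partitioning [0, 1])."""
    i, start = 1, 1
    while n >= start + i:
        start += i
        i += 1
    j = n - start
    return j / i, (j + 1) / i

N = 2000
s = 2 ** -0.5  # a fixed sample point (irrational, so never an endpoint)

lengths = [b - a for a, b in (interval(n) for n in range(1, N + 1))]
hits = [n for n in range(1, N + 1)
        if interval(n)[0] <= s <= interval(n)[1]]

# beta_n - alpha_n -> 0, so P(|X_n - X| >= eps) -> 0 ...
print(f"last interval length: {lengths[-1]:.4f}")
# ... yet s lands in one interval per sweep, i.e. infinitely often,
# so X_n(s) = s + 1 recurs forever and X_n(s) does not converge to s.
print(f"number of hits up to n={N}: {len(hits)}")
```

Every sweep partitions $[0,1]$, so each sweep contributes at least one hit: convergence in probability without almost sure convergence.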

JohnK

I don't think that the $X_i$ are uniformly distributed. What they say is that the probability space is $S = [0,1]$ equipped with the uniform distribution, which we shall denote by $\mathbb{P}$. Thus, for any interval $[a,b]\subset [0,1]$, $\mathbb{P}([a,b]) = b-a$.

To show that $X_n$ converges to $X$ in probability, one needs to show that for any $\epsilon>0$, $\lim_{n\to\infty} \mathbb{P}(|X_n-X|\ge \epsilon) = 0$. By definition, $\mathbb{P}(|X_n-X|\ge \epsilon) = \mathbb{P}(\{s\in [0,1]\,:\, |X_n(s)-X(s)|\ge \epsilon\})$.

Now, by the definition of $X_n$, $X_n(s) = s + I_{[a_n,b_n]}(s)$ for some $0\le a_n<b_n\le 1$. In particular, $X_n(s)=X(s)$ for any $s\in [0,1]\setminus [a_n,b_n]$. Hence, for any $s\in [0,1]$, $|X_n(s)-X(s)|\ge \epsilon$ can only hold if $s\in [a_n,b_n]$ (otherwise $|X_n(s)-X(s)|=0<\epsilon$). In other words, $$\{s\in [0,1]\,:\, |X_n(s)-X(s)|\ge \epsilon \}\subset [a_n,b_n]$$ (in fact, equality also holds for $\epsilon \le 1$, but we do not need this for the proof).

It follows by monotonicity and then the definition of $\mathbb{P}$ that $\mathbb{P}(|X_n-X|\ge \epsilon) \le \mathbb{P}([a_n,b_n]) = b_n-a_n$. From the construction of $X_n$, it is clear that $b_n-a_n\to 0$ as $n\to \infty$, which finishes the proof.
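The bound $\mathbb{P}(|X_n - X| \ge \epsilon) \le b_n - a_n$ can also be checked by Monte Carlo. A small sketch, assuming the grouping the example suggests (the helper `interval` is my own name): for $\epsilon \in (0,1]$ the event $\{|X_n - X| \ge \epsilon\}$ is exactly $\{S \in [a_n, b_n]\}$, so the empirical frequency should track $b_n - a_n$.

```python
import random

def interval(n):
    # (a_n, b_n) for X_n(s) = s + I_[a_n, b_n](s) in the example's sweep
    i, start = 1, 1
    while n >= start + i:
        start += i
        i += 1
    j = n - start
    return j / i, (j + 1) / i

random.seed(0)
samples = [random.random() for _ in range(100_000)]  # draws of S ~ U[0, 1]

for n in (3, 30, 300, 3000):
    a, b = interval(n)
    # |X_n(S) - X(S)| = I_[a, b](S), so for any eps in (0, 1] the event
    # {|X_n - X| >= eps} is exactly {S in [a_n, b_n]}, of probability b - a.
    p_hat = sum(a <= t <= b for t in samples) / len(samples)
    print(f"n={n}: b_n - a_n = {b - a:.4f}, estimated prob = {p_hat:.4f}")
```

Both columns shrink together as $n$ grows, as the proof predicts.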

Csaba
  • Thanks for the answer, but I am still confused! I agree that for $X_1, X_4, \cdots$ and $X_3, X_6, \cdots$ their $[a_n, b_n] \to 0$. But for $X_2, X_5, \cdots$, their $[a_n, b_n] \to 1$. So even the sequence of $P(|X_n - X| > \epsilon)$ alternates between 0 and 1. – t-student Apr 28 '17 at 06:18
  • I lost you. I think you may be misunderstanding the construction of $X_n$. The $(X_n)$ come in groups of size $1$, $2$, $3$, $4$, etc. Letting the group index be $i=1,2,\dots$, the size of group $i$ is $i$, and the indices that – Csaba Apr 29 '17 at 18:51
  • I prematurely pressed enter... Sorry. Here is the full thing: Consider the partitioning of the index set $\{1,2,\dots\}$ of $(X_n)$ into the groups $G_1 = \{1\}$, $G_2 = \{2,3\}$, $G_3 = \{4,5,6\}$, $G_4 = \{7,8,9,10\}$, $\dots$. Group $G_i$ has $i$ elements. Then, for any $n$, there is a unique group $i\doteq i_n=\Omega(\sqrt{n})$ such that $n\in G_i$. Defining $j = n-\min G_i$, we have $0\le j \le i-1$ and $X_n(s) = s + I_{[j/i,(j+1)/i]}(s)$. Thus, $a_n = j/i$ and $b_n = (j+1)/i$. Hence, $b_n - a_n = 1/i=1/i_n \to 0$ as $n\to\infty$. – Csaba Apr 29 '17 at 18:59
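The closed form in this last comment can be checked mechanically. A small sketch (the function names are mine): invert the triangular numbers $T_i = i(i+1)/2$ to find the group containing $n$, and compare against a brute-force walk through the groups.

```python
import math

def group_closed_form(n):
    """Closed-form (i, j): n lies in G_i = {T_{i-1}+1, ..., T_i} with
    T_i = i(i+1)/2, and j = n - min(G_i) indexes the interval [j/i, (j+1)/i]."""
    i = math.ceil((math.sqrt(8 * n + 1) - 1) / 2)
    j = n - (i * (i - 1) // 2 + 1)
    return i, j

def group_brute_force(n):
    # walk through groups of size 1, 2, 3, ... until the one containing n
    i, start = 1, 1
    while n >= start + i:
        start += i
        i += 1
    return i, n - start

assert all(group_closed_form(n) == group_brute_force(n) for n in range(1, 5000))
print(group_closed_form(5))  # -> (3, 1): X_5 has interval [1/3, 2/3]
```

Since $b_n - a_n = 1/i_n$ and $i_n$ grows like $\sqrt{2n}$, the interval lengths indeed tend to 0 along the whole sequence, with no alternating subsequence.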