I don't think that the $X_i$ are uniformly distributed.
What they say is that the probability space is $S = [0,1]$, equipped with the uniform distribution, which we shall denote by $\mathbb{P}$. Thus, for any interval $[a,b]\subset [0,1]$, $\mathbb{P}([a,b]) = b-a$.
To show that $X_n$ converges to $X$ in probability, one needs to show that for any $\epsilon>0$, $\lim_{n\to\infty} \mathbb{P}(|X_n-X|\ge \epsilon) = 0$. By definition, $\mathbb{P}(|X_n-X|\ge \epsilon) = \mathbb{P}(\{s\in [0,1]\,:\, |X_n(s)-X(s)|\ge \epsilon\})$.

Now, by the definition of $X_n$, $X_n(s) = s + I_{[a_n,b_n]}(s)$ for some $0\le a_n<b_n\le 1$. In particular, $X_n(s)=X(s)$ for every $s\in [0,1]\setminus [a_n,b_n]$. Hence $|X_n(s)-X(s)|\ge \epsilon$ can only hold if $s\in [a_n,b_n]$ (otherwise $|X_n(s)-X(s)|=0<\epsilon$). In other words, $$\{s\in [0,1]\,:\, |X_n(s)-X(s)|\ge \epsilon \}\subset [a_n,b_n]$$ (in fact, for $\epsilon\le 1$ equality also holds, but we do not need this for the proof).

It follows, first by the monotonicity of $\mathbb{P}$ and then by its definition, that $\mathbb{P}(|X_n-X|\ge \epsilon) \le \mathbb{P}([a_n,b_n]) = b_n-a_n$. From the construction of $X_n$, it is clear that $b_n-a_n\to 0$ as $n\to \infty$, which finishes the proof.
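If you want to see the bound numerically, here is a minimal Monte Carlo sketch. Since the concrete construction of $X_n$ is not reproduced above, it *assumes* the shrinking intervals $[a_n,b_n]=[0,1/n]$ purely for illustration; only the fact $b_n-a_n\to 0$ matters for the argument.

```python
import numpy as np

# Numerical illustration of P(|X_n - X| >= eps) <= b_n - a_n.
# The choice [a_n, b_n] = [0, 1/n] is an assumption for this sketch.

rng = np.random.default_rng(0)
eps = 0.5
num_samples = 1_000_000

# Draw s uniformly from the probability space S = [0, 1].
s = rng.uniform(0.0, 1.0, size=num_samples)

for n in [1, 10, 100, 1000]:
    a_n, b_n = 0.0, 1.0 / n                  # assumed shrinking interval
    X = s                                    # X(s) = s
    X_n = s + ((s >= a_n) & (s <= b_n))      # X_n(s) = s + I_[a_n, b_n](s)
    freq = np.mean(np.abs(X_n - X) >= eps)   # empirical P(|X_n - X| >= eps)
    print(f"n={n:5d}  empirical={freq:.6f}  bound b_n - a_n={b_n - a_n:.6f}")
```

For $\epsilon\le 1$ the empirical frequency should essentially match the bound $b_n-a_n$, since $|X_n-X| = I_{[a_n,b_n]}$ equals $1\ge\epsilon$ exactly on $[a_n,b_n]$; both columns shrink to $0$ as $n$ grows.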