
https://en.wikipedia.org/wiki/Convergence_of_random_variables

Wikipedia defines convergence in probability as follows:

A sequence $X_n$ of random variables converges in probability towards the random variable $X$ if, for all $\epsilon > 0$,

$\lim_{n\to\infty}\Pr(|X_n-X|\gt\epsilon)=0$.

In particular, I don't know how to evaluate $X_n-X$. Is it necessary to consider the joint distribution of $X_n$ and $X$? I don't think so, but then I don't know what $X_n-X$ actually means.

Is the following evaluation correct if there are probability density functions $f_{X_n}$ of $X_n$ and $f_X$ of $X$?

\begin{align} \Pr(|X_n-X|\gt\epsilon)&=\int_{-\infty}^\infty\left(\int_{-\infty}^{x-\epsilon} f_{X_n}(y)dy+\int_{x+\epsilon}^\infty f_{X_n}(y)dy\right)f_X(x)dx \end{align}

EDIT

I just want to know how to calculate $\Pr$ in the convergence-in-probability formula for given random variables. The usual calculation of $\Pr$ doesn't seem to converge to zero in most cases as $n \to \infty$ — even when $X_n$ equals $X$!?

  • If $X$ is not a constant but a truly random variable, the joint distribution of $(X_n,X)$ must be considered, indeed. – Xi'an May 22 '18 at 13:21
  • "$X_n - X$" is, by definition, the sum of the random variables $X_n$ and $-X$. See https://stats.stackexchange.com/questions/95993 for intuitive accounts of what a sum of random variables means. – whuber May 22 '18 at 13:24
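Both comments can be illustrated numerically. Below is a minimal Monte Carlo sketch (pure Python; the coupling $X_n = X + Z/n$ and all names are my own illustrative choices, not from the thread). When $X_n$ is defined on the same outcome $\omega$ as $X$, $\Pr(|X_n - X| > \epsilon)$ shrinks to zero; for an independently drawn copy with the same marginal distribution, it does not:

```python
import random

random.seed(0)
TRIALS = 50_000
EPS = 0.1

def prob_exceeds(n, coupled):
    """Monte Carlo estimate of Pr(|X_n - X| > EPS)."""
    count = 0
    for _ in range(TRIALS):
        x = random.gauss(0.0, 1.0)                 # X ~ N(0, 1)
        if coupled:
            # X_n = X + Z/n: built from the same omega as X
            xn = x + random.gauss(0.0, 1.0) / n
        else:
            # X_n ~ N(0, 1), drawn independently of X
            xn = random.gauss(0.0, 1.0)
        if abs(xn - x) > EPS:
            count += 1
    return count / TRIALS

for n in (1, 10, 100):
    print(n, prob_exceeds(n, coupled=True), prob_exceeds(n, coupled=False))
```

In the coupled case the estimates drop towards zero as $n$ grows; in the independent case $X_n - X \sim N(0,2)$, so the estimate stays near $2(1-\Phi(0.1/\sqrt 2)) \approx 0.94$ for every $n$. The marginal distributions alone cannot distinguish the two cases.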

1 Answer


I found out that the formal definition of the convergence in probability is as follows:

\begin{align} \forall\epsilon>0,\quad \lim_{n\to\infty}\Pr\left(\{\omega\in\Omega : |X_n(\omega)-X(\omega)|\gt\epsilon\}\right)=0. \end{align}

And if you define the sample space $\Omega$ as $[0, 1]$, $X(\omega)$ becomes $F_X^{-1}(\omega)$ for $\omega \in \Omega$, where $F_X$ is the cumulative distribution function of $X$.

Therefore, the condition for convergence in probability, written with cumulative distribution functions, becomes:

\begin{align} \forall\epsilon>0,\ \lim_{n\to\infty}\int_0^1 H\left(\left|F_{X_n}^{-1}(s)-F_X^{-1}(s)\right|-\epsilon\right)ds&=0. \end{align}

where $H(t)$ is the unit step function.
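This integral is easy to approximate numerically. A toy sketch (my own example, not from the thread): take $X \sim U(0,1)$, so $F_X^{-1}(s) = s$, and a hypothetical sequence with $F_{X_n}^{-1}(s) = s^{1+1/n}$. The quantile functions converge uniformly, so the integral drops to zero once $\max_s |s^{1+1/n} - s| < \epsilon$:

```python
def H(t):
    """Unit step function."""
    return 1.0 if t > 0 else 0.0

def quantile_integral(n, eps, grid=50_000):
    """Midpoint-rule approximation of
    integral_0^1 H(|F_{X_n}^{-1}(s) - F_X^{-1}(s)| - eps) ds
    for the toy case F_X^{-1}(s) = s, F_{X_n}^{-1}(s) = s**(1 + 1/n)."""
    total = 0.0
    for k in range(grid):
        s = (k + 0.5) / grid
        total += H(abs(s ** (1 + 1 / n) - s) - eps)
    return total / grid

for n in (1, 5, 10, 100):
    print(n, quantile_integral(n, eps=0.05))
```

For $\epsilon = 0.05$ the integral is about $0.89$ at $n=1$ and reaches exactly $0$ by $n=10$, since $\max_s|s^{1.1}-s| \approx 0.035 < 0.05$. Note, as the comments below point out, that this quantile coupling only captures one particular joint construction, not convergence in probability in general.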

  • Note that 1. the formal definition at the beginning of this answer is essentially the same as the one in the question; 2. the construction $\Omega = [0,1],~X(\omega)=F_{X}^{-1}(\omega)$ does not make much sense if you want to consider more than one random variable (hint: can you e.g. represent two independent $U(0,1)$ random variables with this construction?); 3. thus, the latter condition does not work in general. – Juho Kokkala May 24 '18 at 18:59
  • Thanks a lot for the comment. If you need two independent random variables according to $U(0,1)$, you can use $X_1(\omega_1)=\omega_1$ and $X_2(\omega_2)=\omega_2$ with $\omega_1, \omega_2 \in [0, 1]$. Convergence in probability requires the values of $X, X_n$ to be evaluated at the same point $\omega$. – J. Fred May 25 '18 at 01:22
  • But I asked for random variables defined on the same probability space; there is no "different $\omega$" for each random variable, whatever that would mean. (In each ticket in https://stats.stackexchange.com/questions/50/what-is-meant-by-a-random-variable/54894#54894, a value of $X_1$ and a value of $X_2$ are written on the same ticket.) I suspect this question (and this mistaken answer) partially stem from unfamiliarity with the formal definitions of probability spaces and random variables. – Juho Kokkala May 25 '18 at 18:41
  • (Yes, you need to consider the joint distribution of $X_n$ and $X$ to evaluate $\Pr(|X_n-X|\gt\epsilon)$, while the last integral in the answer depends only on the marginal distributions.) – Juho Kokkala May 25 '18 at 20:08
  • Thanks again. It is said that the t-distribution with $n$ degrees of freedom converges to the standard normal distribution as $n \to \infty$. Is there any joint distribution supposed between them? I can't see any, and I'm confused. If $X \sim N(0,1)$, $X_n \sim t(n)$, and they are independent, I don't think $\Pr(|X_n - X| \gt \epsilon)$ converges to zero as $n \to \infty$. What kind of joint distribution is supposed behind them? – J. Fred May 25 '18 at 23:50
  • Is it said that those $X_n$ converge in probability to $X$? Do you have a reference? Also, if that example is what prompted the question, you might want to edit it into the question itself. – Juho Kokkala May 28 '18 at 17:18
  • I didn't think about that! I assumed it because they are well-known continuous distributions. Do they converge in distribution but not in probability? https://en.wikipedia.org/wiki/Student%27s_t-distribution – J. Fred May 29 '18 at 11:17
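Regarding the $t$ vs. normal example in the last comments: there is a standard coupling (my own illustration, not from the thread) under which $X_n \sim t(n)$ does converge in probability to $X \sim N(0,1)$. Take $X_n = X/\sqrt{V_n/n}$ with $V_n \sim \chi^2(n)$ independent of $X$; then $X_n$ has a $t(n)$ distribution, and since $V_n/n \to 1$, $X_n \to X$ in probability. An independently drawn $t(n)$ variable, by contrast, does not converge to $X$:

```python
import random

random.seed(1)

def chi2(n):
    """Sample a chi-square(n) variate as a sum of n squared standard normals."""
    return sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n))

def coupled_prob(n, eps=0.2, trials=5_000):
    """Pr(|X_n - X| > eps) with X_n = X / sqrt(V_n / n): X_n ~ t(n),
    but defined on the same probability space as X."""
    count = 0
    for _ in range(trials):
        x = random.gauss(0.0, 1.0)
        xn = x / (chi2(n) / n) ** 0.5
        if abs(xn - x) > eps:
            count += 1
    return count / trials

def independent_prob(n, eps=0.2, trials=5_000):
    """Pr(|X_n - X| > eps) with X_n ~ t(n) drawn independently of X."""
    count = 0
    for _ in range(trials):
        x = random.gauss(0.0, 1.0)
        xn = random.gauss(0.0, 1.0) / (chi2(n) / n) ** 0.5
        if abs(xn - x) > eps:
            count += 1
    return count / trials

for n in (5, 50, 200):
    print(n, coupled_prob(n), independent_prob(n))
```

The coupled estimates shrink towards zero while the independent ones stay roughly constant. This mirrors the classical representation of the $t$ distribution; the textbook statement "$t(n) \to N(0,1)$" by itself is only convergence in distribution, and whether $X_n \to X$ in probability depends entirely on which joint construction you choose.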