https://en.wikipedia.org/wiki/Convergence_of_random_variables
Wikipedia defines convergence in probability as follows:
A sequence $X_n$ of random variables converges in probability towards the random variable $X$ if for all $\epsilon > 0$
$\lim_{n\to\infty}\Pr(|X_n-X|\gt\epsilon)=0$.
In particular, I don't know how to evaluate $X_n-X$. Is it necessary to consider the joint distribution of $X_n$ and $X$? I don't think so, but then I don't know what $X_n-X$ actually means.
Is the following evaluation correct if $X_n$ and $X$ have probability density functions $f_{X_n}$ and $f_X$, respectively?
\begin{align} \Pr(|X_n-X|\gt\epsilon)&=\int_{-\infty}^\infty\left(\int_{-\infty}^{x-\epsilon} f_{X_n}(y)dy+\int_{x+\epsilon}^\infty f_{X_n}(y)dy\right)f_X(x)dx \end{align}
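If I treat $X_n$ and $X$ as independent (which is what the double integral above seems to assume), I can evaluate it numerically for a toy case of my own choosing, say $X_n$ and $X$ both standard normal:

```python
import numpy as np
from math import erf, sqrt

eps = 0.5

def phi(y):  # standard normal pdf
    return np.exp(-y**2 / 2) / np.sqrt(2 * np.pi)

def Phi(y):  # standard normal cdf
    return 0.5 * (1 + erf(y / sqrt(2)))

# Inner bracket of the formula: Pr(Y < x - eps) + Pr(Y > x + eps)
# for Y ~ N(0,1), then integrate against the density of X.
xs = np.linspace(-8, 8, 4001)
dx = xs[1] - xs[0]
inner = np.array([Phi(x - eps) + 1 - Phi(x + eps) for x in xs])
p = np.sum(inner * phi(xs)) * dx

# Closed form for independent copies: Y - X ~ N(0, 2),
# so Pr(|Y - X| > eps) = 2 * (1 - Phi(eps / sqrt(2))).
exact = 2 * (1 - Phi(eps / sqrt(2)))
print(p, exact)  # both ≈ 0.72
```

The two numbers agree, and notably the value does not depend on $n$ at all, so it cannot tend to $0$.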
EDIT
I just want to know how to calculate the probability in the convergence-in-probability formula for given random variables. The usual calculation of $\Pr$, as above, doesn't seem to converge to zero in most cases as $n \to \infty$, even when $X_n$ equals $X$!?
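For contrast, here is a quick Monte Carlo sketch of a toy example of my own ($X \sim N(0,1)$ and $X_n = X + Z/n$ with $Z \sim N(0,1)$, both defined on the same sample space), estimating the probability directly from joint samples:

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.5
N = 200_000

# X and Z are drawn once; X_n = X + Z/n is a function of the
# same outcome, so X_n - X = Z/n shrinks deterministically in n.
X = rng.standard_normal(N)
Z = rng.standard_normal(N)

results = {}
for n in (1, 10, 100):
    Xn = X + Z / n
    results[n] = np.mean(np.abs(Xn - X) > eps)
    print(n, results[n])
```

Here the estimated probability does go to zero as $n$ grows, unlike the independent-copy calculation, which is what confuses me.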