Given $X_1$..$X_n$ and $Y_1$..$Y_n$ drawn from unknown distributions $F(x)$ and $G(x)$ respectively, statistical tests such as the two-sample Kolmogorov-Smirnov, Cramér-von Mises, and Anderson-Darling tests have been devised to test the null hypothesis $\mathcal{H}: F(x) = G(x)$ using various test statistics.
But rather than hypothesis testing, I am more interested in quantifying the probability $\mathcal{P}(X_1..X_n, Y_1..Y_n \mid F(x) = G(x))$, without knowing $F(x)$ or $G(x)$. Is this possible?
The above tests give $\mathcal{P}(Z\ge{}z | F(x) = G(x))$ where $Z$ is the test statistic, but this is not what I want.
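To make the distinction concrete, here is a minimal pure-Python sketch (function names are my own, not from any library) of what those tests actually compute: the two-sample Kolmogorov-Smirnov statistic $Z = \sup_t |F_n(t) - G_m(t)|$, and its null tail probability $\mathcal{P}(Z \ge z \mid F = G)$ estimated by permutation, since the pooled sample is exchangeable under the null.

```python
import random

def ks_statistic(x, y):
    """Two-sample KS statistic: D = sup_t |F_n(t) - G_m(t)|,
    where F_n, G_m are the empirical CDFs of the two samples.
    The sup is attained at one of the pooled sample points."""
    n, m = len(x), len(y)
    d = 0.0
    for t in x + y:
        f = sum(v <= t for v in x) / n  # empirical CDF of the X sample at t
        g = sum(v <= t for v in y) / m  # empirical CDF of the Y sample at t
        d = max(d, abs(f - g))
    return d

def permutation_pvalue(x, y, n_perm=2000, seed=0):
    """Estimate P(Z >= z | F = G): under the null the pooled sample
    is exchangeable, so relabelling the observations simulates the
    null distribution of the test statistic."""
    rng = random.Random(seed)
    z_obs = ks_statistic(x, y)
    pooled = list(x) + list(y)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if ks_statistic(pooled[:len(x)], pooled[len(x):]) >= z_obs:
            count += 1
    return count / n_perm
```

The p-value this returns is a statement about the statistic $Z$ under the null, not about the probability of the data themselves given $F = G$, which is what I am asking about.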
I tried to find how the two-sample Kolmogorov-Smirnov test statistic was derived, but nothing useful came up... All the papers I have found either simply state the proposed test statistic or study its distribution under the null hypothesis.
Alternatively, given the order statistics $X_{(1)}$..$X_{(n)}$ and $Y_{(1)}$..$Y_{(n)}$, is there anything useful we can say about the differences $X_{(i)} - Y_{(i)}$, irrespective of the distributions $F(x)$ and $G(x)$?