
Consider the following two random variables:

$$Z_1=U_1-X_1 \quad\text{and}\quad Z_2=U_2-X_1,$$ where $U_1$ and $U_2$ are two i.i.d. random variables following a general distribution, and $X_1$ is an exponential random variable independent of them. Clearly, $Z_1$ and $Z_2$ are dependent random variables. Now consider the random variable $$Y=\min(Z_1,Z_2).$$ Is there an explicit expression or approximation for the PDF or expected value of $Y$?

User1865345

2 Answers


The random variable $Y$ can also be written as $$Y = \min(Z_1,Z_2) = \min(U_1,U_2) - X_1 = Q-X_1.$$

This splits the problem into two parts:

  • The distribution of the minimum of i.i.d. variables, $Q=\min(U_1,U_2)$, whose cdf follows from the cdf of $U_i$ via $$P(\min(U_1,U_2)> x) = P(U_1 > x)\,P(U_2 > x) = P(U_i > x)^2$$
  • The distribution of the sum $Q + (-X_1)$, which can be found by convolution (see the numerical sketch below this list).
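
To make the two steps concrete, here is a minimal numerical sketch, assuming (purely for illustration, since the question leaves the distribution of $U_i$ open) standard normal $U_i$ and $X_1 \sim \text{Exp}(\lambda = 1)$: build the density of $Q=\min(U_1,U_2)$ from the normal pdf and survival function, then convolve it with the density of $-X_1$ on a grid.

```python
import numpy as np
from scipy import stats

# Illustrative assumptions (not from the question): U_i ~ N(0, 1), X_1 ~ Exp(rate = 1)
lam = 1.0
y = np.linspace(-10.0, 10.0, 2001)     # symmetric grid so the convolution stays aligned
dy = y[1] - y[0]

# Density of Q = min(U_1, U_2): f_Q(q) = 2 f_U(q) * P(U > q)
f_Q = 2 * stats.norm.pdf(y) * stats.norm.sf(y)

# Density of -X_1: lam * exp(lam * t) on t <= 0, zero otherwise
f_negX = np.where(y <= 0, lam * np.exp(lam * y), 0.0)

# Y = Q + (-X_1), so f_Y is the convolution of the two densities
f_Y = np.convolve(f_Q, f_negX, mode="same") * dy

# Sanity checks: total mass ~ 1, and the mean should match
# E[min(U_1, U_2)] - E[X_1] = -1/sqrt(pi) - 1 for these particular choices.
print((f_Y * dy).sum())                # ~ 1 (up to grid truncation)
print((y * f_Y * dy).sum())            # ~ -1/np.sqrt(np.pi) - 1 ~ -1.564
```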

As noted by @YashaswiMohanty, the expectation of $Y$ can sometimes be found without making the probability distribution of $Y$ explicit.
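
For instance, with the decomposition $Y = \min(U_1,U_2) - X_1$ from the other answer, linearity of expectation gives $$E[Y] = E[\min(U_1,U_2)] - E[X_1]$$ whenever both expectations exist, so only the mean of the minimum of two i.i.d. copies of $U$ needs to be computed.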

Assume that all r.v.s are of continuous type, that $X_1$ is independent of $(U_1, U_2)$, and that $X_1$ is exponential with rate $\lambda >0$. We can consider the survival function $\bar{F}_Y(y) := 1 - F_Y(y)$:

$$ \bar{F}_Y(y) = \text{Pr}\{ \min(Z_1,\,Z_2) > y\} =\text{Pr}\{[Z_1 > y] \cap [Z_2 > y] \}. $$ Then, by conditioning on $X_1$, we can use the independence of $U_1$ and $U_2$ (and their independence from $X_1$): \begin{align*} \bar{F}_Y(y) &=\int_0^\infty \text{Pr}\{[Z_1 > y] \cap [Z_2 > y] \, \vert \, X_1 = x_1\} f_{X_1}(x_1) \,\text{d}x_1\\ &= \int_0^\infty \text{Pr}\{[U_1 > y + x_1] \cap [U_2 > y + x_1] \, \vert \, X_1 = x_1\} \, f_{X_1}(x_1) \,\text{d}x_1\\ &= \int_0^\infty \bar{F}_U(y + x_1)^2 \lambda \, e^{-\lambda x_1}\, \text{d}x_1 \end{align*}
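
As a quick numerical check of this formula (again a sketch with illustrative assumptions not in the question: standard normal $U_i$ and $\lambda = 1$), one can evaluate the integral by quadrature and compare it with a Monte Carlo estimate of $\text{Pr}\{Y > y\}$:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

rng = np.random.default_rng(0)
lam = 1.0          # assumed rate of X_1
y0 = -1.0          # point at which to evaluate the survival function

# Quadrature: survival(y0) = int_0^inf sf_U(y0 + x)^2 * lam * exp(-lam * x) dx
surv_quad, _ = quad(lambda x: stats.norm.sf(y0 + x) ** 2 * lam * np.exp(-lam * x),
                    0, np.inf)

# Monte Carlo estimate of Pr{min(U_1 - X_1, U_2 - X_1) > y0}
n = 1_000_000
u1, u2 = rng.standard_normal(n), rng.standard_normal(n)
x1 = rng.exponential(1 / lam, n)
surv_mc = np.mean(np.minimum(u1 - x1, u2 - x1) > y0)

print(surv_quad, surv_mc)   # the two estimates should agree to a few decimals
```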

There are some cases where we can get a closed form expression, for instance if the $U_i$ are exponential with rate $\gamma$, i.e., $\bar{F}_U(u) = e^{-\gamma u}$ for $u > 0$.
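
Carrying out the integral in that case (a sketch; the key point is that $\bar{F}_U(y + x_1) = 1$ whenever $y + x_1 \le 0$, which is why the two branches differ):

$$ \bar{F}_Y(y) = \int_0^\infty \bar{F}_U(y + x_1)^2\, \lambda e^{-\lambda x_1}\, \text{d}x_1 = \begin{cases} \dfrac{\lambda}{\lambda + 2\gamma}\, e^{-2\gamma y}, & y \ge 0,\\[6pt] 1 - \dfrac{2\gamma}{\lambda + 2\gamma}\, e^{\lambda y}, & y < 0, \end{cases} $$

and then, for example, $E[Y] = \int_0^\infty \bar{F}_Y(y)\,\text{d}y - \int_{-\infty}^0 F_Y(y)\,\text{d}y = \frac{1}{2\gamma} - \frac{1}{\lambda}$, consistent with $E[\min(U_1,U_2)] - E[X_1]$.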

Interestingly, this is a simple and efficient way to generate a pair of random variables with tail dependence (see the simulation sketch below).
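
A tiny simulation sketch of that remark (with illustrative parameter choices $\gamma = \lambda = 1$): because of the common shock $X_1$, the lower tails of $Z_1$ and $Z_2$ move together, which shows up as an empirical conditional tail probability that does not vanish as the quantile level shrinks.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, gamma = 1_000_000, 1.0, 1.0     # illustrative parameters

# Z_1 and Z_2 share the common shock X_1
u1 = rng.exponential(1 / gamma, n)
u2 = rng.exponential(1 / gamma, n)
x1 = rng.exponential(1 / lam, n)
z1, z2 = u1 - x1, u2 - x1

# Empirical lower-tail dependence: P(Z_2 <= its q-quantile | Z_1 <= its q-quantile)
for q in (0.05, 0.01, 0.001):
    t1, t2 = np.quantile(z1, q), np.quantile(z2, q)
    cond = np.mean((z1 <= t1) & (z2 <= t2)) / q
    print(q, cond)   # stays roughly constant for small q; under independence it would shrink to ~q
```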

Yves