
Let $X_1,\ldots,X_n$ be iid copies of $X\sim \operatorname{Exponential}(\lambda)$ and $Y_1,\ldots,Y_n$ be iid copies of $Y\sim\operatorname{Bernoulli}(p)$, where the $X_i$ and $Y_i$ are independent of each other.

Given $n,\lambda,p$ and $c$, we consider computing $$ P\left(\sum_{i=1}^nX_iY_i > c \right). $$ Let $S=\sum_{i=1}^nS_i=\sum_{i=1}^nX_iY_i$. I tried to derive its density and then compute this probability by integration, similar to Edmond Levy's approach with the hypo-exponential density. However, I cannot obtain the density of $S_i$; see also here and here for related discussions. Any suggestions?

John Stone
  • $$ \begin{align} & \Pr\left(\sum_{i=1}^nX_iY_i > c \right) = \operatorname E\left( \Pr\left( \sum_{i=1}^n X_i Y_i>c \,\, \Bigg\vert\,\, \sum_{i=1}^n Y_i \right) \right) \\ {} \\ = {} & \sum_{j=0}^n \Pr \left( \sum_{i=1}^n X_i Y_i >c \,\, \Bigg\vert\,\, \sum_{i=1}^n Y_i = j \right) \Pr\left( \sum_{i=1}^n Y_i = j \right) \\ {} \\ = {} & \sum_{j=0}^n \Pr\left( \operatorname{Gamma}(j,\lambda)>c \right) \binom n j p^j (1-p)^{n-j} \\ {} \\ = {} & \sum_{j=0}^n \int_c^\infty \frac 1{\Gamma(j)} ( \lambda u)^{j-1} e^{-\lambda u} (\lambda\,du) \binom n j p^j(1-p)^{n-j} \end{align} $$ $\cdots$ – Michael Hardy Oct 04 '23 at 03:53
  • I might start like that and go on from there. Maybe I'll post something later. – Michael Hardy Oct 04 '23 at 03:54
  • When $\sum Y_i = 0$, I think that $S$ must be defined as zero (empty sum), so the distribution of $S$ has an atom at $S=0$. If so, for $c$ positive and finite, $\Pr(S > c) = 0$. – Yves Oct 04 '23 at 05:17
  • @MichaelHardy Great math formula derivation! – John Stone Oct 04 '23 at 06:53
  • @Yves Tks for your explanation! – John Stone Oct 04 '23 at 06:53
  • @MichaelHardy I would like to mark your answer as the selected one, while it is just a comment... – John Stone Oct 04 '23 at 06:56
    The "internal" integral ($\int_c^{\infty } \lambda \exp (-\lambda u) (\lambda u)^{j-1} , du$) evaluates to $\Gamma (j,c \lambda )$. Plugging that in and finding that the term with $j=0$ is zero, the general result is $\sum _{j=1}^n \frac{p^j \binom{n}{j} (1-p)^{n-j} \Gamma (j,c \lambda )}{\Gamma (j)}$. For $n=1,2,3$ this simplifies to $p e^{-c \lambda }$, $p e^{-c \lambda } (p (c \lambda -1)+2)$, and $\frac{1}{2} p e^{-c \lambda } (p (6 c \lambda +p (c \lambda (c \lambda -4)+2)-6)+6)$, respectively. And, yes, @MichaelHardy did all of the heavy lifting. – JimB Oct 04 '23 at 15:01
  • This is a mixture of Gamma distributions $Ga(k,\lambda)$ with the mixture weights being Binomial $\mathcal B(n,p)$. – Xi'an Oct 04 '23 at 16:23

2 Answers


Here is a less-than-complete answer: \begin{align} & \Pr\left(\sum_{i=1}^nX_iY_i > c \right) = \operatorname E\left( \Pr\left( \sum_{i=1}^n X_i Y_i>c \,\, \Bigg\vert\,\, \sum_{i=1}^n Y_i \right) \right) \\ {} \\ = {} & \sum_{j=0}^n \Pr \left( \sum_{i=1}^n X_i Y_i >c \,\, \Bigg\vert\,\, \sum_{i=1}^n Y_i = j \right) \Pr\left( \sum_{i=1}^n Y_i = j \right) \\ {} \\ = {} & \sum_{j=0}^n \Pr\left( \operatorname{Gamma}(j,\lambda)>c \right) \binom n j p^j (1-p)^{n-j} \\ {} \\ = {} & \sum_{j=0}^n \int_c^\infty \frac 1{\Gamma(j)} ( \lambda u)^{j-1} e^{-\lambda u} (\lambda\,du) \binom n j p^j(1-p)^{n-j} \\ {} \\ = {} & \sum_{j=0}^n \frac 1 {\Gamma(j)} \binom n j p(1-p)^{n-1} \int_c^\infty \left( \frac{\lambda up}{1-p} \right)^{j-1} e^{-\lambda u} (\lambda\, du) \end{align}
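
As a quick numerical sanity check of this sum, a minimal R sketch: $\Pr(\operatorname{Gamma}(j,\lambda)>c)$ is pgamma(c, shape = j, rate = lambda, lower.tail = FALSE) and the Binomial weights are dbinom(j, n, p). The parameter values below are only illustrative (they match the worked example used later in the thread) and are not part of the derivation.

# Numerical check of sum_j P(Gamma(j, lambda) > c) * C(n, j) p^j (1-p)^(n-j).
# Illustrative parameters taken from the worked example further down the thread.
n <- 7; p <- 1/3; lambda <- 2; c <- 1
j <- 1:n   # the j = 0 term contributes nothing when c > 0
sum(dbinom(j, n, p) * pgamma(c, shape = j, rate = lambda, lower.tail = FALSE))
# approximately 0.4786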


To add to @MichaelHardy's answer, the "internal" integral is an upper incomplete gamma function:

$$\int_c^{\infty } \lambda \exp (-\lambda u) (\lambda u)^{j-1} \, du=\Gamma(j,c \lambda)$$

So the desired probability is

$$ \Pr\left(\sum_{i=1}^nX_iY_i > c \right) =\sum_{j=1}^n {\binom{n}{j}}p^j (1-p)^{n-j} \Gamma(j,c \lambda)/\Gamma(j)$$

Note that the sum on the right-hand side starts at $j=1$ rather than $j=0$ because

$$\Pr\left(\sum_{i=1}^n X_i Y_i > c \,\, \Bigg\vert\,\, \sum_{i=1}^n Y_i=0\right)=0$$

If one doesn't have access to an incomplete gamma function, then the following is an equivalent result

$$\ \Pr\left(\sum_{i=1}^nX_iY_i > c \right) =e^{-c \lambda} \sum _{j=1}^n \binom{n}{j} p^j (1-p)^{n-j} \sum _{i=0}^{j-1} \frac{(c \lambda )^i}{i!}$$
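
A minimal R sketch of this elementary version (the helper name exact_prob and the argument name cthr, standing for the threshold $c$, are only illustrative):

# Sketch: P(sum X_i Y_i > c) via the finite double sum above,
# using only elementary functions (no incomplete gamma).
exact_prob <- function(n, p, lambda, cthr) {
  total <- 0
  for (j in 1:n) {
    inner <- sum((cthr * lambda)^(0:(j - 1)) / factorial(0:(j - 1)))  # sum over i = 0, ..., j-1
    total <- total + choose(n, j) * p^j * (1 - p)^(n - j) * inner
  }
  exp(-cthr * lambda) * total
}
exact_prob(7, 1/3, 2, 1)   # should reproduce the 0.478619 check below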

As a partial check suppose $n=7$, $p=1/3$, $\lambda=2$, and $c=1$. The formula gives $\frac{348049}{98415 e^2}\approx 0.478619$. Some R code to perform simulations:

p <- 1/3
c <- 1
lambda <- 2
n <- 7
nsim <- 1000000

set.seed(12345)
# simulate n*nsim (X, Y) pairs, take the products, and sum them within each of the nsim rows
z <- rowSums(matrix(rexp(n*nsim, lambda) * rbinom(n*nsim, 1, p), nrow = nsim))
prob <- length(z[z > c]) / nsim   # empirical estimate of P(S > c)
prob

[1] 0.478865

For small values of $n$ one can construct the following table:

[Table of probabilities for small values of n]
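
One way to reproduce such a table is to sweep the exact formula over a grid of small $n$; a rough sketch (the grid and the fixed $p=1/3$, $\lambda=2$, $c=1$ are only illustrative, not necessarily the settings of the original table):

# Sketch: exact P(S > c) for n = 1, ..., 7 at illustrative parameter values.
probs <- sapply(1:7, function(nn) {
  j <- 1:nn
  sum(dbinom(j, nn, 1/3) * pgamma(1, shape = j, rate = 2, lower.tail = FALSE))
})
data.frame(n = 1:7, prob = round(probs, 6))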

JimB
  • Yes, you're right! I like your R simulation part! Actually, we can use sapply and pgamma in R to compute the exact solution, following the clues from you and Michael Hardy, as 0.4786192. Thanks, @JimB! – John Stone Oct 08 '23 at 14:36