Given two independent random variables $X\sim\Gamma(s,r)$ and $Y\sim\Gamma(t,u)$, what is the distribution of the difference, i.e. $D = X - Y$? I assume that $s$ and $t$ are integers. How can I obtain the skewness of the difference distribution?
Answer
Even if we don't have the distribution in closed form, we can still get at the skewness.
For example:
$\,E(D^3) = E((X-Y)^3)\\ \quad\quad\quad= E(X^3)-3E(X^2Y)+3E(XY^2)-E(Y^3) \\ \quad\quad\quad= E(X^3) - 3E(X^2)E(Y)+3E(X)E(Y^2)-E(Y^3)$
and so on for the lower-order moments. From these, $E[(D-\mu_D)^3]$ and $E[(D-\mu_D)^2]$ can be derived, and from them the skewness of the difference. (A quick numerical check of the expansion above is sketched just below.)
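For instance, here is a minimal sanity check of that third-raw-moment identity; it is not part of the original answer, the parameter values are arbitrary illustrations, and `moment(k)` is scipy's $k$-th raw moment of a frozen distribution:

```python
import numpy as np
from scipy import stats

# Arbitrary illustrative parameters: X ~ Gamma(shape=s, rate=r), Y ~ Gamma(shape=t, rate=u)
s, r, t, u = 3.0, 1.5, 2.0, 0.8
X = stats.gamma(a=s, scale=1/r)
Y = stats.gamma(a=t, scale=1/u)

# Right-hand side: raw moments combined using independence
rhs = (X.moment(3) - 3*X.moment(2)*Y.moment(1)
       + 3*X.moment(1)*Y.moment(2) - Y.moment(3))

# Left-hand side: Monte Carlo estimate of E[(X - Y)^3]
rng = np.random.default_rng(0)
d = X.rvs(10**6, random_state=rng) - Y.rvs(10**6, random_state=rng)

print(rhs)            # exact value from the moment expansion
print(np.mean(d**3))  # simulation estimate; should agree closely
```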
Alternatively,
$E[(D-\mu_D)^3]\\ \quad\quad=E[(X-Y-\{\mu_X-\mu_Y\})^3]\\ \quad\quad=E[(\{X-\mu_X\}-\{Y-\mu_Y\})^3]$
$\quad\quad=E[(X^*-Y^*)^3]\quad$ where $^*$ denotes a centered variable (e.g. $X^*=X-\mu_X$).
$\quad\quad= E(X^{*3}) - 3E(X^{*2})E(Y^{*})+3E(X^{*})E(Y^{*2})-E(Y^{*3})\quad$ as before
$\quad\quad= E(X^{*3}) - 3\text{Var}(X)\cdot 0+3\cdot 0\cdot\text{Var}(Y) -E(Y^{*3})\quad$
$\quad\quad= E(X^{*3}) -E(Y^{*3})\quad$
Similarly, $\text{Var}(D) = \text{Var}(X)+\text{Var}(Y)$.
Or, even more alternatively, we could have just relied on additivity of cumulants to arrive at the same results.
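In more detail: cumulants of independent random variables add, and $\kappa_n(cY)=c^n\kappa_n(Y)$, so with $c=-1$,

$\kappa_n(D)=\kappa_n(X)+\kappa_n(-Y)=\kappa_n(X)+(-1)^n\,\kappa_n(Y)\,.$

Taking $n=2$ and $n=3$ (the second and third cumulants are just the corresponding central moments) gives $\kappa_2(D)=\text{Var}(X)+\text{Var}(Y)$ and $\kappa_3(D)=\mu_3(X)-\mu_3(Y)$, as above.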
Consequently, $\gamma_1(D) = \frac{\mu_3(D)}{\mu_2(D)^{3/2}}= \frac{\mu_3(X)-\mu_3(Y)}{(\text{Var}(X)+\text{Var}(Y))^{3/2}}$
(Additional detail added for the OP):
And of course, $\mu_3(X)=\gamma_1(X)\sigma_X^3$. So we have the skewness of the difference in terms of the skewness and standard deviation of the original variables. So far this is a general result; it doesn't rely on the variables being gamma random variables.
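Written out, that general result is

$\gamma_1(D)=\dfrac{\gamma_1(X)\,\sigma_X^3-\gamma_1(Y)\,\sigma_Y^3}{\left(\sigma_X^2+\sigma_Y^2\right)^{3/2}}\,.$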
After that it's simple substitution: taking $r$ and $u$ to be rate parameters, a gamma with shape $s$ and rate $r$ has $\text{Var}(X)=s/r^2$ and $\gamma_1(X)=2/\sqrt{s}$, hence $\mu_3(X)=2s/r^3$ (and similarly for $Y$), so that $\gamma_1(D)=\dfrac{2s/r^3-2t/u^3}{\left(s/r^2+t/u^2\right)^{3/2}}$. (If $r$ and $u$ are scale parameters instead, replace $1/r$ and $1/u$ by $r$ and $u$.)
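As a sketch of how one might check that final expression numerically (again, not part of the original answer; the shape/rate values are arbitrary, and numpy/scipy are just one convenient choice):

```python
import numpy as np
from scipy import stats

# Arbitrary illustrative parameters: X ~ Gamma(shape=s, rate=r), Y ~ Gamma(shape=t, rate=u)
s, r, t, u = 4.0, 2.0, 3.0, 1.0

# Closed-form pieces for a gamma with shape k and rate b: Var = k/b^2, mu_3 = 2k/b^3
var_d = s/r**2 + t/u**2
mu3_d = 2*s/r**3 - 2*t/u**3
skew_formula = mu3_d / var_d**1.5

# Simulation check (numpy's gamma sampler takes a *scale* parameter, i.e. 1/rate)
rng = np.random.default_rng(1)
d = rng.gamma(shape=s, scale=1/r, size=10**6) - rng.gamma(shape=t, scale=1/u, size=10**6)
print(skew_formula)   # value from the formula above
print(stats.skew(d))  # sample skewness; should agree to a couple of decimal places
```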
Why must you use the MGF? It's doable, but the integral is easy by comparison. But if you really want the kurtosis, you might consider the cumulant generating function instead. – Glen_b May 15 '14 at 18:54