7

Given two correlated variables $X$ and $Y$, with $X\ge0$, $Y\ge0$, each following a gamma distribution with a different shape parameter, i.e., $X\sim\Gamma(a_1,\alpha)$ and $Y\sim\Gamma(a_2,\alpha)$. I understand that the joint PDF $f_{X,Y}(x,y)$ can be obtained by making use of McKay's bivariate gamma distribution, which applies to the case of different shape parameters.

$\mathbf{First}: $

McKay's PDF has the condition $Y>X$ (or vice versa); does that mean that in this case there is no situation (in the event space) such that $Y=X$?

$\mathbf{Second}: $

If I want to obtain the expectation of another function of $X$ and $Y$, denoted by $G(X,Y)$, i.e.,

$\mathbb{E}[G(X,Y)]=\int^\infty_0\int^\infty_0 G(x,y)\,f_{X,Y}(x,y)\,\mathrm{d}x\,\mathrm{d}y$

do I need to average the individual cases where $f_{X,Y}(x,y)$ applies for $x>y$ and for $x<y$?

What about the case of $X=Y$?

Remy
  • 289
  • 2
    This is perhaps not a direct answer to the question, but rather a hint. If you are an R user, you might find it useful that the bivariate McKay distribution is implemented in the VGAM library, using the function bivgamma.mackay. I am using it myself and finding it very useful, although, as noted by Xi'an, it has the strong limitation that Y must be strictly greater than X. –  Sep 11 '13 at 21:33

2 Answers

6

For McKay's distribution, $X$ is a Gamma variate that is the sum of a subset of squares taken from the other variable, $Y$, which is the sum of a larger set of squares, implying that $Y>X$ with probability $1$. See McKay's original paper:

McKay, A. T. (1934) Sampling from batches. Journal of the Royal Statistical Society—Supplement 1: 207–216.

paul
  • 61
5

You can create whole families of joint distributions on $(X,Y)$ such that $X\sim \Gamma(a_1,\alpha)$ and $Y\sim \Gamma(a_2,\beta)$ by using copulas like $$ F_{(X,Y)}(x,y) = \mathbb{P}(X\le x,Y\le y) = \dfrac{F_X(x)F_Y(y)}{1+\varrho (1-F_X(x))(1-F_Y(y))} $$ for $-1\le \varrho \le 1$. The joint distribution is continuous, which means the event $X=Y$ has probability zero.
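
To make the copula construction concrete, here is a minimal sketch (Python with SciPy is my own choice here, and the shapes, rates, and the value of $\varrho$ below are illustrative assumptions, not values from the question). It evaluates the joint CDF above with gamma marginals and checks that letting $y\to\infty$ recovers the marginal CDF of $X$:

```python
# Minimal sketch: joint CDF built from gamma marginals and the copula quoted above.
# All parameter values are illustrative; SciPy's gamma uses scale = 1/rate.
import numpy as np
from scipy.stats import gamma

a1, a2 = 2.0, 5.0          # shape parameters of X and Y
alpha, beta = 1.5, 0.8     # rate parameters of X and Y
rho = 0.5                  # dependence parameter, -1 <= rho <= 1

def joint_cdf(x, y):
    """F_{X,Y}(x,y) = F_X(x) F_Y(y) / (1 + rho (1 - F_X(x)) (1 - F_Y(y)))."""
    Fx = gamma.cdf(x, a1, scale=1.0 / alpha)
    Fy = gamma.cdf(y, a2, scale=1.0 / beta)
    return Fx * Fy / (1.0 + rho * (1.0 - Fx) * (1.0 - Fy))

x = 1.2
print(joint_cdf(x, 1e6))                    # ~ F_X(x): the gamma marginal is preserved
print(gamma.cdf(x, a1, scale=1.0 / alpha))  # compare with the marginal directly
```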

Now, if you have a specific reason for using McKay's bivariate distribution, with pdf $$ f_{(X,Y)}(x,y) = \alpha^{p+q} x^{p-1} (y-x)^{q-1} \exp\{-\alpha y\} / [\Gamma(p) \Gamma(q)]\,\mathbb{I}_{0\le x\le y} \,, $$ which gives $$ X\sim \Gamma(p,\alpha)\,,\quad Y\sim \Gamma(p+q,\alpha) $$ as marginals, you must compute $\mathbb{E}[G(X,Y)]$ as $$ \int_0^\infty \int_0^y G(x,y)\,\alpha^{p+q} x^{p-1} (y-x)^{q-1} \exp\{-\alpha y\} / [\Gamma(p) \Gamma(q)]\,\text{d}x\,\text{d}y\,. $$
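
As an illustration of the constrained integration domain (not part of the original answer), here is a minimal numerical sketch in Python/SciPy, with placeholder values for $p$, $q$, $\alpha$ and the illustrative choice $G(x,y)=xy$; it computes $\mathbb{E}[G(X,Y)]$ by integrating the McKay density over $0<x<y$ only:

```python
# Minimal sketch: E[G(X,Y)] under McKay's bivariate gamma, integrating over 0 < x < y.
# The values of p, q, alpha and the choice of G are illustrative placeholders.
import numpy as np
from scipy.integrate import dblquad
from scipy.special import gamma as gamma_fn

p, q, alpha = 2.0, 3.0, 1.5  # X ~ Gamma(p, alpha), Y ~ Gamma(p+q, alpha)

def mckay_pdf(x, y):
    """McKay density, nonzero only on 0 <= x <= y."""
    if x < 0 or y <= x:
        return 0.0
    return (alpha**(p + q) * x**(p - 1) * (y - x)**(q - 1)
            * np.exp(-alpha * y) / (gamma_fn(p) * gamma_fn(q)))

def G(x, y):
    return x * y  # any integrable function of (x, y)

# dblquad integrates func(inner, outer): here inner = x in (0, y), outer = y in (0, inf)
val, err = dblquad(lambda x, y: G(x, y) * mckay_pdf(x, y),
                   0, np.inf,        # y from 0 to infinity
                   lambda y: 0.0,    # x from 0 ...
                   lambda y: y)      # ... up to y
print(val)  # for G(x,y) = x*y this should be close to p*(p+q+1)/alpha**2
```

For $G(x,y)=xy$ the numerical value can be checked against the closed form $p(p+q+1)/\alpha^2$, which follows from writing $Y=X+Z$ with $Z\sim\Gamma(q,\alpha)$ independent of $X$.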

whuber
  • 322,774
Xi'an
  • 105,342
  • Thanks @Xi'an. OK, so if I wanted to use McKay's distribution to get $\mathbb{E}[G(X,Y)]$, I just need to average both cases, when $X>Y$ and when $X<Y$? – Remy Feb 13 '12 at 14:55
  • Can you please tell me about a good textbook that says more about using copulas? – Remy Feb 13 '12 at 15:02
  • 2
    You do not need a book to use copulas like the one above. Once you know a function that turns two uniforms into a joint distribution with uniform marginals, you are all set! A potential reference though is Introduction to copulas if you want to learn more. – Xi'an Feb 13 '12 at 15:25
  • 2
    McKay's distribution is a different approach to the problem since, indeed, $X<Y$ with probability $1$. If you compute the expectation, you thus need to constrain the integration domain to $X<Y$. The variables $X$ and $Y$ are not exchangeable. (Note that $X=Y$ is still a zero probability event.) – Xi'an Feb 13 '12 at 15:28
  • 1
    Thanks Xi'an, can you please explain more about constraining the integration domain? What will the total expectation $\mathbb{E}[G(X,Y)]$ look like? – Remy Feb 13 '12 at 16:08