
Carlos Cinelli, in this great post https://stats.stackexchange.com/a/384460/198058, gives an example of 3 different Data Generating Processes/Causal Models giving rise to the same joint distribution of $(X,Y)$. Below is a snapshot of the 3 models taken from his post.

[Figure: the three causal models from Cinelli's post]

I was able to work out that in each of the 3 models the marginals are $X \sim N(0,1)$ and $Y \sim N(0,1)$, but there is still quite a bit of work left to show that $(X,Y)$ has the same joint distribution in all 3 models. In model 1, $P(X,Y)=P(X)\,P(Y|X)$, but what is $P(Y|X)$? In model 2, $P(X,Y)=P(Y)\,P(X|Y)$, but what is $P(X|Y)$? Model 3 is even more complicated. I am hoping someone can explain the steps needed to show that these 3 models imply the same joint distribution.

  • There's very little work needed apart from the (straightforward) algebra, because you have access to the simple formulas for linear combinations of jointly distributed Normal variables. The trick is not to work with conditional distributions: just write down the mean and covariance matrix. – whuber Mar 22 '23 at 21:56
  • Thank you for the hint. I will give it a try. – ColorStatistics Mar 22 '23 at 22:01
  • If I did this right, once we've shown that $P(X)=P(Y)=N(0,1)$ in each of the 3 models, using whuber's suggestion: $aX+bY \sim N\big(aE(X)+bE(Y),\, a^2Var(X)+b^2Var(Y)+2abCov(X,Y)\big) = N\big(0,\, a^2+b^2+2abCov(X,Y)\big)$ – ColorStatistics Mar 22 '23 at 23:01

1 Answer


All of these DGPs generate a standard bivariate Gaussian with:

$$ \begin{pmatrix} X \\ Y \end{pmatrix}\sim N\left(\begin{pmatrix} 0 \\ 0 \end{pmatrix},\begin{pmatrix} 1 & \sigma_{xy} \\ \sigma_{xy} & 1 \end{pmatrix}\right). $$

You can verify that by first recognizing that $X$ and $Y$ are linear combinations of Gaussians and thus jointly Gaussian, and then simply computing the means, variances and covariance for each case.

For the means, it is easy to verify that $E[Y]= E[X] = 0$ in each case. For the variances and covariances, make use of the fact that the disturbances $U$ are mutually independent.

Case 1

$$ Var(X) = Var(U_x) = 1\\ Var(Y) = \sigma_{yx}^2Var(X) + Var(U_y )= \sigma_{yx}^2 + (1-\sigma_{yx}^2) = 1\\ Cov(Y, X) = \sigma_{yx}Var(X) = \sigma_{yx} $$
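As a quick sanity check, a Monte Carlo sketch of Case 1 (the value $\sigma_{yx}=0.5$ is an arbitrary choice for illustration, not from the post) should reproduce the unit variances and the covariance $\sigma_{yx}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
s = 0.5  # arbitrary choice of sigma_yx

# Case 1: X := U_x,  Y := s*X + U_y, with Var(U_y) chosen as 1 - s^2
X = rng.standard_normal(n)
Y = s * X + np.sqrt(1 - s**2) * rng.standard_normal(n)

# Empirical moments: Var(X) and Var(Y) should be close to 1,
# Cov(X, Y) close to s = 0.5
print(np.var(X), np.var(Y), np.cov(X, Y)[0, 1])
```

With a million draws the empirical moments match the derivation above to about two decimal places.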

Case 2

Similar to Case 1, just with the roles of $X$ and $Y$ reversed.

Case 3

Note that $Var(Z)=1$. Thus,

$$ Var(X) = \alpha^2Var(Z) + Var(U_x) = \alpha^2 + (1-\alpha^2) = 1\\ Var(Y)= \beta^2 Var(Z) + Var(U_y) = \beta^2 + (1-\beta^2) = 1\\ Cov(Y, X)= Cov(\beta Z + U_y,\alpha Z + U_x) = \alpha \beta \times Var(Z)= \alpha\beta = \sigma_{xy} $$
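To see all three DGPs produce the same joint distribution, here is a minimal simulation sketch. The values $\sigma_{xy}=0.5$ and $\alpha=0.8$ (with $\beta = \sigma_{xy}/\alpha$ so that $\alpha\beta=\sigma_{xy}$) are arbitrary illustration choices, not from the post:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
s = 0.5                 # target sigma_xy (arbitrary)
a = 0.8                 # Case 3: alpha (arbitrary)
b = s / a               # Case 3: beta, so that alpha*beta = sigma_xy

# Case 1: X causes Y
X1 = rng.standard_normal(n)
Y1 = s * X1 + np.sqrt(1 - s**2) * rng.standard_normal(n)

# Case 2: Y causes X (roles reversed)
Y2 = rng.standard_normal(n)
X2 = s * Y2 + np.sqrt(1 - s**2) * rng.standard_normal(n)

# Case 3: common cause Z
Z = rng.standard_normal(n)
X3 = a * Z + np.sqrt(1 - a**2) * rng.standard_normal(n)
Y3 = b * Z + np.sqrt(1 - b**2) * rng.standard_normal(n)

# Each empirical covariance matrix should be close to [[1, 0.5], [0.5, 1]]
for X, Y in [(X1, Y1), (X2, Y2), (X3, Y3)]:
    print(np.round(np.cov(X, Y), 2))
```

All three empirical covariance matrices agree (up to sampling noise), matching the common $N(0, \Sigma)$ derived above.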