I have two independent 2-dimensional normal random vectors with the same mean vector and different covariance matrices, let's say $X_1 \sim N_2(\mu, C)$ and $X_2 \sim N_2(\mu, C')$.
How can I prove that the random vector $X_1 - X_2$ is itself normal, with the zero vector as its mean (that part is easy) and with covariance matrix $C + C'$?
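Before proving it, I checked the claim numerically. Here is a small sanity-check sketch (the specific $\mu$, $C$, $C'$ values are just illustrative choices, not part of the problem): the sample covariance of $X_1 - X_2$ should be close to $C + C'$ and the sample mean close to the zero vector.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (any valid mean and positive-definite matrices work)
mu = np.array([1.0, -2.0])
C = np.array([[2.0, 0.5],
              [0.5, 1.0]])
Cp = np.array([[1.0, -0.3],
               [-0.3, 3.0]])

n = 200_000
X1 = rng.multivariate_normal(mu, C, size=n)
X2 = rng.multivariate_normal(mu, Cp, size=n)
D = X1 - X2  # the difference vector under study

# Sample mean should be near the zero vector,
# and the sample covariance near C + C'.
print(D.mean(axis=0))
print(np.cov(D, rowvar=False))
```

With $n$ this large, both estimates agree with the claimed values to within a couple of decimal places.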
I suppose using the convolution Mathew mentioned is equivalent to working with moment-generating functions (showing that the moment-generating function of the difference of two 2-dimensional normal vectors is the moment-generating function of the desired distribution), and I have obtained the result through this approach.
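For reference, the MGF computation goes roughly like this, writing $M_X(t) = E[e^{t^\top X}]$ and using independence of $X_1$ and $X_2$:
$$
M_{X_1 - X_2}(t) = M_{X_1}(t)\,M_{X_2}(-t)
= e^{t^\top \mu + \frac{1}{2} t^\top C t}\; e^{-t^\top \mu + \frac{1}{2} t^\top C' t}
= e^{\frac{1}{2} t^\top (C + C') t},
$$
which is exactly the MGF of $N_2(0,\, C + C')$, so the result follows by uniqueness of MGFs.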
However, I think it is easier to handle the issue with Dilip's approach (now I realise what the key of the problem was), since it does not require any auxiliary functions, only some basic algebra results, which makes it more intuitive.
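If I have understood the algebraic approach correctly (this is my reading of it, not a quotation), it rests on the characterization that a random vector $Y$ is bivariate normal iff $a^\top Y$ is univariate normal for every $a \in \mathbb{R}^2$. For any fixed $a$,
$$
a^\top (X_1 - X_2) = a^\top X_1 - a^\top X_2
$$
is a difference of independent univariate normals, hence univariate normal with mean $a^\top \mu - a^\top \mu = 0$ and variance
$$
a^\top C a + a^\top C' a = a^\top (C + C')\, a,
$$
which identifies the distribution of $X_1 - X_2$ as $N_2(0,\, C + C')$.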
An additional question: which approach do you think is more convenient for the write-up in my end-of-degree project, the first or the second?