16

I have a question about calculating a marginal density from two normal distributions. I have random variables $X\mid M \sim \text{N}(M,\sigma^2)$ and $M \sim \text{N}(\theta, s^2)$, with conditional and marginal densities given by:

$$\begin{equation} \begin{aligned} f(x|m) &= \frac{1}{\sigma \sqrt{2\pi}} \cdot \exp \Big( -\frac{1}{2} \Big( \frac{x-m}{\sigma} \Big)^2 \Big), \\[10pt] f(m) &= \frac{1}{s \sqrt{2\pi}} \cdot \exp \Big( - \frac{1}{2} \Big( \frac{m-\theta}{s} \Big)^2 \Big). \end{aligned} \end{equation}$$

I would like to know the marginal distribution of $X$. I have multiplied the above densities to form the joint density, but I cannot successfully integrate the result to get the marginal density of interest. My intuition tells me that this is a normal distribution with different parameters, but I can't prove it.

  • In second line, $M\sim$ should be $m\sim$. Cannot edit for 1 character. – user158565 Oct 16 '18 at 03:35
  • Tangentially on-topic, but you might want to read about conjugate priors - the normal distribution is the conjugate prior for the mean of a normal distribution. – stuart10 Oct 16 '18 at 08:10

5 Answers

13

Your intuition is correct: the marginal distribution of a normal random variable with a normal mean is indeed normal. To see this, we first rewrite the joint density as a product of normal densities by completing the square:

$$\begin{equation} \begin{aligned} f(x,m) &= f(x|m) f(m) \\[10pt] &= \frac{1}{2\pi \sigma s} \cdot \exp \Big( -\frac{1}{2} \Big[ \Big( \frac{x-m}{\sigma} \Big)^2 + \Big( \frac{m-\theta}{s} \Big)^2 \Big] \Big) \\[10pt] &= \frac{1}{2\pi \sigma s} \cdot \exp \Big( -\frac{1}{2} \Big[ \Big( \frac{1}{\sigma^2}+\frac{1}{s^2} \Big) m^2 -2 \Big( \frac{x}{\sigma^2} + \frac{\theta}{s^2} \Big) m + \Big( \frac{x^2}{\sigma^2} + \frac{\theta^2}{s^2} \Big) \Big] \Big) \\[10pt] &= \frac{1}{2\pi \sigma s} \cdot \exp \Big( -\frac{1}{2 \sigma^2 s^2} \Big[ (s^2+\sigma^2) m^2 -2 (x s^2+ \theta \sigma^2) m + (x^2 s^2+ \theta^2 \sigma^2) \Big] \Big) \\[10pt] &= \frac{1}{2\pi \sigma s} \cdot \exp \Big( - \frac{s^2+\sigma^2}{2 \sigma^2 s^2} \Big[ m^2 -2 \cdot \frac{x s^2 + \theta \sigma^2}{s^2+\sigma^2} \cdot m + \frac{x^2 s^2 + \theta^2 \sigma^2}{s^2+\sigma^2} \Big] \Big) \\[10pt] &= \frac{1}{2\pi \sigma s} \cdot \exp \Big( - \frac{s^2+\sigma^2}{2 \sigma^2 s^2} \Big( m - \frac{x s^2 + \theta \sigma^2}{s^2+\sigma^2} \Big)^2 \Big) \\[6pt] &\quad \quad \quad \text{ } \times \exp \Big( \frac{(x s^2 + \theta \sigma^2)^2}{2 \sigma^2 s^2 (s^2+\sigma^2)} - \frac{x^2 s^2 + \theta^2 \sigma^2}{2 \sigma^2 s^2} \Big) \\[10pt] &= \frac{1}{2\pi \sigma s} \cdot \exp \Big( - \frac{s^2+\sigma^2}{2 \sigma^2 s^2} \Big( m - \frac{x s^2 + \theta \sigma^2}{s^2+\sigma^2} \Big)^2 \Big) \cdot \exp \Big( -\frac{1}{2} \frac{(x-\theta)^2}{s^2+\sigma^2} \Big) \\[10pt] &= \sqrt{\frac{s^2+\sigma^2}{2\pi \sigma^2 s^2}} \cdot \exp \Big( - \frac{s^2+\sigma^2}{2 \sigma^2 s^2} \Big( m - \frac{x s^2 + \theta \sigma^2}{s^2+\sigma^2} \Big)^2 \Big) \\[6pt] &\quad \times \sqrt{\frac{1}{2\pi (s^2+\sigma^2)}} \cdot \exp \Big( -\frac{1}{2} \frac{(x-\theta)^2}{s^2+\sigma^2} \Big) \\[10pt] &= \text{N} \Big( m \Big| \frac{xs^2+\theta\sigma^2}{s^2+\sigma^2}, \frac{s^2 \sigma^2}{s^2+\sigma^2} \Big) \cdot \text{N}(x|\theta, s^2+\sigma^2). \end{aligned} \end{equation}$$

We then integrate out $m$ to obtain the marginal density $f(x) = \text{N}(x|\theta, s^2+\sigma^2)$. From this exercise we see that $X \sim \text{N}(\theta, s^2+\sigma^2)$.
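(Editorial aside, not part of the original answer.) If you want a quick numerical sanity check of the final factorization, a short Python sketch like the one below works; the parameter values and the evaluation points are arbitrary illustrative choices.

```python
import numpy as np
from scipy.stats import norm

# Arbitrary illustrative parameter values (not from the question)
theta, s, sigma = 1.5, 2.0, 0.7

rng = np.random.default_rng(0)
for x, m in rng.uniform(-5, 5, size=(5, 2)):
    # Left-hand side: f(x | m) * f(m)
    joint = norm.pdf(x, loc=m, scale=sigma) * norm.pdf(m, loc=theta, scale=s)

    # Right-hand side: N(m | posterior mean, posterior variance) * N(x | theta, s^2 + sigma^2)
    post_mean = (x * s**2 + theta * sigma**2) / (s**2 + sigma**2)
    post_var = (s**2 * sigma**2) / (s**2 + sigma**2)
    factored = (norm.pdf(m, loc=post_mean, scale=np.sqrt(post_var))
                * norm.pdf(x, loc=theta, scale=np.sqrt(s**2 + sigma**2)))

    # The two expressions agree up to floating-point error
    assert np.isclose(joint, factored)
```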

Ben
  • (+1), would this have been simpler using the moment generating function (MGF)? – SecretAgentMan Oct 16 '18 at 05:01
  • 2
    @SecretAgentMan: Thanks for the up-votes. The OP said he was having trouble integrating the joint density, so that is the method I have used. – Ben Oct 16 '18 at 05:03
  • 1
    @Ben, poor word choice on my part. I didn't mean to criticize you. My comment was for anyone who looks at this in the future. Agreed, the OP asked about the density. Future readers should know that an alternative approach with the MGF might be simpler. Again, I wasn't trying to criticize you, your approach, or the OP. – SecretAgentMan Oct 16 '18 at 16:15
  • @SecretAgentMan: That is totally fine - I didn't take it as a criticism. Since you have another method in mind, perhaps the best thing would be to give it as an alternative answer? (That way people can see exactly what you have in mind.) – Ben Oct 16 '18 at 22:28
  • @Ben, agree. If I can figure it out, I will. I'm not all that good but my intuition tells me that would work. That's probably dangerous. Will post if I can get it. – SecretAgentMan Oct 17 '18 at 05:45
  • @DavidRefaeli: Sorry to have made you waste half-an-hour! Thanks for pointing that out --- I have edited to correct. – Ben Jul 27 '20 at 10:01
  • 1
    @Ben-ReinstateMonica no problem, you saved me the 200 hours it would have taken me to try this alone. So you're still in a positive balance ;-) – Maverick Meerkat Jul 27 '20 at 10:12
9

Let $$X = m +\epsilon$$ where $m \sim N(\theta,s^2)$ and $\epsilon \sim N(0,\sigma^2)$ and they are independent.

Then $X|m$ and $m$ follow the distributions specified in the question.

$E(X)=E(m) = \theta$

$Var(X) = Var(m) +Var(\epsilon) = s^2+\sigma^2$

According to "The sum of random variables following Normal distribution follows Normal distribution", and the normal distribution is determined by mean and variance, we have

$$X \sim N(\theta, s^2+\sigma^2)$$
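(Editorial aside, not part of the original answer.) A small simulation sketch can make the construction concrete; the parameter values below are arbitrary. Sampling $X$ hierarchically and sampling it directly as $m + \epsilon$ give the same distribution, with mean $\theta$ and variance $s^2 + \sigma^2$.

```python
import numpy as np

# Arbitrary illustrative parameter values
theta, s, sigma = 1.5, 2.0, 0.7
n = 1_000_000

rng = np.random.default_rng(1)

# Hierarchical sampling: draw m, then X | m ~ N(m, sigma^2)
m = rng.normal(theta, s, size=n)
x_hier = rng.normal(m, sigma)

# Direct construction: X = m + epsilon with independent epsilon ~ N(0, sigma^2)
x_sum = m + rng.normal(0.0, sigma, size=n)

for x in (x_hier, x_sum):
    print(x.mean(), x.var())   # both close to theta and s^2 + sigma^2
print(theta, s**2 + sigma**2)  # theoretical values
```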

user158565
  • Ok thanks, I understand your answer but I would like to have a mathematical proof of your first line X=m+ϵ. Is it possible? – Mangnier Loïc Oct 16 '18 at 02:29
  • In mathematical proofs, you sometimes need to construct something. For example, adding $+x - x$, or dividing by $a$ and then multiplying by $a$... So here we need to construct an $X$ such that 1) the original properties are kept and 2) it is convenient for the proof. Do you mean to prove "Then $X|m$ and $m$ follow the distributions specified in the question."? – user158565 Oct 16 '18 at 02:38
  • Thank you, I understand your POV. It's a smart way to answer my question. – Mangnier Loïc Oct 16 '18 at 11:48
  • sorry ... point of view – Mangnier Loïc Oct 17 '18 at 19:24
5

You have

\begin{align} X\mid M & \sim \operatorname N(M, \sigma^2) \\[6pt] M & \sim \operatorname N(\theta,s^2) \end{align}

Consequently

$$(X-M)\mid M\sim\operatorname N(0,\sigma^2).$$

Observe that the conditional distribution of $X-M$ given $M$ does not depend on $M,$ since no $\text{“}M\text{”}$ appears in $\text{“}\operatorname N(0,\sigma^2).\text{”}$

Two consequences follow:

  • $X-M$ is independent of $M,$ and
  • the marginal (i.e. “unconditional”) distribution of $X-M$ is $\operatorname N(0,\sigma^2).$

Thus $X-M$ and $M$ are normally distributed and independent of each other.

Therefore their sum, $X,$ is normally distributed and its expectation and variance are the respective sums of those of $X-M$ and $M$.

So $X\sim\operatorname N(\theta, s^2+\sigma^2).$

(This omits any proof that the sum of independent normals is normal. For that, you can compute a convolution.)
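(Editorial aside, not part of the original answer.) For readers who want to see the convolution step concretely, here is a rough numerical sketch; the parameter values and grid are arbitrary choices. Convolving the densities of $M$ and $X - M$ on a grid reproduces the $\operatorname N(\theta, s^2+\sigma^2)$ density.

```python
import numpy as np
from scipy.stats import norm

# Arbitrary illustrative parameter values
theta, s, sigma = 1.5, 2.0, 0.7

# Common grid (odd number of points so np.convolve's "same" mode stays aligned)
z = np.linspace(-20.0, 20.0, 4001)
dz = z[1] - z[0]

# Densities of M ~ N(theta, s^2) and X - M ~ N(0, sigma^2)
f_m = norm.pdf(z, loc=theta, scale=s)
f_eps = norm.pdf(z, loc=0.0, scale=sigma)

# Discrete convolution approximates the density of the sum X = M + (X - M)
f_sum = np.convolve(f_m, f_eps, mode="same") * dz

# Compare with the claimed N(theta, s^2 + sigma^2) density
f_target = norm.pdf(z, loc=theta, scale=np.sqrt(s**2 + sigma**2))
print(np.max(np.abs(f_sum - f_target)))  # close to zero
```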

  • This sounds a little circular, unless I missed something: doesn't your first consequence require you initially to show $(X-M,M)$ has a bivariate Normal distribution? – whuber Jan 24 '23 at 21:39
  • @whuber : How so? If you condition on $M$, you treat $M$ as a constant in that context, and when a constant is added to a random variable, the same constant is added to its expected value, and its variance remains the same. And if a constant is added to a normally distributed random variable, the sum is normally distributed. (And notice my parenthetical comment at the end.) – Michael Hardy Jan 26 '23 at 01:50
  • From that you can conclude the marginal is a mixture of Normals, but I don't see why bivariate Normality is immediately implied. Maybe I'm missing something obvious. – whuber Jan 26 '23 at 15:21
  • @whuber : The marginal is a sum of independent normals. As I said, I did not include a proof that a sum of independent normals is normal, but said it can be done by computing a convolution. – Michael Hardy Jan 26 '23 at 19:00
4

Here's a solution using moment generating functions, as suggested by @SecretAgentMan, that also ties in with the very slick answer provided by @user158565. If you like, you can view this as an (overly) rigorous justification of the decomposition provided by @user158565.

Let $M \sim N(\theta, s^2)$ and $X|M \sim N(M,\sigma^2)$. We are asked to find the unconditional distribution of $X$. To this end, define $\varepsilon \equiv X - M$. We show that $\varepsilon \sim N(0, \sigma^2)$ by calculating its moment generating function (mgf). We have \begin{align*} \mathbb{E}\left[e^{t\varepsilon}\right] &= \mathbb{E}\left[\exp\left\{t(X-M)\right\}\right] = \mathbb{E}\left[\mathbb{E}\left(\left. e^{tX}e^{-tM}\right|M\right) \right]\\ &= \mathbb{E}\left[e^{-tM}\mathbb{E}\left(\left. e^{tX}\right|M\right) \right] = \mathbb{E}\left[ e^{-tM} \exp\left\{tM + \frac{1}{2}\sigma^2 t^2 \right\}\right]\\ &= \mathbb{E}\left[ \exp\left\{\frac{1}{2}\sigma^2 t^2 \right\}\right] = \exp\left\{\frac{1}{2}\sigma^2 t^2 \right\} \end{align*} using iterated expectations and the fact that $X|M$ is a normal random variable with mean $M$ and variance $\sigma^2$, so that its moment generating function is $\exp\left\{ tM + \frac{1}{2}\sigma^2 t^2 \right\}$. We recognize $\mathbb{E}[e^{t\varepsilon}]$ as the mgf of a normal random variable, hence $\varepsilon \sim N(0,\sigma^2)$.

Next, we show that $M$ and $\varepsilon$ are independent by showing that their joint mgf equals the product of the respective marginal mgfs. Again using iterated expectations and the mgf of $X|M$, we have \begin{align*} \mathbb{E}\left[ \exp\left\{ t_1 M + t_2 \varepsilon \right\} \right] &= \mathbb{E}\left[ \exp\left\{ t_1 M + t_2(X -M) \right\} \right] = \mathbb{E}\left[ \exp\left\{(t_1 - t_2)M + t_2 X\right\} \right]\\ &= \mathbb{E}\left[ \exp\left\{(t_1 - t_2)M\right\} \mathbb{E}\left(\left. e^{t_2 X}\right|M \right) \right]\\ &= \mathbb{E}\left[ \exp\left\{(t_1 - t_2)M\right\} \exp\left\{ t_2 M + \frac{1}{2}\sigma^2 t_2^2 \right\} \right]\\ &= \mathbb{E}\left[ \exp\left\{ t_1 M + \frac{1}{2}\sigma^2 t_2^2 \right\} \right]\\ &= \mathbb{E}\left[e^{t_1 M}\right] \exp\left\{\frac{1}{2}\sigma^2 t_2^2 \right\} \\ &= \mathbb{E}\left[e^{t_1 M}\right] \mathbb{E}\left[ e^{t_2 \varepsilon} \right] \end{align*} as claimed.

We have shown that $X = M + \varepsilon$ where $\varepsilon \sim N(0, \sigma^2)$ independently of $M \sim N(\theta, s^2)$. It follows that $X \sim N(\theta, s^2 + \sigma^2)$.

inhuretnakht
0

Finally I found an easier solution to my problem using the MGF, without too much calculation.

Assume we have random variables $$X_1, X_2, \ldots, X_n$$ following some distribution.

The joint MGF of these variables is $$ M(t_1,t_2,\ldots,t_n) = E[\exp(t_1X_1 + t_2X_2 + \cdots + t_nX_n)]. $$

In a similar way we can define moments conditionally, for example:

$$ E[X\mid M]=\int xf(x\mid M) \, dx. $$

Using this together with iterated expectations,
$$ E[e^{tX}] = E\big[E(e^{tX}\mid M)\big] = E\big[e^{tM + \frac{1}{2}\sigma^2 t^2}\big] = e^{t\theta + \frac{1}{2}s^2 t^2} \, e^{\frac{1}{2}\sigma^2 t^2} = e^{t\theta + \frac{1}{2}(s^2+\sigma^2)t^2}, $$
which is the MGF of a normal random variable. Hence we can conclude that $$ X \sim N(\theta, s^2+ \sigma^2).$$