
Consider two ways of performing a Bayesian update on two observations $x_1$ and $x_2$, which are assumed conditionally independent and identically distributed given the parameter $\theta$. In case (a), one updates with respect to $x_1$ first and then uses the resulting posterior as the prior for the update with respect to $x_2$.

In case (b), one updates on both observations at once.

Case (a) leads to the posterior $$p(\theta | x_1,x_2) = \frac{p(\theta)p(x_1 | \theta)}{p(x_1)p(x_2)} \times p(x_2 | \theta)$$

Case (b) leads to the posterior $$p(\theta | x_1, x_2) = \frac{p(\theta)p(x_1 | \theta)p(x_2 | \theta)}{p(x_1,x_2)}$$

Integrate both sides of both posteriors over $\theta$: each integral equals 1, so the two integrals are equal. The integrands share the same numerator $p(\theta)p(x_1|\theta)p(x_2|\theta)$, so the denominators must agree: $$p(x_1,x_2) = p(x_1)p(x_2)!!$$

That seems very strange to me. Should $x_1$ and $x_2$ really be independent marginally, i.e., without conditioning on the parameters?
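For what it is worth, a quick numerical sketch (with a uniform prior and Bernoulli observations, chosen here only for illustration) suggests they should not be independent marginally:

```python
# Toy model: theta ~ Uniform(0,1) (a Beta(1,1) prior, chosen only for
# illustration), x_i | theta ~ Bernoulli(theta), iid given theta.
# Then p(x1=1) = ∫ theta dtheta = 1/2, while
# p(x1=1, x2=1) = ∫ theta^2 dtheta = 1/3, not (1/2)^2 = 1/4.
n = 100_000
grid = [(i + 0.5) / n for i in range(n)]   # midpoint rule on (0, 1)
p_x1 = sum(grid) / n                       # ≈ 1/2
p_joint = sum(t * t for t in grid) / n     # ≈ 1/3
print(p_x1 * p_x1, p_joint)                # the marginals do not factor
```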

1 Answer


This is one of the first results I give in my Bayesian Analysis class. You are confused by the notation: using the same symbol $p$ everywhere is a source of this confusion, so let me introduce $\pi(\cdot)$ for the prior, $p_1(x_1|\theta)$ for the density of $X_1$, $p_{2|1}(x_2|\theta,x_1)$ for the conditional density of $X_2$ given $X_1=x_1$, and $p_{12}(x_1,x_2|\theta)$ for the joint density of $(X_1,X_2)$.

  1. In a sequential update of the information on $\theta$,$$\pi(\theta|x_1)= \frac{\pi(\theta)p_1(x_1 | \theta)}{m_1(x_1)}$$and the second update on $\theta$ is\begin{align*}\pi_{x_1}(\theta|x_2) &= \frac{\pi(\theta|x_1)p_{2|1}(x_2|\theta,x_1)}{m_{2|1}(x_2|x_1)} \\ &=\frac{\pi(\theta)p_1(x_1 | \theta)}{m_1(x_1)}\frac{p_{2|1}(x_2|\theta,x_1)}{m_{2|1}(x_2|x_1)}\\ &= \frac{\pi(\theta)p_1(x_1 | \theta)}{m_1(x_1)}\frac{p_{2|1}(x_2|\theta,x_1)}{\int \frac{\pi(\theta)p_1(x_1|\theta)}{m_1(x_1)}p_{2|1}(x_2|\theta,x_1)\text{d}\theta} \\&= \frac{\pi(\theta)p_1(x_1 | \theta)p_{2|1}(x_2|\theta,x_1)}{\int \pi(\theta)p_1(x_1|\theta) p_{2|1}(x_2|\theta,x_1)\text{d}\theta}\end{align*}
  2. In a joint update, the posterior of $\theta$ is $$\pi(\theta|x_1,x_2)=\frac{\pi(\theta)p_{12}(x_1,x_2|\theta)}{\int \pi(\theta)p_{12}(x_1,x_2|\theta)\text{d}\theta}=\frac{\pi(\theta)p_1(x_1 | \theta)p_{2|1}(x_2|\theta,x_1)}{\int \pi(\theta)p_1(x_1|\theta) p_{2|1}(x_2|\theta,x_1)\text{d}\theta}$$

Therefore both expressions are the same, with no assumption on the dependence between the two variables. The side result is that $$m_1(x_1)\times m_{2|1}(x_2|x_1)=m_{12}(x_1,x_2)$$ Note that this is not $m_1(x_1)\times m_2(x_2)$: the error in the question's case (a) is to normalise the second update by $p(x_2)$ rather than by the conditional marginal $m_{2|1}(x_2|x_1)$.
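To illustrate, here is a small numerical sketch of both updates in a discrete toy model (the three-point support and prior weights are hypothetical choices; the observations are iid given $\theta$, so $p_{2|1}(x_2|\theta,x_1)=p_1(x_2|\theta)$):

```python
# Toy model: theta takes three values under a (hypothetical) discrete prior,
# x_i | theta ~ Bernoulli(theta), iid given theta.
thetas = [0.2, 0.5, 0.8]
prior = [0.3, 0.4, 0.3]

def lik(x, t):      # Bernoulli likelihood p(x | theta)
    return t if x == 1 else 1 - t

def update(pi, x):  # one Bayes update; returns (posterior, marginal m(x))
    unnorm = [p * lik(x, t) for p, t in zip(pi, thetas)]
    m = sum(unnorm)
    return [u / m for u in unnorm], m

x1, x2 = 1, 0

# Sequential: update on x1, then use that posterior as the prior for x2.
post1, m1 = update(prior, x1)
post_seq, m2_given_1 = update(post1, x2)

# Joint: update once on (x1, x2).
unnorm = [p * lik(x1, t) * lik(x2, t) for p, t in zip(prior, thetas)]
m12 = sum(unnorm)
post_joint = [u / m12 for u in unnorm]

print(post_seq)              # same posterior either way
print(post_joint)
print(m1 * m2_given_1, m12)  # side result: m1(x1) m_{2|1}(x2|x1) = m12(x1,x2)
```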

Xi'an
    Thanks. I think there is a typo in the definition of $\pi(\theta | x_1,x_2)$: in the second equality, inside the integral, $x_1$ appears on both sides of the $|$. – bayesianlyconfused Jan 25 '15 at 22:53