
I'm doing some homework for self-study in probabilistic graphical models, and this question has me stumped: if A ⊥ B | C and A ⊥ C | B, does it follow that A ⊥ B and A ⊥ C?

I'm pretty sure the answer is no. But I don't know how to prove it.

So far, I have:

(1) A ⊥ B | C ==> P(A, B | C) = P(A | C) P(B | C)
(2) A ⊥ C | B ==> P(A, C | B) = P(A | B) P(C | B)
(3) P(A, B, C) = P(A, B | C) P(C)
(4) P(A, B, C) = P(A, C | B) P(B)
(5) combining (1) and (3), P(A, B, C) = P(A | C) P(B | C) P(C) = P(A | C) P(B , C)
(6) combining (2) and (4), P(A, B, C) = P(A | B) P(C | B) P(B) = P(A | B) P(B , C)
(7) from (5) and (6), assuming P(B, C) > 0: P(A | C) = P(A | B)

Given the last statement, it seems that in order to avoid A ⊥ B and A ⊥ C, it must be true that either (a) everything is independent or (b) B = C. I'm having a hard time turning that into a proof, though, and I'm not sure what kind of counterexample to look for.
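As a sanity check on steps (1)-(7), here is a small numeric experiment (the joint distribution and the helper `marg` below are my own; I make both premises hold by taking A independent of the pair (B, C), which implies the two conditional independences). It confirms the conclusion of step (7), P(A | B) = P(A | C):

```python
from itertools import product

# A concrete joint P(a, b, c) satisfying A ⊥ B | C and A ⊥ C | B:
# take A independent of (B, C), with B and C dependent (numbers are my own).
p_a = {1: 0.3, 0: 0.7}
q_bc = {(1, 1): 0.4, (1, 0): 0.1, (0, 1): 0.2, (0, 0): 0.3}
joint = {(a, b, c): p_a[a] * q_bc[(b, c)]
         for a, b, c in product([0, 1], repeat=3)}

def marg(keep):
    """Marginal over the variables at positions in `keep` (0=A, 1=B, 2=C)."""
    out = {}
    for abc, p in joint.items():
        key = tuple(abc[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

p_ab, p_ac, p_b, p_c = marg([0, 1]), marg([0, 2]), marg([1]), marg([2])

# Step (7): P(A=1 | B=1) should equal P(A=1 | C=1).
p_a_given_b = p_ab[(1, 1)] / p_b[(1,)]
p_a_given_c = p_ac[(1, 1)] / p_c[(1,)]
print(round(p_a_given_b, 6), round(p_a_given_c, 6))  # 0.3 0.3
```

Note that step (7) divides by P(B, C), so this experiment only probes the case where that quantity is positive.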

2 Answers


A ⊥ B | C and A ⊥ C | B, is A ⊥ B and A ⊥ C?

No. For example, let $B=C$. The conditional independence statements then hold trivially, but $A$ and $B$ don't have to be independent in the first place.
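To make the $B=C$ counterexample concrete, here is a quick numeric check (the specific numbers and the helper functions are my own choices):

```python
from itertools import product

# Counterexample with B = C always, and A correlated with B
# (the specific numbers are my own choice).
joint = {
    (1, 1, 1): 0.4, (0, 1, 1): 0.1,
    (1, 0, 0): 0.1, (0, 0, 0): 0.4,
}  # all outcomes with B != C have probability 0
full = {abc: joint.get(abc, 0.0) for abc in product((0, 1), repeat=3)}

def cond_indep(i, j, k):
    """Check X_i ⊥ X_j | X_k via P(x_i, x_j | x_k) = P(x_i | x_k) P(x_j | x_k)."""
    for vk in (0, 1):
        p_k = sum(p for abc, p in full.items() if abc[k] == vk)
        if p_k == 0:
            continue  # conditioning event has probability zero
        for vi, vj in product((0, 1), repeat=2):
            p_ij = sum(p for abc, p in full.items()
                       if abc[i] == vi and abc[j] == vj and abc[k] == vk) / p_k
            p_i = sum(p for abc, p in full.items()
                      if abc[i] == vi and abc[k] == vk) / p_k
            p_j = sum(p for abc, p in full.items()
                      if abc[j] == vj and abc[k] == vk) / p_k
            if abs(p_ij - p_i * p_j) > 1e-9:
                return False
    return True

def indep(i, j):
    """Check marginal independence X_i ⊥ X_j."""
    for vi, vj in product((0, 1), repeat=2):
        p_ij = sum(p for abc, p in full.items() if abc[i] == vi and abc[j] == vj)
        p_i = sum(p for abc, p in full.items() if abc[i] == vi)
        p_j = sum(p for abc, p in full.items() if abc[j] == vj)
        if abs(p_ij - p_i * p_j) > 1e-9:
            return False
    return True

print(cond_indep(0, 1, 2), cond_indep(0, 2, 1))  # True True: both premises hold
print(indep(0, 1))                               # False: A and B are dependent
```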

it seems that in order for A ⊥ B and A ⊥ C, it must be true that either (a) everything is independent or (b) B = C

No. For example, $B$ and $C$ can be different events: every pair of events can be pairwise independent while $A, B, C$ are not mutually independent (which is how I understood your 'everything is independent' idea).
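The standard construction of a pairwise-independent but not mutually independent triple is the XOR trick; the sketch below is my own illustration of that distinction (note it is not meant to satisfy the conditional premises of the question):

```python
from itertools import product

# Pairwise-independent but not mutually independent triple (standard XOR
# construction, my own illustration): A, B are fair coins, C = A XOR B.
joint = {(a, b, a ^ b): 0.25 for a, b in product((0, 1), repeat=2)}
full = {abc: joint.get(abc, 0.0) for abc in product((0, 1), repeat=3)}

def p(**fix):
    """Probability that the named variables take the given values."""
    idx = {'a': 0, 'b': 1, 'c': 2}
    return sum(pr for abc, pr in full.items()
               if all(abc[idx[k]] == v for k, v in fix.items()))

# Each pair is independent ...
assert abs(p(a=1, b=1) - p(a=1) * p(b=1)) < 1e-9
assert abs(p(a=1, c=1) - p(a=1) * p(c=1)) < 1e-9
assert abs(p(b=1, c=1) - p(b=1) * p(c=1)) < 1e-9
# ... but the triple is not mutually independent:
print(p(a=1, b=1, c=1), p(a=1) * p(b=1) * p(c=1))  # 0.0 0.125
```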

gunes

We have probabilities for 8 different situations

$$\begin{array}{rrrr} P(& A,& B,& C) \\ P(&!A,& B,& C) \\ P(& A,&!B,& C) \\ P(&!A,&!B,& C) \\ P(& A,& B,&!C) \\ P(&!A,& B,&!C) \\ P(& A,&!B,&!C) \\ P(&!A,&!B,&!C) \\ \end{array}$$

The two conditional independence statements give four equations that restrict these values:

$$A \perp B | C \\ \begin{array}{rcl} \frac{P(A,!B,!C)}{P(!A,!B,!C)} = \frac{P(A,B,!C)}{P(!A,B,!C)}\\ \frac{P(A,!B,C)}{P(!A,!B,C)} = \frac{P(A,B,C)}{P(!A,B,C)}\\ \end{array} \\ A \perp C | B \\ \begin{array}{rcl} \frac{P(A,B,C)}{P(!A,B,C)} = \frac{P(A,B,!C)}{P(!A,B,!C)}\\ \frac{P(A,!B,C)}{P(!A,!B,C)} = \frac{P(A,!B,!C)}{P(!A,!B,!C)}\\ \end{array} \\ $$

Because some ratios appear in both sets, we can chain them together:

$$\frac{P(A,B,C)}{P(!A,B,C)} = \frac{P(A,!B,C)}{P(!A,!B,C)} = \frac{P(A,B,!C)}{P(!A,B,!C)} = \frac{P(A,!B,!C)}{P(!A,!B,!C)}$$

This means there are effectively three equations, but they tie together all the odds for $A$ versus $!A$: the odds are the same under all four possible conditions of $B$ and $C$, and therefore $A$ is independent of $B$ and independent of $C$. (Note that this argument assumes all eight probabilities are strictly positive, so that the ratios are well defined; that rules out degenerate cases such as $B = C$.)

Thus, for a strictly positive distribution, A ⊥ B | C and A ⊥ C | B implies A ⊥ B and A ⊥ C.

More strongly, for a strictly positive distribution:

A ⊥ B | C and A ⊥ C | B if and only if A is jointly independent of (B, C) — which in turn implies A ⊥ B and A ⊥ C.


Given the condition "A ⊥ B | C and A ⊥ C | B" we can express all 8 values in terms of only 4 values.

We use $P(B)$, $P(C)$, $P(B,C)$ and $P(A)$ to express all values.

With the additional derived values $$\begin{array}{rcl} P(!B,!C) &=& 1-P(B)-P(C)+P(B,C)\\ P(B,!C) &=& P(B) - P(B,C)\\ P(!B,C) &=& P(C) - P(B,C)\\ P(!A) &=& 1 - P(A)\end{array}$$

you get the following

$$\begin{array}{rrrrl} P(& A,& B,& C) & = & P(A) \cdot P(B,C)\\ P(&!A,& B,& C) & = & P(!A) \cdot P(B,C)\\ P(& A,&!B,& C) & = & P(A) \cdot P(!B,C)\\ P(&!A,&!B,& C) & = & P(!A) \cdot P(!B,C)\\ P(& A,& B,&!C) & = & P(A) \cdot P(B,!C)\\ P(&!A,& B,&!C) & = & P(!A) \cdot P(B,!C)\\ P(& A,&!B,&!C) & = & P(A) \cdot P(!B,!C)\\ P(&!A,&!B,&!C) & = & P(!A) \cdot P(!B,!C)\\ \end{array}$$
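A quick numeric check of this parametrization (the parameter values 0.3, 0.5, 0.6, 0.4 below are my own; any strictly positive, internally consistent choice works). It rebuilds the 8 joint values from the four free parameters and confirms both the constant odds and the marginal independences:

```python
from itertools import product

# Rebuild the 8 joint values from the four free parameters (values my own;
# they must yield a valid, strictly positive distribution — these do).
P_A, P_B, P_C, P_BC = 0.3, 0.5, 0.6, 0.4

q = {(1, 1): P_BC,
     (1, 0): P_B - P_BC,
     (0, 1): P_C - P_BC,
     (0, 0): 1 - P_B - P_C + P_BC}
pa = {1: P_A, 0: 1 - P_A}
joint = {(a, b, c): pa[a] * q[(b, c)] for a, b, c in product((0, 1), repeat=3)}

def odds(b, c):
    """Odds of A vs !A given B=b, C=c; the derivation says this is constant."""
    return joint[(1, b, c)] / joint[(0, b, c)]

print({bc: round(odds(*bc), 6) for bc in q})  # same odds in all four cells

# Marginal independence of A from B and from C follows:
p_a1b1 = sum(joint[(1, 1, c)] for c in (0, 1))
p_a1c1 = sum(joint[(1, b, 1)] for b in (0, 1))
assert abs(p_a1b1 - P_A * P_B) < 1e-9
assert abs(p_a1c1 - P_A * P_C) < 1e-9
```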


it seems that in order for A ⊥ B and A ⊥ C, it must be true that either (a) everything is independent or (b) B = C.

No. Based on the above, the parameters $P(B)$, $P(C)$ and $P(B,C)$ are free: $B$ and $C$ can have any joint distribution (they may well be independent of each other) and do not need to satisfy $B = C$, nor does 'everything' need to be independent.