Let $C$, $B$, and $A$ be events in the same probability space, such that $A$ and $B$ are independent and $P(A \cap C) > 0$, $P(B \cap C) > 0$.
Prove or disprove:
$P(A \cap B|C) = P(A|C)P(B|C).$
No, this is not true in general, as you can see from a simple counterexample:
Toss two independent fair coins.
Event $A$: coin 1 lands heads, so $P(A) = 0.5$.
Event $B$: coin 2 lands heads, so $P(B) = 0.5$.
Event $C$: at least one coin lands heads, so $C = A \cup B$.
Conditioning on $C$ rules out only the outcome TT, leaving three equally likely outcomes (HH, HT, TH). Hence
$P(A \cap B \mid C) = 1/3$
$P(A \mid C) = 2/3$
$P(B \mid C) = 2/3$
So $P(A \cap B \mid C) = 1/3 \neq 4/9 = P(A \mid C) \times P(B \mid C)$.
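If you want to check the arithmetic, here is a minimal Python sketch (not part of the original answer) that enumerates the four equally likely outcomes and computes the conditional probabilities directly:

```python
from itertools import product

# The four equally likely outcomes of two fair coins: (coin1, coin2), 1 = heads.
outcomes = list(product([0, 1], repeat=2))

def prob(event):
    """Probability of `event` under the uniform distribution on outcomes."""
    return sum(1 for w in outcomes if event(w)) / len(outcomes)

def cond(event, given):
    """P(event | given) = P(event and given) / P(given)."""
    return prob(lambda w: event(w) and given(w)) / prob(given)

A = lambda w: w[0] == 1                # coin 1 is heads
B = lambda w: w[1] == 1                # coin 2 is heads
C = lambda w: w[0] == 1 or w[1] == 1   # at least one head, i.e. A ∪ B

print(cond(lambda w: A(w) and B(w), C))  # 1/3 ≈ 0.333...
print(cond(A, C) * cond(B, C))           # (2/3)^2 = 4/9 ≈ 0.444...
```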
This example is an instance of Berkson's paradox (also called Berkson's bias): if you have two independent events, conditioning on a third event that depends on both can induce a correlation between them.
Here's a drawing if it makes it easier to follow: $A$ is the blue area, $B$ is the red area, and $C$ is the total shaded area. $A$ and $B$ are independent, but if we condition on $C$ (being in the shaded area), then $A$ and $B$ are no longer independent: if we know $A$ is false, then $B$ must be true.
A related question is "If $X, Y$ are independent of $Z$, is $P(X|Y, Z) = P(X|Y)$?" The example there, $C = \mathrm{XOR}(A, B)$, is a simple counterexample to this question as well:
| $A$ | $B$ | $C$ | probability |
|-----|-----|-----|-------------|
| 0   | 1   | 1   | 1/4         |
| 1   | 0   | 1   | 1/4         |
| 0   | 0   | 0   | 1/4         |
| 1   | 1   | 0   | 1/4         |
Then
$$P(A=1 \text{ and } B=1|C=1) = 0$$
whereas
$$P(A=1|C=1) \cdot P(B=1|C=1) = 0.5 \cdot 0.5 = 0.25$$
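The same enumeration check works here; this is again a sketch under the assumption of the XOR table above, not code from the original answer:

```python
from itertools import product

# Four equally likely (A, B) pairs; C = XOR(A, B), matching the table above.
rows = [(a, b, a ^ b) for a, b in product([0, 1], repeat=2)]

# Restrict to the rows where C = 1 (all rows are equally likely).
given_c1 = [(a, b, c) for a, b, c in rows if c == 1]

p_ab = sum(1 for a, b, _ in given_c1 if a == 1 and b == 1) / len(given_c1)
p_a  = sum(1 for a, _, _ in given_c1 if a == 1) / len(given_c1)
p_b  = sum(1 for _, b, _ in given_c1 if b == 1) / len(given_c1)

print(p_ab)       # 0.0
print(p_a * p_b)  # 0.25
```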