Suppose I have three random variables A, B, and C, and I know that A depends on (B, C). Can I always deduce that A depends on B and also that A depends on C? In other words, does it imply that A is neither independent of B nor independent of C?
I tried to find my answer using the definition: if A and (B, C) are independent, then $p(A \mid B, C) = p(A)$, and therefore $\frac{p(A \cap (B, C))}{p(B, C)} = p(A)$. Then, to argue by contradiction, I supposed that A and B are independent, so $p(A \mid B) = \frac{p(A \cap B)}{p(B)} = p(A)$. But I cannot continue the proof from here, since the equation $\frac{p(A \cap (B, C))}{p(B, C)} = p(A)$ requires something about $p(B, C)$, not about A and B.
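In hindsight, maybe the cleaner route for that direction is to marginalize rather than argue by contradiction; a sketch, assuming discrete variables so that I can sum over the values $c$ of $C$:

$$p(A, B) = \sum_{c} p(A, B, c) = \sum_{c} p(A)\, p(B, c) = p(A) \sum_{c} p(B, c) = p(A)\, p(B),$$

where the second equality uses the assumed joint independence $p(A, B, C) = p(A)\, p(B, C)$. If this is right, independence of A from (B, C) does force independence of A from B (and, by the same argument, from C).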
I also know that A, B, and C are mutually independent if they are pairwise independent and, in addition, A is independent of (B, C), B is independent of (A, C), and C is independent of (A, B).
Based on this, is the following deduction correct:
If I know that A is independent of (B, C), we must have
A independent of B, and
A independent of C.
Therefore, if either of these two conditions fails, then A can never be independent of (B, C) jointly. Thus, when I know that A depends on (B, C), I may still have that A is independent of B.
Is this sentence correct?
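To convince myself, I wrote a small numerical check (a toy example of my own construction, assuming fair coins): let B and C be independent fair coins and let A = B XOR C, so that A is a deterministic function of the pair (B, C).

```python
from itertools import product

# B and C are independent fair coins, and A = B XOR C, so A is
# completely determined by the pair (B, C).
p = {}
for b, c in product([0, 1], repeat=2):
    a = b ^ c
    p[(a, b, c)] = 0.25  # each (b, c) pair has probability 1/4

def marginal(keep):
    """Marginal distribution over the coordinates listed in `keep`."""
    m = {}
    for outcome, prob in p.items():
        key = tuple(outcome[i] for i in keep)
        m[key] = m.get(key, 0.0) + prob
    return m

pA, pB, pBC = marginal([0]), marginal([1]), marginal([1, 2])
pAB = marginal([0, 1])

# Check A independent of B: p(a, b) = p(a) p(b) for all a, b.
print(all(abs(pAB.get((a, b), 0.0) - pA[(a,)] * pB[(b,)]) < 1e-12
          for a, b in product([0, 1], repeat=2)))        # prints True

# Check A independent of (B, C): p(a, b, c) = p(a) p(b, c) for all a, b, c.
print(all(abs(p.get((a, b, c), 0.0) - pA[(a,)] * pBC[(b, c)]) < 1e-12
          for a, b, c in product([0, 1], repeat=3)))      # prints False
```

The output (True, then False) suggests that A is indeed independent of B (and, by symmetry, of C) even though A depends on (B, C) jointly, which seems to support the sentence above.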
This is where I got confused. How is that possible? Figure (d) in this answer shows that in a v-structure, Z depends on (X, Y), and also Z depends on X and Z depends on Y. How can I imagine a similar picture in which Z depends on (X, Y) and Z depends on X, but Z is independent of Y?
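While writing this, I tried to construct such a case myself (again a toy example of my own, assuming fair coins): let Z be an exact copy of X and let Y be a separate, independent coin. Checking it numerically in the same way:

```python
from itertools import product

# X and Y are independent fair coins, and Z is simply a copy of X,
# so Z ignores Y entirely.
p = {}
for x, y in product([0, 1], repeat=2):
    z = x
    p[(z, x, y)] = 0.25

def marginal(keep):
    """Marginal distribution over the coordinates listed in `keep`."""
    m = {}
    for outcome, prob in p.items():
        key = tuple(outcome[i] for i in keep)
        m[key] = m.get(key, 0.0) + prob
    return m

def independent(joint, left, right, n_right):
    """Check p(z, rest) = p(z) p(rest) over all binary outcomes."""
    return all(abs(joint.get((z,) + rest, 0.0)
                   - left[(z,)] * right[rest]) < 1e-12
               for z in [0, 1]
               for rest in product([0, 1], repeat=n_right))

pZ, pX, pY, pXY = marginal([0]), marginal([1]), marginal([2]), marginal([1, 2])
pZX, pZY = marginal([0, 1]), marginal([0, 2])

print(independent(pZX, pZ, pX, 1))   # False: Z depends on X
print(independent(pZY, pZ, pY, 1))   # True:  Z is independent of Y
print(independent(p,   pZ, pXY, 2))  # False: Z depends on (X, Y)
```

If I am reading the output correctly (False, True, False), this gives Z depending on (X, Y) and on X while being independent of Y, which seems to be exactly the situation I am struggling to picture.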
(Sorry if my notation is not statistically precise.)