
I know that if $A$ and $B$ are independent, the independence is preserved for $A^c$ and $B^c$, where $c$ is a constant. I am wondering whether the same applies when the random variables are dependent.

I have been trying to check the relationship between two random variables $A$ and $B$. Using MATLAB and generating millions of samples, I get $\text{cov}(A,B) = 0$, which tells me that $A$ and $B$ could be either independent or dependent with a non-linear relation. Then I decided to compute $\text{cov}(A^2,B^2)$, and it is not 0.
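
For concreteness, here is a minimal MATLAB sketch of the kind of check I ran. My actual $A$ and $B$ come from a different model, so the stand-in pair below (with $B = S \cdot A$ for a random sign $S$ independent of $A$) is only meant to reproduce the same behaviour: zero covariance, but a dependency that shows up in the squares.

    % Minimal sketch with stand-in variables (not my actual A and B):
    % B = S.*A with S a random sign independent of A, so A and B are
    % dependent but uncorrelated, while A.^2 and B.^2 are correlated.
    n = 1e6;
    A = randn(n, 1);
    S = 2*(rand(n, 1) > 0.5) - 1;   % random +/-1, independent of A
    B = S .* A;

    cAB   = cov(A, B);              % 2x2 matrix; off-diagonal entry is the sample covariance
    cA2B2 = cov(A.^2, B.^2);
    fprintf('cov(A,B)     ~ %g\n', cAB(1, 2));    % close to 0
    fprintf('cov(A^2,B^2) ~ %g\n', cA2B2(1, 2));  % clearly nonzero (about 2, since B.^2 = A.^2)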

I know that if $\text{cov}(A^2,B^2) \ne 0$, then $A^2$ and $B^2$ are dependent, and I am guessing that the dependency is preserved for $\sqrt{A^2}$ and $\sqrt{B^2}$.

  1. Is my guess correct? I suppose taking the square root cannot make them independent.

  2. If my guess is correct, would it also be correct to interpret the square root as transforming a linear dependency into a non-linear one? The rationale would be that $\text{cov}(A^2,B^2) \ne 0$ indicates a linear dependency, while $\text{cov}(A,B) = 0$ indicates that any dependency between $A$ and $B$ must be non-linear.

  • If you know that $A, B$ independent implies $A^2, B^2$ independent, then the contrapositive of this statement is $A^2, B^2$ dependent implies $A, B$ dependent. – mrepic1123 Jan 20 '24 at 17:50
  • Let $\rho(X,Y)=0$ when $(X,Y)$ are uncorrelated and otherwise let $\rho(X,Y)=1.$ By graphing the values you should have little difficulty constructing simple, small datasets with all four possible combinations of $\rho(A,B)$ and $\rho(A^2,B^2).$ Thus there is no universal deduction you can make about one of $\rho(A,B)$ and $\rho(A^2,B^2)$ given the other. – whuber Jan 20 '24 at 18:35
  • In light of the comment by @mrepic1123, https://stats.stackexchange.com/questions/94872 answers the question in your title. But the meaning you attach to "dependent" is unclear: do you really mean in the usual sense of random variables or are you focusing on some measure of linearity? – whuber Jan 21 '24 at 00:15
  • @mrepic1123 Thank you for your answer. But could you point me anywhere that could back up this claim? I tend to believe it is correct, but the proofs I have seen always deal with independent RVs only. Just like the link provided by whuber, it proves that if A, B are independent, then f(A) and g(B) are also independent. No mention of the dependent case. – Roberto Jan 21 '24 at 17:07
  • @whuber Thanks a lot for the comment. I mean dependency in the usual sense of RVs. Dependency between two RVs can be either linear or non-linear. Only the linear type is caught by the Pearson correlation coefficient. – Roberto Jan 21 '24 at 17:14
  • The usual meaning of independence of random variables has nothing to do with linearity, which is why it is so strange to see your linear-nonlinear distinction arise in that context. – whuber Jan 21 '24 at 17:23
  • Let me clarify. I mean linear or non-linear for dependency (non-independence) only. Two RVs could be related by some linear or non-linear relation. Sorry, but I didn't catch whether you agree with my guess in the original post. From your 1st comment I believe you disagree, but your 2nd comment provides a link that says if A, B are independent you will not be able to construct datasets that make them dependent by applying some function to A and B, which seems to go against your 1st comment. Could you clarify whether you agree that if two RVs are dependent, taking sqrt will not make them independent? – Roberto Jan 21 '24 at 17:43
  • @Roberto are you asking me for a proof of the contrapositive? – mrepic1123 Jan 22 '24 at 18:18
  • @mrepic1123 Thank you. I got your original explanation! – Roberto Jan 25 '24 at 05:44

1 Answer


You can use contraposition: from $X \rightarrow Y$ it follows that $\neg Y \rightarrow \neg X$.

So if it is given that

$$A,B \text{ independent} \rightarrow A^2,B^2 \text{ independent}$$

Then

$$A^2,B^2 \text{ dependent} \rightarrow A,B \text{ dependent}$$

This is an indirect proof, using contraposition. I am not sure whether a more direct proof is possible. The square is a many-to-one (non-injective) transformation, and I suspect that an indirect proof will always be needed.


A related implication does not always hold:

$$A,B \text{ dependent} \rightarrow A^2,B^2 \text{ dependent}$$

Counterexample: let $A \sim N(0,1)$ and $C \sim N(0,1)$ be independent normally distributed variables and define $B := |C| \cdot \text{sign}(A)$. Then $A$ and $B$ are dependent (they always have the same sign), but $A^2$ and $B^2$ are not, since $B^2 = C^2$ is independent of $A^2$.
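
A quick simulation illustrates this numerically. The following is only a sketch, written in MATLAB to match the tool mentioned in the question:

    % Sketch of the counterexample: A ~ N(0,1), C ~ N(0,1) independent,
    % B = |C|.*sign(A). A and B always share the same sign (dependent),
    % while A.^2 and B.^2 = C.^2 are independent.
    n = 1e6;
    A = randn(n, 1);
    C = randn(n, 1);
    B = abs(C) .* sign(A);

    cAB   = cov(A, B);        % off-diagonal entry clearly nonzero (about 2/pi)
    cA2B2 = cov(A.^2, B.^2);  % off-diagonal entry close to zero
    fprintf('cov(A,B)     ~ %g\n', cAB(1, 2));
    fprintf('cov(A^2,B^2) ~ %g\n', cA2B2(1, 2));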