
I have the following question. At first it seemed very silly, but after thinking about it, I found myself struggling.

Given a random variable $X$, I should decide whether the following statements are right or wrong and explain why:

1 - $X$ and $X^2$ are always independent.

2 - $X$ and $X^2$ are never independent.

3 - $X$ and $X^2$ are always correlated.

4 - $X$ and $X^2$ are never correlated.

Here are my answers. I was able (I think) to answer the independence part, but not the correlation part.

1 - False: since $X^2$ could only be achieved with $X$ and $-X$, it is dependent on $X$.

2 - False: As said before, we can get $X^2$ using $-X$. This way $X^2$ and $X$ are independent.

3 & 4 I don't have a clue.

Could you guys help me with it? Did I answer questions 1 and 2 right? Any explanation in words or via a mathematical approach would help.

Thank you!

CORy
  • What definition of correlation is being used? The default is usually Pearson correlation unless another form is specified, but the definition will matter. (Possible/likely duplicate.) – Dave Feb 12 '23 at 15:59
  • Please add the self-study tag and read its tag-wiki info https://stats.stackexchange.com/tags/self-study/info. I couldn't follow your reasoning for 2, nor how it leads you to the answer you gave; maybe you should clarify that. For "always/never" type statements, identifying simple counterexamples is often a useful strategy. – Glen_b Feb 12 '23 at 16:07
  • If you want to say "false" for each of these, then you can show particular distributions for $X$ as counterexamples. There are also examples, such as $X=\pm1$ with equal probability, where the correlation is not defined. – Henry Feb 12 '23 at 16:11
  • For (3) and (4), a variant of @Henry's suggestion is worth considering: let $X$ have a uniform distribution on $\{-1,0,1\}$. What is the correlation coefficient of $(X,X^2)$? – whuber Feb 12 '23 at 16:42
  • Think of $X\sim Uniform[0,0.1]$ and try to calculate the correlation coefficient, versus $X\sim Uniform[-0.1,0.1]$. – Aksakal Feb 12 '23 at 17:07
  • @Aksakal I didn't get it... I generated random numbers from the two uniform distributions; now what? – CORy Feb 12 '23 at 19:53
  • @Glen_b Providing counterexamples is what I tried doing here; I added $X = -1$ as an example. – CORy Feb 12 '23 at 19:54
  • You also have to know some basic facts about correlation, covariance and expectation. Consider any distribution for $X$ that's symmetric about 0; look at the contributions to $E(X \cdot X^2)$ from the positive and negative halves and hence write down the covariance; then get the correlation from that. Then consider a different example not symmetric about $0$ (the chance that you'll hit something in the second case that isn't a counterexample is quite small, but if it happens, try another). – Glen_b Feb 12 '23 at 20:57
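Putting these comments into a quick R sketch (my own illustration of Aksakal's and Glen_b's suggestion; the values in the comments are approximate): the sample correlation between $x$ and $x^2$ is near zero for a distribution symmetric about zero, but clearly nonzero for an asymmetric one.

set.seed(2023)                  # for reproducibility
N <- 1e5
x_sym  <- runif(N, -0.1, 0.1)   # Uniform[-0.1, 0.1], symmetric about 0
x_asym <- runif(N, 0, 0.1)      # Uniform[0, 0.1], not symmetric about 0
cor(x_sym, x_sym^2)             # approximately 0
cor(x_asym, x_asym^2)           # approximately 0.97, clearly nonzero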

2 Answers


Just to spell it out in more detail than the comments, here's one counterexample for 3. Let $ X \sim U[-1,1] $ and let $ Y = X^2 $. Since $\mathbb{E}[X] = 0$, $$ \begin{align*} \mathbb{C}\text{ov}(X,Y) &= \mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y] \\ &= \mathbb{E}[X^3] \\ &= \int_{-1}^1 \frac{x^3}{2}\, dx \\ &= 0. \end{align*} $$ For question 2, you can basically use any example where squaring leads to a constant. Remember that a constant is independent of all random variables, including itself.
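As a quick sanity check in R (a simulation sketch, not part of the proof; the exact covariance is $0$ and the sample value only comes close):

set.seed(2023)
x <- runif(1e6, -1, 1)  # X ~ U[-1, 1]
y <- x^2                # Y = X^2
cov(x, y)               # near 0, matching the integral above
cor(x, y)               # likewise near 0, so X and Y are uncorrelated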


Since this is a self-study problem, I will only give some helpful hints and leave it to the OP to fill in the gaps (i.e., to do the math).

  1. $X$ and $X^2$ need not be independent. Consider the plot produced by the code below, comparing $X\sim U(0,1)$ with $X^2$. The pattern shows clear structure.


set.seed(2023)       # for reproducibility
N <- 1000
x <- runif(N, 0, 1)  # draw N values from U(0, 1)
plot(x, x^2)         # clear parabolic pattern: knowing x pins down x^2
  2. However, $X$ and $X^2$ can be independent, such as when $X$ has a degenerate distribution with all probability mass on one value.

  3. $X$ and $X^2$ do not have to be correlated, as the plot produced by the code below shows for a random variable $X$ with uniform probability mass on $-1$, $0$, and $1$.


set.seed(2023)                               # for reproducibility
N <- 1000
x <- sample(c(-1, 0, 1), N, replace = TRUE)  # uniform mass on {-1, 0, 1}
plot(x, x^2)                                 # no linear trend: cov(x, x^2) = 0
  4. However, $X$ and $X^2$ can be correlated, as the first example shows.

You have the answers. Now it is up to you to prove my claims using the math instead of just getting intuition from pictures. For the independence claims, rely on the definition of independence, $P(A\cap B) = P(A)P(B)$ for all events $A$ and $B$, and either find $A$ and $B$ such that the equation fails (to show dependence) or show that the equation holds for all events $A$ and $B$ (to show independence). For the correlation claims, rely on the definition of covariance, $\mathbb E\left[ (X-\mathbb E\left[X\right])(Y-\mathbb E\left[Y\right]) \right]$, knowing that for distributions with positive variance like the ones I have given in 3 and 4, nonzero covariance corresponds to nonzero correlation.
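As one way to start on that math, here is a small R sketch (my own, using the covariance definition above) that computes the exact moments for the three-point distribution in 3:

xs <- c(-1, 0, 1)     # support of X
p  <- rep(1/3, 3)     # uniform probability mass
EX  <- sum(p * xs)    # E[X] = 0
EY  <- sum(p * xs^2)  # E[X^2] = 2/3
EXY <- sum(p * xs^3)  # E[X * X^2] = E[X^3] = 0
EXY - EX * EY         # Cov(X, X^2) = 0, hence zero correlation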

Dave
  • Thank you so much! This sure helps a lot. So you're saying all of those statements are false, yes? – CORy Feb 12 '23 at 21:41
  • @CORy I have a counterexample to each statement, yes. Now can you prove them? – Dave Feb 12 '23 at 21:49
  • I think so... I'm only having a hard time with statement 4, that they are never correlated. The rest I think I got right. – CORy Feb 12 '23 at 22:13
  • I have a question about 2. The statement says $X$ and $X^2$ are never independent. You said that to disprove this we can use the degenerate distribution, but in this case $X$ and $X^2$ would be exactly the same, and they would be dependent, right? – CORy Feb 26 '23 at 12:55
  • @CORy I suppose it depends on how independence is defined. If you go with what I gave in the last paragraph, then, amazingly, $X$ and $X^2$ wind up being independent! – Dave Feb 26 '23 at 15:08
  • Could you give me a counterexample for the second statement? I struggle to find any good explanation. – CORy Feb 26 '23 at 20:42
  • @CORy Just work through $P(A\cap B)=P(A)P(B)$. Maybe work with a Bernoulli-distributed random variable $X$ to simplify your calculations, and make that $X$ a degenerate Bernoulli (so $p=0$ or $p=1$). – Dave Feb 26 '23 at 20:52
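A minimal sketch of that hint (my own illustration, assuming the degenerate case $p = 1$, so $X = 1$ with certainty and every event has probability $0$ or $1$):

x <- rbinom(1000, size = 1, prob = 1)  # degenerate Bernoulli: always 1
mean(x == 1 & x^2 == 1)                # P(X = 1 and X^2 = 1) = 1
mean(x == 1) * mean(x^2 == 1)          # P(X = 1) * P(X^2 = 1) = 1, equal
# The factorization P(A ∩ B) = P(A)P(B) holds for every pair of events here.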