
For any distribution, we can subtract two independent random variables and find the distribution of their difference. But what about the reverse? Is the statement below true in general, or at least under some condition on the distribution?

For any probability distribution function, there exists another distribution such that the difference of two independent random variables from the second distribution has the first distribution.
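
The forward direction mentioned above is easy to check by simulation. A minimal sketch, assuming an arbitrary choice of standard normals (the difference of two independent $N(0,1)$ variables is $N(0,2)$):

```python
import numpy as np
from scipy import stats

# Forward direction: the difference of two independent N(0,1) draws
# should be N(0, sqrt(2)). The standard normals are an arbitrary
# illustrative choice, not part of the statement above.
rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = rng.normal(size=100_000)

# A large KS p-value is consistent with X - Y ~ N(0, sqrt(2)).
print(stats.kstest(x - y, stats.norm(scale=np.sqrt(2)).cdf))
```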

  • For a counterexample see https://stats.stackexchange.com/questions/346269/uniform-pdf-of-the-difference-of-two-r-v. So the answer is NO. Maybe ask for examples of distributions where this is possible? – kjetil b halvorsen Mar 08 '23 at 17:57
  • Are you assuming the RVs you are subtracting are independent or not? – whuber Mar 08 '23 at 18:01
  • @kjetilbhalvorsen e.g. the logistic distribution: the difference of two independent Gumbel random variables is logistic (checked in the simulation sketch after these comments). – Mohammad Mar 08 '23 at 18:04
  • @whuber yes, I assume independent and added it into the question – Mohammad Mar 08 '23 at 18:05
  • Then consider the question in the other direction: given any distribution family, what family of distributions is generated by looking at differences within that family? That gives you a full characterization of the distributions you can start with. This concept is extremely close to that of an indecomposable distribution, because the difference of two independent random variables is the sum of one with the negative of the other. – whuber Mar 08 '23 at 19:53
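
The Gumbel/logistic example from the comments can also be checked numerically. A minimal sketch, assuming i.i.d. standard Gumbel draws (the sample size and seed are arbitrary):

```python
import numpy as np
from scipy import stats

# If X, Y are i.i.d. standard Gumbel, then X - Y should follow the
# standard logistic distribution.
rng = np.random.default_rng(1)
x = rng.gumbel(size=100_000)
y = rng.gumbel(size=100_000)

# A large KS p-value is consistent with X - Y being standard logistic.
print(stats.kstest(x - y, stats.logistic.cdf))
```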

1 Answer


This is not true in general if the two variables from the second distribution are independent.

For example, the uniform distribution over $[-1,1]$ cannot be expressed as the difference of two i.i.d. random variables.

To see this, consider the characteristic function of $Z\sim U[-1,1]$: $$\varphi_Z(\theta) = \frac{\sin \theta}{\theta}.$$ If $X$ and $Y$ are i.i.d., then $\varphi_X = \varphi_Y$ and $\varphi_{-Y}(\theta) = \varphi_Y(-\theta) = \overline{\varphi_Y(\theta)}$, so $$\varphi_{X-Y}(\theta)=\varphi_X(\theta)\,\varphi_Y(-\theta)=\varphi_X(\theta)\,\overline{\varphi_X(\theta)}=|\varphi_X(\theta)|^2.$$

If $Z$ had the same distribution as $X-Y$, we would have $$\frac{\sin \theta}{\theta} = |\varphi_X(\theta)|^2,$$ which is a contradiction, since the left-hand side is negative for some $\theta$ (e.g. any $\theta \in (\pi, 2\pi)$) whilst the right-hand side is nonnegative.
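The contradiction is easy to see numerically. A minimal sketch, where the choice of $\theta = 4.5$ and of an exponential sample for the empirical comparison are arbitrary assumptions for illustration:

```python
import numpy as np

# The CF of U[-1,1] at theta = 4.5 (a point in (pi, 2*pi)) is
# sin(theta)/theta, which is negative there.
theta = 4.5
print(np.sin(theta) / theta)  # about -0.217

# By contrast, the empirical CF of X - Y for i.i.d. X, Y (exponential
# here, as an arbitrary example) estimates |phi_X(theta)|^2, which is
# real and nonnegative up to Monte Carlo error.
rng = np.random.default_rng(2)
x = rng.exponential(size=100_000)
y = rng.exponential(size=100_000)
print(np.mean(np.exp(1j * theta * (x - y))))  # ~ 1/(1 + theta^2) > 0
```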

This example is exercise E16.1 from Williams's Probability with Martingales.

  • Great, can one think of a counterexample with a continuous and differentiable distribution too? – Mohammad Mar 08 '23 at 18:07
  • @Mohammad The distribution function of $Z$ is already continuous and differentiable. What is the point of asking "a counterexample of a continuous and differentiable distribution"? – Zhanxiong Mar 08 '23 at 18:31
  • I mean a continuous and differentiable pdf over the entire real line. For the cdf you are right about continuity, but it is not differentiable at $-1$ and $+1$. – Mohammad Mar 08 '23 at 19:16
  • The pdf of $Z$, which is $0.5\,I_{(-1, 1)}(x)$, is also continuous and differentiable everywhere on the real line except at $\pm 1$. In measure-theoretic probability theory, there is no essential difference between continuous/differentiable everywhere and continuous/differentiable almost everywhere. – Zhanxiong Mar 08 '23 at 20:03
  • So can you think of a distribution with an everywhere (not just almost everywhere) differentiable pdf for which we can show the above? – Mohammad Mar 09 '23 at 02:43
  • I'm struggling to come up with an example. The easiest would be a distribution which is symmetric about $0$, has a continuous and differentiable pdf over $\mathbb{R}$, and has a characteristic function which is negative at some point. Then we could use exactly the same argument as above, but I can't think of one. – Joseph Basford Mar 09 '23 at 07:12