
If $X$ and $Y$ are two random variables with probability density functions which are symmetric around their respective means, their sum, $X+Y$, has a probability density function which is symmetric around its mean as well.
Could someone offer a proof outline? Thanks.


Edit:
whuber's example (see the comments) shows that simply specifying symmetric marginals does not guarantee a symmetric sum.
Dilip Sarwate's answer gives two conditions, each of which is sufficient by itself: circular symmetry of the joint distribution, and independence of $X$ and $Y$.

Radu
  • Independent random variables? – Stéphane Laurent Apr 24 '14 at 10:25
  • @StéphaneLaurent Not necessarily, but if this assumption simplifies the proof considerably, of course I would be interested to see it. – Radu Apr 24 '14 at 13:05
  • The result is not true in general. Let the values $(-1,1), (0,-1), (1,1)$ for $(X,Y)$ have probabilities $1/4, 1/2, 1/4$ respectively. Then both variables are symmetric around their means of $0$, yet $X+Y$, which takes only the values $0, -1,$ and $2$ with nonzero probability, cannot be symmetric. Independence is a necessary (but not sufficient) condition. – whuber Apr 24 '14 at 17:17
  • @whuber Why would independence be insufficient? – Radu Apr 24 '14 at 17:29
  • Sorry; I reversed the terms. Independence suffices to prove symmetry of the sum but it is not necessary: it is possible for the sum to be symmetric even when the variables are not independent. – whuber Apr 24 '14 at 17:34
  • Circular symmetry of the joint distribution is uninteresting in this situation, because it immediately implies the joint distribution is invariant under all reflections, including the reflection around the line $y=-x$, which alone suffices for the conclusion. A little more interesting is the fact that when the joint distribution is invariant under both the $x$ and $y$ reflections, then it will be invariant under a rotation through $\pi$, easily implying the sum is symmetric. These still are not necessary conditions, though. – whuber Apr 24 '14 at 18:35
  • @whuber To my mind, the fact that the sum of dependent random variables with a circularly symmetric joint distribution has a symmetric distribution is of interest; it is the case of independent random variables with a circularly symmetric joint distribution (which necessarily have normal marginal densities) that is uninteresting, in the sense that it is the independence that suffices for the result about the sum. Yes, neither independence nor circular symmetry is a necessary condition (as the last paragraph of my edited answer shows), but they are sufficient conditions, as the OP says. – Dilip Sarwate Apr 24 '14 at 21:07

2 Answers


This looks a lot like a homework exercise, but nonetheless, here goes.

If $X$ and $Y$ are zero-mean independent continuous random variables, then for any $z$, $f_{X+Y}(z)$ is given by the convolution of the marginal densities. Thus, $$\begin{align} f_{X+Y}(z) &= \int_{-\infty}^\infty f_X(z-y)f_Y(y)\,\mathrm dy \tag{1}\\ &= \int_{-\infty}^\infty f_X(y-z)f_Y(-y)\,\mathrm dy, &\text{symmetry of the densities}\\ &= \int_{-\infty}^\infty f_X(-t-z)f_Y(t)\,\mathrm dt, &\text{substitution: } t = -y\\ &= \int_{-\infty}^\infty f_X((-z)-t)f_Y(t)\,\mathrm dt,\\ &= \int_{-\infty}^\infty f_X((-z)-y)f_Y(y)\,\mathrm dy, &\text{renaming the dummy variable} \tag{2}\\ &= f_{X+Y}(-z) &\text{on comparing (1) and (2)}\tag{3} \end{align}$$ If $X$ and $Y$ have nonzero means $\mu_X$ and $\mu_Y$ respectively and their densities are symmetric about their respective means, then $\hat{X} = X-\mu_X$ and $\hat{Y} = Y - \mu_Y$ can be used in the above proof to show that $\hat{Z} = \hat{X} + \hat{Y} = (X+Y) - (\mu_X+\mu_Y) = Z - \mu_Z$ has a density symmetric about $0$, and so $Z$ has a density symmetric about $\mu_Z$. Alternatively, we can use the outline suggested in @QuantIbex's answer to incorporate the means in the proof itself.
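As a quick numerical sanity check of this result (a sketch, not part of the proof), one can simulate independent symmetric variables with nonzero means and compare the histogram of the sum with its mirror image about $\mu_Z$; the Laplace and uniform distributions below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x = rng.laplace(loc=2.0, scale=1.0, size=n)   # symmetric about mu_X = 2
y = rng.uniform(low=-3.0, high=1.0, size=n)   # symmetric about mu_Y = -1
z = x + y
mu_z = 2.0 + (-1.0)                           # mu_Z = mu_X + mu_Y

# Bin Z and its reflection 2*mu_Z - Z on the same grid; for a symmetric
# density the two histograms should agree up to Monte Carlo noise.
bins = np.linspace(mu_z - 8, mu_z + 8, 81)
h, _ = np.histogram(z, bins=bins, density=True)
h_ref, _ = np.histogram(2 * mu_z - z, bins=bins, density=True)
print(np.max(np.abs(h - h_ref)))  # should be small, on the order of 1e-3
```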

Similar proofs can be written for discrete random variables.
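For instance, in the discrete case the convolution of two PMFs, each symmetric about the midpoint of its support, is a palindrome and hence symmetric about its own midpoint; a minimal sketch with arbitrarily chosen PMFs:

```python
import numpy as np

# PMFs on consecutive integer supports, each symmetric about its midpoint
p_x = np.array([0.2, 0.3, 0.3, 0.2])
p_y = np.array([0.1, 0.25, 0.3, 0.25, 0.1])

p_z = np.convolve(p_x, p_y)         # PMF of X + Y (independent case)
print(np.allclose(p_z, p_z[::-1]))  # True: symmetric about its midpoint
```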

While the result is always true for independent random variables, it can hold for some dependent random variables as well. As an example, see this recently-closed question where it is shown that if $(X,Y)$ is uniformly distributed on the unit disc (and hence the variables have symmetric marginal densities but are not independent), then $X+Y$ also has a symmetric density; in fact, $$f_{X+Y}(z) = \frac{1}{\sqrt{2}}f_X\left(\frac{z}{\sqrt{2}}\right). \tag{4}$$ Indeed, $(4)$ holds whenever $(X,Y)$ has a circularly symmetric joint density (the uniformity of the density in the closed question is not needed). Another nice example (with nonzero means) is the joint density that has value $2$ on the trapezoidal region with vertices $(0,0), (1,1), (\frac 12, 1), (0,\frac 12)$ and on the triangular region with vertices $(\frac 12,0), (1,0), (1,\frac 12)$. It is readily verified that $X$ and $Y$ are $U(0,1)$ random variables whose marginal densities are symmetric about their mean $\frac 12$, and that they are not independent. Nonetheless, the density of their sum is the convolution of the marginal densities and is symmetric about $1$.
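For the unit-disc example, $(4)$ can be checked by Monte Carlo (a sketch, using the known marginal $f_X(x) = \frac{2}{\pi}\sqrt{1-x^2}$ on $[-1,1]$; the sample size and window width are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6
pts = rng.uniform(-1, 1, size=(2 * n, 2))
pts = pts[np.sum(pts**2, axis=1) <= 1][:n]  # rejection-sample the unit disc
z = pts[:, 0] + pts[:, 1]

# Compare the empirical density of Z = X + Y at a few points with the
# right-hand side of (4), using the marginal f_X(x) = (2/pi) sqrt(1 - x^2).
f_x = lambda x: (2 / np.pi) * np.sqrt(np.clip(1 - x**2, 0.0, None))
for z0 in (-1.0, 0.0, 0.5, 1.0):
    eps = 0.01
    emp = np.mean(np.abs(z - z0) < eps) / (2 * eps)
    print(z0, emp, f_x(z0 / np.sqrt(2)) / np.sqrt(2))
```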

Dilip Sarwate

An outline of a proof (in the case where $X$ and $Y$ are independent) is the following.

Denote by $f_X$ and $f_Y$ the density functions of $X$ and $Y$, and by $\mu_X$ and $\mu_Y$ their respective means. Note that $f_X(\mu_X + x) = f_X(\mu_X - x)$ for all $x$ by the symmetry of $f_X$, and similarly $f_Y(\mu_Y + y) = f_Y(\mu_Y - y)$ for all $y$.

Let $Z = X + Y$, and denote by $f_Z$ its density function and by $\mu_Z$ its mean. Obviously, $\mu_Z = \mu_X + \mu_Y$.

It can be shown that $f_Z$ is the convolution $f_X * f_Y$ of $f_X$ and $f_Y$, where $$ (f_X * f_Y)(z) = \int_{-\infty}^\infty f_X(z - y)f_Y(y)\, dy . $$

To prove the symmetry of $f_Z$, show that $f_Z(\mu_Z + z) = f_Z(\mu_Z - z)$ for all $z$ using the convolution $f_X * f_Y$, with an appropriate change of variable such as $y = t + \mu_Y$, and using the symmetry properties of $f_X$ and $f_Y$.
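For concreteness, here is one way the outlined computation goes through (a sketch following the hints above): $$\begin{align} f_Z(\mu_Z + z) &= \int_{-\infty}^\infty f_X(\mu_Z + z - y)\,f_Y(y)\, dy \\ &= \int_{-\infty}^\infty f_X(\mu_X + (z - t))\,f_Y(\mu_Y + t)\, dt, &\text{substitution: } y = t + \mu_Y\\ &= \int_{-\infty}^\infty f_X(\mu_X - (z - t))\,f_Y(\mu_Y - t)\, dt, &\text{symmetry of } f_X \text{ and } f_Y\\ &= \int_{-\infty}^\infty f_X(\mu_X - z - s)\,f_Y(\mu_Y + s)\, ds, &\text{substitution: } s = -t\\ &= \int_{-\infty}^\infty f_X((\mu_Z - z) - u)\,f_Y(u)\, du, &\text{substitution: } u = \mu_Y + s\\ &= f_Z(\mu_Z - z). \end{align}$$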

QuantIbex