I was reading about weight initialization in neural networks (He et al., 2015) when I came across this statement:
"If we let $w_{l-1}$ have a symmetric distribution around zero and $b_{l} = 0$, then $y_{l-1}$ has zero mean and a symmetric distribution around zero."
where $y_{l-1} = w_{l-1}x_{l-1}$ is the product of the layer's weights and its inputs. Here $w_{l-1}$ is drawn from a normal distribution with mean zero, so it is symmetric, and the weights are assumed independent of the inputs. I know how to prove that the mean of this product is zero, but how would I go about showing it is also symmetric? In other words, given two independent random variables $X$ and $Y$ with $X$ symmetric around zero, how can I show that $XY$ is also symmetric around zero?
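To convince myself the claim is plausible before trying to prove it, I ran a quick simulation (a sketch, assuming independence of $X$ and $Y$; the distribution choices here are arbitrary, with $Y$ deliberately asymmetric). The intuition being tested: if $X \stackrel{d}{=} -X$ and $X \perp Y$, then $(X, Y) \stackrel{d}{=} (-X, Y)$, so $XY \stackrel{d}{=} -XY$, which means the tail probabilities $P(XY > t)$ and $P(XY < -t)$ should agree for every $t$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X is symmetric around zero; Y is deliberately asymmetric (exponential)
# and drawn independently of X.
x = rng.normal(0.0, 1.0, size=n)
y = rng.exponential(1.0, size=n)
z = x * y

# If Z = XY is symmetric around zero, Z and -Z share a distribution,
# so the upper and lower tail frequencies should match for any threshold.
for t in (0.5, 1.0, 2.0):
    upper = np.mean(z > t)
    lower = np.mean(z < -t)
    print(f"t={t}: P(Z>t)={upper:.4f}  P(Z<-t)={lower:.4f}")
```

The empirical tails match to within sampling noise, which is consistent with $XY$ being symmetric even though $Y$ itself is not.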