
Suppose $X_1\sim U[a,b]$ and $X_2\sim U[c,d]$ with $a<b<c<d$, and suppose they are independent. I guess that the sum must be uniform, but I don't know how to show it [EDIT: I was wrong].

I would like to show it using convolution (although I'm curious about other solutions as well). I have studied the proof at https://math.stackexchange.com/questions/220201/sum-of-two-uniform-random-variables but I could not adapt it to my case.

Xu Wang
  • You should make it explicit whether the variables are independent. From your mention of convolution I will assume they are. The shape of any convolution is unaffected by shifts in the arguments (so "non-overlapping" makes no difference). – Glen_b Nov 02 '16 at 23:03
  • When you say "I know that the sum must be a uniform (just from guessing)" ... are all (or essentially all) of your guesses true? In several decades of statistics, I've found guessing to be a very unreliable means of making progress (and yet from observation of the guesses people make I expect my intuition is at least slightly better than average; this may be partly why I ended up being a statistician). It makes me wonder what you think the word "know" means; it doesn't mean the same thing as "hope" for example. – Glen_b Nov 02 '16 at 23:03
  • There's an informal solution outlined here that should help you to set up the relevant integrals (in particular, the limits for the various pieces). Any diagonal line in the first image indicates the interval for the integral for a given value of the sum. – Glen_b Nov 02 '16 at 23:07
  • @Glen_b thank you for the helpful feedback to improve the question! You are right, and my usage of "know" was incorrect. I wanted to convey that I had spent time thinking about it, and that even if the answer is "uniform" I'm interested in showing it (not just in the final answer). I actually find guessing and thinking about a problem a lot very useful before working through the math. It is a way that I build up intuition; otherwise, if I only use the math, sometimes I don't build intuition and just know how to solve integrals. I will study the answer you directed me to. Thank you for your time Glen_b! – Xu Wang Nov 03 '16 at 02:10
  • Perhaps it might help your intuition to think about simpler discrete distributions. Say we have a die taking the values $\{1, 2, \ldots, 6\}$ and another "negative die" taking the values $\{-1, -2, \ldots, -6\}$. Now think about how we can obtain $0$ as the sum compared to, say, $5$. Do you believe the sum is uniformly distributed? Would this change if we were working with continuous distributions? – dsaxton Nov 03 '16 at 02:26
  • @dsaxton ah yes, that is very helpful! Thank you for this example; it really helps my intuition. – Xu Wang Nov 03 '16 at 03:16
  • @Xu I won't deny that informed guessing can be useful, but you must be able to apply some critical thinking to it for it to have much value in general. Practice at the formalities will help improve your intuition (not least because you'll come to realize how dangerous untutored intuition can be in statistics and learn to be wary of it without some additional effort). A good second step to an initial intuition is attempting to show why it must be wrong. – Glen_b Nov 03 '16 at 04:16
  • In cases where you don't have solid intuition, learn how to simulate. It can be valuable for building up insights (as long as you understand the tool -- some people fail to appreciate the need to gauge the variability in the results for example and tend to use inadequate sample sizes). If you have access to a package (R is free!) you can simulate and draw (say a histogram of) many thousands of such sums of variates in moments. I would tend to use simulation on average several times a day to gain a bit of insight into whatever issues I am working on or thinking about. – Glen_b Nov 03 '16 at 04:20
  • @Glen_b thank you for such insights. I now see a better path to building intuition. I did not even consider simulating. I already know R (this time I believe I am using the word correctly :)) and love it, so I should have thought of trying that. Actually, I will still do that, I think. I have learned a lot about how to approach a problem from your advice. Thank you for the teachings. – Xu Wang Nov 03 '16 at 14:24
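
Following the simulation suggestion in the comments, here is a minimal R sketch (the limits $0, 1, 2, 5$ are arbitrary illustrative choices, not part of the question):

```r
set.seed(1)

# Dice version of dsaxton's comment: tabulate every possible sum of a
# standard die and a "negative die" -- the counts are clearly not uniform.
table(outer(1:6, -(1:6), "+"))

# Simulate the sum of two independent uniforms with a < b < c < d.
n  <- 1e5
x1 <- runif(n, min = 0, max = 1)   # X1 ~ U[0, 1]
x2 <- runif(n, min = 2, max = 5)   # X2 ~ U[2, 5]
hist(x1 + x2, breaks = 100, freq = FALSE,
     main = "Sum of two independent uniforms", xlab = "x1 + x2")
# The histogram is trapezoidal (ramp, plateau, ramp), not flat.
```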

1 Answer


I guess that the sum must be uniform

It isn't. The middling values of the sum occur in "more" ways than the sums near the ends of the range.

I would like to show it using convolution

Yes, you just write the convolution integral.

$h(z)=(f*g)(z)=\int_{-\infty}^{\infty} f(z-t)\,g(t)\,dt=\int_{-\infty}^{\infty} f(t)\,g(z-t)\,dt$

Note that the $f$ and $g$ in the integral are both constant where they're non-zero, so $f(z-t)g(t)$ is constant on an interval; what drives the value of the integral at a specific value of the sum is the limits of the interval on which the integrand is non-zero.

However, the integral is best split into three ranges because of the corners in the joint distribution (see the leftmost diagram here for the case where the lower limits are both 0; your case works similarly, and you should use a similar diagram).
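
For concreteness, here is what carrying out those three pieces yields, as a sketch under the extra assumption that $b-a \le d-c$ (if instead $b-a > d-c$, swap the roles of the two intervals):

$$h(z)=\begin{cases} \dfrac{z-(a+c)}{(b-a)(d-c)}, & a+c \le z \le b+c \\[1ex] \dfrac{1}{d-c}, & b+c \le z \le a+d \\[1ex] \dfrac{(b+d)-z}{(b-a)(d-c)}, & a+d \le z \le b+d \\[1ex] 0, & \text{otherwise.} \end{cases}$$

This is a trapezoid: two linear ramps around a flat plateau, which is why the sum cannot be uniform. (As a check, the two triangles and the rectangle between them have total area $1$.)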

(although I'm curious about other solutions as well).

In the case of problems very similar to this one, the approach at the link I gave earlier uses simple arguments to derive the distribution for a simpler case than yours, and then suggests how to generalize it to a case very like yours.

More generally, there are a number of approaches based on generating functions, Laplace transforms, or Fourier transforms. They are all closely related, and they can sometimes make a problem easy.

  • Moment generating function: $M_X(t) = E(e^{tX})$
  • Laplace transform: $\mathcal{L}_X(t) = E(e^{-tX})$
  • Characteristic function: $\phi_X(t) = E(e^{itX})$
  • Fourier transform: $\mathcal{F}_X(t) = E(e^{-itX})$

There are several definitions of the Fourier transform in use (see toward the end of the linked section). The one above is one of the forms listed there, chosen because it fits into this scheme of writing them all as very simple expectations; all the other forms work equally well, it's just a matter of consistently using whichever one you like.

There are some other generating functions/transforms we could include in this list. Aside from a change of sign or the inclusion of the constant $i$, the ones above are all closely related; an argument involving one will have a direct counterpart involving another. Note, however, that the last two have a slight advantage: they always exist in a neighborhood of $0$, which the first two might not (making the first two unusable on some problems).

They all have the property that the transform of a convolution is the product of the transforms. E.g., for $X$, $Y$ independent (so the density of $X+Y$ is a convolution), $M_{X+Y}(t)=M_X(t)\,M_Y(t)$.
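
For instance, for $X \sim U[a,b]$ a direct computation gives $M_X(t) = \frac{e^{tb}-e^{ta}}{t(b-a)}$ for $t \ne 0$ (and $M_X(0)=1$), so for the independent uniforms in the question,

$$M_{X_1+X_2}(t) = \frac{\left(e^{tb}-e^{ta}\right)\left(e^{td}-e^{tc}\right)}{t^2(b-a)(d-c)},$$

which one can verify is the MGF of the trapezoidal density that the convolution produces.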

There are two main ways to use them.

  1. Consider moment generating functions, $M_X(t)=E(e^{tX})$. One way to proceed is to calculate $M_X(t)\,M_Y(t)$ and simply recognize the result (perhaps after some simplification) as the MGF of a known density, in much the same way that an applied mathematician, physicist, or engineer might use a table of Laplace transforms to solve a differential equation. In many simple problems (and a few not-so-simple ones) this leads to a solution.

  2. Explicit inversion. Consider, say, a Fourier transform. One computes $\mathcal{F}_X(t)\,\mathcal{F}_Y(t)$ and then applies the corresponding inverse Fourier transform (which depends on the convention you chose) to turn the result back into a density; a rough numerical sketch of this follows.
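
Here is a rough R sketch of that inversion route, using the characteristic functions of the two uniforms (the truncation at $\pm 200$, the grid size, and the helper name `phi_unif` are ad hoc choices for illustration):

```r
# Characteristic function of U[lo, hi]: E(exp(itX)).
phi_unif <- function(t, lo, hi) {
  ifelse(t == 0, 1 + 0i,
         (exp(1i * t * hi) - exp(1i * t * lo)) / (1i * t * (hi - lo)))
}

a <- 0; b <- 1; c <- 2; d <- 5   # illustrative limits with a < b < c < d

# Numerically invert the product of the characteristic functions:
# f(z) = 1/(2*pi) * integral of exp(-itz) phi_X1(t) phi_X2(t) dt,
# approximated by a Riemann sum over a truncated, finely spaced grid.
tgrid <- seq(-200, 200, length.out = 100001)
dt    <- tgrid[2] - tgrid[1]
f_sum <- function(z) {
  vals <- Re(exp(-1i * tgrid * z) *
             phi_unif(tgrid, a, b) * phi_unif(tgrid, c, d))
  sum(vals) * dt / (2 * pi)
}

sapply(c(2.5, 4, 5.5), f_sum)  # approx. 1/6, 1/3, 1/6: the trapezoid again
```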

Glen_b