Suppose that $W_1$ and $W_2$ are i.i.d. with $P(W_i>x)=x^{-1/2}$ for $x\ge 1$, $i=1,2$. How do I show that $P(W_1+W_2>x)=2\sqrt{x-1}/x$ for $x\ge 2$? I know it involves calculus, either differentiation or integration, but I am not sure how to go about it. Any ideas?
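
As a sanity check (not a proof), the formula can be verified by Monte Carlo simulation. The sketch below uses the inverse-transform draw $W_i = U^{-2}$ with $U \sim \mathrm{Uniform}(0,1)$, which has exactly the tail $P(W_i > x) = x^{-1/2}$ for $x \ge 1$; the sample size of 200,000 is an arbitrary choice:

```python
import math
import random

random.seed(0)
n = 200_000

def draw_w():
    # Inverse transform: for U ~ Uniform(0,1), W = U**-2 satisfies
    # P(W > x) = P(U < x**-0.5) = x**-0.5 for x >= 1.
    return random.random() ** -2

for x in [3.0, 5.0, 10.0]:
    # Empirical tail probability of the sum W1 + W2
    empirical = sum(draw_w() + draw_w() > x for _ in range(n)) / n
    # Claimed closed form for x >= 2
    exact = 2 * math.sqrt(x - 1) / x
    print(f"x={x}: empirical={empirical:.4f}, exact={exact:.4f}")
```

The empirical and exact values should agree to roughly two decimal places at this sample size.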

Also, how do I use the above information to show that value at risk is super-additive for all levels $\alpha \in (0,1)$, i.e., that $\operatorname{VaR}_\alpha(W_1+W_2) > \operatorname{VaR}_\alpha(W_1) + \operatorname{VaR}_\alpha(W_2)$? Any ideas?
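
A numeric sketch of the claim, under my own derivation (not from the thread): inverting $F(x) = 1 - x^{-1/2}$ gives $\operatorname{VaR}_\alpha(W_i) = (1-\alpha)^{-2}$, and setting $2\sqrt{x-1}/x = 1-\alpha$ and solving the resulting quadratic (the tail is decreasing on $x \ge 2$, so the larger root applies) gives the VaR of the sum:

```python
import math

def var_single(alpha):
    # Quantile of F(x) = 1 - x**-0.5: solve x**-0.5 = 1 - alpha.
    return (1 - alpha) ** -2

def var_sum(alpha):
    # Solve 2*sqrt(x-1)/x = 1 - alpha for x >= 2. Squaring gives
    # t^2 x^2 - 4x + 4 = 0 with t = 1 - alpha; take the larger root.
    t = 1 - alpha
    return 2 * (1 + math.sqrt(1 - t * t)) / (t * t)

for alpha in [0.9, 0.95, 0.99]:
    lhs = var_sum(alpha)
    rhs = 2 * var_single(alpha)
    print(f"alpha={alpha}: VaR(W1+W2)={lhs:.1f}, "
          f"VaR(W1)+VaR(W2)={rhs:.1f}, super-additive: {lhs > rhs}")
```

Since $\operatorname{VaR}_\alpha(W_1)+\operatorname{VaR}_\alpha(W_2) = 2/(1-\alpha)^2$ while the sum's VaR is $2\bigl(1+\sqrt{1-(1-\alpha)^2}\bigr)/(1-\alpha)^2$, super-additivity holds for every $\alpha \in (0,1)$.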

ikolo
  • Add the self study tag. – Michael R. Chernick Feb 10 '19 at 03:15
    It seems you haven't started by searching (say on sum of random variables); there are numerous posts that discuss the procedure with independent variates (generally by convolution -- e.g. see this pdf). For example, see the following posts on our site ... – Glen_b Feb 10 '19 at 06:03
  • 1. https://stats.stackexchange.com/questions/331973/why-is-the-sum-of-two-random-variables-a-convolution 2. https://stats.stackexchange.com/questions/106410/pdf-of-a-sum-using-convolution 3. https://stats.stackexchange.com/questions/115198/sum-of-random-variables – Glen_b Feb 10 '19 at 06:06
    If after reading the links (the last link in my first comment has numerous examples) you have a specific question, please post a new question. – Glen_b Feb 10 '19 at 06:15
    Also note that you can use the fact that $F_{W_{i}}(w_i)=1-P(W_i> w_i)$, and from here, pdfs can be worked out, as well as joint pdfs given independence. – StatsStudent Feb 10 '19 at 06:16