Suppose there are two CDFs $F$ and $G$ whose common support is $[0,1]$, with pdfs $f$ and $g$, respectively, and suppose distances are measured in squared differences. What kind of condition would guarantee the following inequality? $$\int^1_0(x-y)^2f(x)\,dx\leq\int^1_0(x-y)^2g(x)\,dx,\qquad\forall y\in[0,1].$$
I don't think that there can be a general solution to this question, because it is much too broad. Nevertheless, let us consider which distribution would maximize the integral:
- For $y\lt 0.5$ the $\delta$-distribution $\delta(1)$ (a point mass at $x=1$) maximizes the integral.
- For $y= 0.5$ both $\delta$-distributions $\delta(0)$ and $\delta(1)$ maximize the integral.
- For $y \gt 0.5$ the $\delta$-distribution $\delta(0)$ (a point mass at $x=0$) maximizes the integral, as the short check below shows.
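To spell out the check behind these bullet points: a point mass concentrates the whole integral at a single squared distance, so the mass should sit at whichever endpoint is farther from $y$,
$$\int_0^1 (x-y)^2\,\delta(x-1)\,dx=(1-y)^2 \quad\text{and}\quad \int_0^1 (x-y)^2\,\delta(x)\,dx=y^2,$$
and $(1-y)^2\ge y^2$ exactly when $y\le 0.5$, with equality at $y=0.5$.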
This argument shows that a skewed distribution will "in principle" yield larger integrals than a symmetric distribution (not a mathematical statement). However, this is only true for "half" of the $y$-interval. Hence, we should demand that both functions $f(x)$ and $g(x)$ be symmetric with respect to $x=0.5$.
The above argument also shows that $g(x)$ should have two maxima, which lie at $x=0$ and $x=1$, while $f(x)$ should have only one maximum, at $x=0.5$. Furthermore,
- I would expect that if $f(x)$ and $g(x)$ have the same "functional form", then the statement is true. By "functional form" I mean that if $f(x) = \mathrm{Norm}(0.5, \sigma^2)$, then $$g(x) = \begin{cases} \mathrm{Norm}(0,\sigma^2) & \text{if } x\le 0.5, \\ \mathrm{Norm}(1,\sigma^2) & \text{if } x\gt 0.5. \end{cases}$$ Of course the normal distribution does not integrate to one on $x\in [0,1]$, so we would need to include a normalization factor.
- Furthermore, I would expect that $g(x)$ "likes" to have a small variance. Hence, we obtain $\delta$-distributions in the limit $\sigma^2 \to 0$.
I have not performed any calculations to verify this intuition.
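A minimal numerical sketch of such a check, assuming the Gaussian shapes above renormalized to $[0,1]$; the value $\sigma=0.15$ and the grid sizes are arbitrary illustrative choices, not fixed by the argument:

```python
import numpy as np

sigma = 0.15                               # illustrative spread
x = np.linspace(0.0, 1.0, 2001)            # integration grid on [0, 1]

def integrate(vals, grid):
    """Composite trapezoidal rule on an evenly spaced grid."""
    dx = grid[1] - grid[0]
    return dx * (vals.sum() - 0.5 * (vals[0] + vals[-1]))

# f: unimodal Norm(0.5, sigma^2) shape, renormalized on [0, 1]
f_raw = np.exp(-(x - 0.5) ** 2 / (2 * sigma ** 2))
f = f_raw / integrate(f_raw, x)

# g: piecewise Norm(0, sigma^2) for x <= 0.5 and Norm(1, sigma^2) for x > 0.5,
# again renormalized so it integrates to one on [0, 1]
g_raw = np.where(x <= 0.5,
                 np.exp(-x ** 2 / (2 * sigma ** 2)),
                 np.exp(-(x - 1.0) ** 2 / (2 * sigma ** 2)))
g = g_raw / integrate(g_raw, x)

# evaluate both sides of the inequality on a grid of y values
ys = np.linspace(0.0, 1.0, 101)
lhs = np.array([integrate((x - y) ** 2 * f, x) for y in ys])
rhs = np.array([integrate((x - y) ** 2 * g, x) for y in ys])
print("inequality holds for every y on the grid:", bool(np.all(lhs <= rhs + 1e-12)))
```

By the symmetry both densities have mean $0.5$, so the comparison effectively reduces to comparing second moments; since $g$ puts its mass near the endpoints, its variance is larger, and varying $\sigma$ or the grid resolution should not change the outcome.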
Semoi