3

Consider the Irwin-Hall distribution, which for me means $X_n$ is the sum of $n$ iid uniform random variables on $[-1/2, 1/2]$. I would like an absolute lower bound for $E[|X_n|]$.

By general theory we know that it approaches an explicit constant times $\sqrt{n}$, but I would like a statement of the form: for all $n > N$, we have $E[|X_n|] > c\sqrt{n}$ for explicit $c$ and $N$. I am aware that one way to obtain such a statement is via Hölder's inequality, applied to, say, the first and fourth moments and compared with the second moment.
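For concreteness, here is what that route gives (my own computation, not part of the original question). The centered uniform summands have $E[U^2]=\frac{1}{12}$ and $E[U^4]=\frac{1}{80}$, so $E[X_n^2]=\frac{n}{12}$ and $E[X_n^4]=\frac{n}{80}+\frac{n(n-1)}{48}$, and Lyapunov's interpolation inequality (Hölder between the moments of orders $1$, $2$, $4$) yields

$$E[|X_n|] \ \ge\ \frac{\left(E[X_n^2]\right)^{3/2}}{\left(E[X_n^4]\right)^{1/2}} \ =\ \frac{(n/12)^{3/2}}{\sqrt{n/80+n(n-1)/48}} \ \longrightarrow\ \frac{\sqrt{n}}{6} \quad (n\to\infty),$$

so the constant obtainable by this route is only about $0.167$.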

However, this may be a bit weak, and the $N$ we get may be quite large. Ideally I would want a statement that is true for relatively small $N$ (single-digit would be great), and maybe a better bound if possible. Of course, I can explicitly compute what the ratio is for small $n$ with a program. So one strategy that might work for what I need is a statement that the actual $c$ one gets for a given $n$ gets closer to the limiting value of $c$ after a certain point, and then to check the values by hand for all $n$ up to that point.
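For instance, the small-$n$ values can be computed exactly (a stdlib-only sketch; all helper names here are my own) from the standard Irwin–Hall density $f_n(x)=\frac{1}{(n-1)!}\sum_{k=0}^{\lfloor x\rfloor}(-1)^k\binom{n}{k}(x-k)^{n-1}$ on $[0,n]$, using $E[|X_n|]=2\int_{n/2}^{n}\left(x-\tfrac{n}{2}\right)f_n(x)\,dx$:

```python
# Exact rational values of E[|X_n|], where X_n is a sum of n iid
# Uniform[-1/2, 1/2] variables.  Equivalently, with S_n the standard
# Irwin-Hall sum of n Uniform[0,1] variables (density f_n),
#     E[|X_n|] = E[|S_n - n/2|] = 2 * Int_{n/2}^{n} (x - n/2) f_n(x) dx.
# Stdlib only; all helper names are my own.
from fractions import Fraction
from math import comb, factorial

def poly_add(p, q):
    """Add two coefficient lists (index i = coefficient of x**i)."""
    n = max(len(p), len(q))
    p = p + [Fraction(0)] * (n - len(p))
    q = q + [Fraction(0)] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def poly_mul(p, q):
    """Multiply two coefficient lists."""
    r = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def poly_integrate(p, lo, hi):
    """Integrate the polynomial p over [lo, hi], exactly."""
    antider = [Fraction(0)] + [c / (i + 1) for i, c in enumerate(p)]
    ev = lambda x: sum(c * x**i for i, c in enumerate(antider))
    return ev(hi) - ev(lo)

def shifted_power(k, m):
    """Coefficient list of (x - k)**m."""
    return [Fraction(comb(m, i) * (-k) ** (m - i)) for i in range(m + 1)]

def exact_abs_mean(n):
    """E[|X_n|] as an exact Fraction."""
    half = Fraction(n, 2)
    total = Fraction(0)
    for j in range(n):                  # density piece on [j, j+1]
        if j + 1 <= half:
            continue                    # left of the mean: covered by symmetry
        lo = max(Fraction(j), half)
        # f_n(x) = (1/(n-1)!) * sum_{k=0}^{j} (-1)^k C(n,k) (x-k)^(n-1) here
        f = [Fraction(0)]
        for k in range(j + 1):
            sign = -1 if k % 2 else 1
            f = poly_add(f, [sign * comb(n, k) * c for c in shifted_power(k, n - 1)])
        f = [c / factorial(n - 1) for c in f]
        # integrand: 2 * (x - n/2) * f_n(x)
        total += poly_integrate(poly_mul([-2 * half, Fraction(2)], f),
                                lo, Fraction(j + 1))
    return total

for n in range(1, 7):
    v = exact_abs_mean(n)
    print(n, v, float(v) / n ** 0.5)
```

Running it reproduces $E[|X_1|]=\frac14$ and $E[|X_2|]=\frac13$.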

Edit: to clarify, and in light of Henry's answer, it seems that these $c$-values decrease with $n$, so that the limiting value is indeed an absolute lower bound. So I am really looking for a proof that these $c$-values do indeed decrease.

In some other contexts, results have been proven on monotone convergence toward the central limit theorem, which is the flavor of result I need, but I haven't found anything that directly implies it.

  • 1
    I think the Irwin-Hall distribution is more usually the sum of $n$ iid uniform $[0,1]$ random variables, and to get to your $X_n$ requires subtraction of $\frac n2$, but this location issue is not particularly important here. – Henry Dec 14 '23 at 01:58
  • Using the formula for the PDF given at the end of my post at https://stats.stackexchange.com/a/43075/919 (and doing half the integral and doubling the answer) you can work out the values explicitly as a sum of $\lceil (n+1)/2 \rceil$ terms. The sum doesn't simplify to any recognizable analytic or closed-form function. – whuber Dec 14 '23 at 02:46

1 Answer

6

I am not quite clear what you are asking for.

For $n=1$ you get $\mathbb E[|X_1|]=\frac14=0.25\times \sqrt{1}$.

For $n=2$ you get $\mathbb E[|X_2|]=\frac13\approx 0.2357\times \sqrt{2}$.

The expectation of $X_n$ is $0$, the variance of $X_n$ is $\frac{n}{12}$, and the central limit theorem applies to $X_n$. Since the expectation of a half-normal distribution is $\sqrt{\frac{2}{\pi}}\sigma$, I would expect your limiting value of $c$ to be $\sqrt{\frac{1}{6\pi}}$, and for this to be the lower bound, i.e. for all $n$

$$\mathbb E[|X_n|] \ge \sqrt{\frac{n}{6\pi}} \approx 0.2303 \times \sqrt{n}.$$
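As a quick numerical sanity check (my sketch, not part of the original answer; `c_estimate` is a made-up name), a Monte Carlo estimate of $\mathbb E[|X_n|]/\sqrt{n}$ stays above $\sqrt{1/(6\pi)}\approx 0.23033$ for small $n$:

```python
# Monte Carlo check: estimate E[|X_n|] / sqrt(n) for X_n a sum of n iid
# Uniform[-1/2, 1/2] variables and compare with the conjectured lower
# bound sqrt(1/(6*pi)) ~ 0.23033.
import numpy as np

rng = np.random.default_rng(0)

def c_estimate(n, samples=500_000):
    """Simulate X_n and return the sample mean of |X_n| divided by sqrt(n)."""
    x = rng.uniform(-0.5, 0.5, size=(samples, n)).sum(axis=1)
    return np.abs(x).mean() / np.sqrt(n)

limit = np.sqrt(1 / (6 * np.pi))
for n in (1, 2, 3, 4, 8, 16):
    print(n, round(c_estimate(n), 5), ">", round(limit, 5))
```

With this sample size, the estimates for $n=1,2$ agree with the exact values $0.25$ and $1/(3\sqrt 2)\approx 0.2357$ to about three decimal places.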

Henry
  • 2
    +1. The sequence $1/4,1/3$ continues $\frac{13}{32},\frac{7}{15},\frac{1199}{2304},\frac{239}{420},\frac{113149}{184320},\frac{1487}{2268},\frac{14345663}{20643840},\frac{292223}{399168},\frac{17110600987}{22295347200},\frac{14849671}{18532800},\frac{545242142639}{653996851200},\frac{961780559}{1111968000},$ $\frac{1704615588759647}{1904438830694400},\frac{856088316689}{926269344000},\frac{7836371329207844977}{8227175748599808000},\frac{1103759659545457}{1126343522304000},\frac{16895087931630048788047}{16783438527143608320000},\frac{59954362566895631}{58057888831488000},\ldots$ – whuber Dec 14 '23 at 02:43
  • 1
    I agree with the first couple of additional figures (which I had computed by hand). The sequence whuber gave, divided by $\sqrt{n}$, decreases (each term is smaller than the one before, and all remain above Henry's $\sqrt{\frac{1}{6\pi}}$). Perhaps $E[|X_n|]/\sqrt{n}>E[|X_{n+1}|]/\sqrt{n+1}$ could be proven, which would establish that $0.2303\ldots \times\sqrt{n}$ is a lower bound. – Glen_b Dec 14 '23 at 03:00
  • 1
    For what it's worth, an obvious upper bound is $\sqrt{\frac{n}{12}}\approx 0.2887\sqrt{n}$, but we can see that $0.25\sqrt{n}$ would appear to be sufficient. – Glen_b Dec 14 '23 at 03:14
  • 1
    Thanks, this is almost what I was after, but is there a proof that these $c$-values are decreasing with $n$? I obviously believe it to be true given the empirical data but a proof would be excellent. Or just a reference to any theorems in probability that might be relevant to proving this. – Cyclicduck Dec 14 '23 at 03:37
  • For instance https://www.ams.org/journals/jams/2004-17-04/S0894-0347-04-00459-X/S0894-0347-04-00459-X.pdf seems close, but entropy isn't exactly what I'm looking for. – Cyclicduck Dec 14 '23 at 04:11