
Let $X_1, X_2, \ldots $ be an infinite sequence of i.i.d. random variables with $E(X_i)=\mu$ and $\mbox{Var}(X_i) < \infty$.

The law of large numbers states $\lim_{n \rightarrow \infty} \sum_{i=1}^{n} \frac{X_i}{n} = \mu$.

Wikipedia mentions that "other formulas that look similar are not verified".

In other words, is $\lim_{n \rightarrow \infty} \sum_{i=1}^{n} X_i = n\mu$ not correct?

Is there a counter-example or an obvious reason why this is not the case?
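
A quick numerical check makes the contrast concrete; the sketch below assumes, purely for illustration, that $X_i \sim \mathcal N(1,1)$, so $\mu = 1$ (any distribution with a finite mean would do):

```python
import numpy as np

rng = np.random.default_rng(0)

mu = 1.0                                        # true mean (illustrative choice)
x = rng.normal(loc=mu, scale=1.0, size=10**6)   # i.i.d. N(1, 1) draws

n = np.arange(1, x.size + 1)
partial_sums = np.cumsum(x)

running_mean = partial_sums / n        # (1/n) * sum_{i<=n} X_i  -> mu (law of large numbers)
centered_sum = partial_sums - n * mu   # sum_{i<=n} X_i - n*mu   : does NOT tend to 0

for k in (10**2, 10**4, 10**6):
    print(k, running_mean[k - 1], centered_sum[k - 1])
```

Typically the running mean settles near $1$, while $\sum_{i=1}^n X_i - n\mu$ keeps fluctuating on a scale of roughly $\sqrt{n}$; in particular $\sum_{i=1}^n X_i$ itself does not converge to anything.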

  • Note also that the Law of Large Numbers does not require the existence of the variance of the $X_i$'s. – Xi'an Oct 13 '22 at 06:49
  • A warning that arises every time in the classroom: $$\lim_{n \rightarrow \infty} \sum_{i=1}^{n} X_i = n\mu$$ does not make sense (mathematically), since the lhs, being a limit, does not depend on $n$, while the rhs does. – Xi'an Oct 13 '22 at 06:50
  • Thank you @Xi'an. So if $E(X_i)=\mu/n$ where $i=1,2,\ldots,n$, then I assume $\lim_{n \rightarrow \infty} \sum_{i=1}^n X_i =\mu$ would make sense? – GCru Oct 13 '22 at 07:17
  • No, it does not make sense either: $\mathbb E(X_i)=\mu/n$ has the lhs depending on $i$ and the rhs depending on $n$. – Xi'an Oct 13 '22 at 07:23
  • Thanks again @Xi'an. I think I understand it better now. It also cannot be true, because with $E(X_i)= \mu/n$ and $\mbox{Var}(X_i)>0$ we have $E (\sum_{i=1}^n X_i) = \mu$, but $\mbox{Var}(\sum_{i=1}^n X_i)$ does not approach zero as $n$ increases. – GCru Oct 13 '22 at 10:05
  • Sorry, but the fundamental reason is that $\mathbb E(X_i)=\mu/n$ does not make sense mathematically. The random variable $X_i$, for one given $i\in\mathbb N^*$, does not depend on $n$. – Xi'an Oct 13 '22 at 10:12
  • Thanks again @Xi'an, your comments helped a lot. I am modelling an actual situation with sets containing different numbers of random variables, where the expected value of the random variables in a particular set is $\mu/n$, with $n$ the cardinal number of the set. I was attempting to link this model to the law of large numbers, but this is obviously not possible. – GCru Oct 13 '22 at 13:20
  • In that case, you need to rephrase the question differently. It sounds like you are dealing with the law of large numbers for triangular arrays. – Xi'an Oct 14 '22 at 14:10
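
To make the variance argument in the comments above precise, one can switch to triangular-array notation $X_{n,1},\ldots,X_{n,n}$ (a relabelling not used in the comments themselves): assuming the variables in each row are independent with $\mathbb E(X_{n,i})=\mu/n$ and $\operatorname{Var}(X_{n,i})=\sigma^2>0$,

$$\mathbb E\Big(\sum_{i=1}^n X_{n,i}\Big)=\mu \qquad\text{while}\qquad \operatorname{Var}\Big(\sum_{i=1}^n X_{n,i}\Big)=n\sigma^2\xrightarrow[n\to\infty]{}\infty,$$

so the row sums are centred at $\mu$ but spread out more and more; in particular they do not converge to $\mu$ in mean square.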

1 Answer


What do LLNs say?

At the risk of oversimplifying: for an independent sequence $\langle X_i\rangle_{i\in\mathbb N}$, they say that the sample average

$$\frac{1}{n}(X_1+X_2+\ldots+ X_n) \tag 1$$

converges to the common mean $\mu$ in a suitable sense, provided the second (or fourth) moments are finite and uniformly bounded, depending on which mode of convergence one is interested in.

When the sequence is moreover $\rm i.i.d.$, only the first moment needs to be finite.
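
For reference, two standard statements with the assumptions spelled out (in a common textbook form; see the reference listed at the end for precise versions):

$$\frac1n\sum_{i=1}^n X_i \;\xrightarrow{\ \mathbb P\ }\; \mu \qquad \text{(weak LLN: independent } X_i,\ \mathbb E(X_i)=\mu,\ \operatorname{Var}(X_i)\le C<\infty),$$

$$\frac1n\sum_{i=1}^n X_i \;\xrightarrow{\ \mathrm{a.s.}\ }\; \mu \qquad \text{(strong LLN: i.i.d. } X_i,\ \mathbb E|X_1|<\infty,\ \mathbb E(X_1)=\mu).$$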

What is Wikipedia saying?

It is only saying that a formula like

$$\sum_{i=1}^n X_i - n\mu$$

is not covered by the law: not only does it not converge toward zero as $n$ increases, but it tends to increase in absolute value as $n$ increases.
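
Assuming, as in the question, that $0<\operatorname{Var}(X_i)=\sigma^2<\infty$, the central limit theorem quantifies this growth:

$$\frac{\sum_{i=1}^n X_i - n\mu}{\sigma\sqrt n}\;\xrightarrow{\ d\ }\;\mathcal N(0,1),$$

so the difference $\sum_{i=1}^n X_i - n\mu$ is typically of order $\sqrt n$ and does not settle down to any constant.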

Finally, as Xi'an noted in the comments, be careful to keep the summation index and the limiting variable distinct.


Further Reading:

$\rm [I]$ Jeffrey S. Rosenthal, A First Look at Rigorous Probability Theory, World Scientific Publishing, $2006$, sections $5.3$–$5.4$, pp. $60$–$64$.
