6

I am looking for an example of a sequence of r.v. $X_n$ that converges to $X$ in $L_1$, but such that $X_n^2$ does not converge to $X^2$ in $L_1$.

Does anyone have something in mind?

Tim
  • I’m not sure why this has to be about random variables; it sounds like a real analysis question. There’s an answer on our sister site: https://math.stackexchange.com/questions/811765/counterexamples-for-l1-convergence-not-implying-l2-convergence-and-vice-versa – Arya McCarthy May 31 '21 at 14:54
  • @AryaMcCarthy the relation between real analysis and probability is not evident to most. – Manuel May 31 '21 at 15:22

2 Answers

10

Let $X_n \sim \mathrm{Be}(n^{-1})$, that is, a Bernoulli random variable with success probability $n^{-1}$, and consider $Y_n = \sqrt{n}\,X_n$.

It is straightforward that $E(|Y_n-0|) = \sqrt n \cdot n^{-1} = n^{-1/2} \to 0$. Hence $Y_n \overset{L_1}{\to} 0$.

Since $E(|Y_n^2 - 0^2|) = n \cdot n^{-1} = 1$ for every $n$, you get that $Y_n^2 \not\overset{L_1}{\to} 0^2$.
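As a quick numerical sanity check (not part of the proof), here is a Monte Carlo sketch of this counterexample: the empirical $L_1$ distance of $Y_n$ to $0$ shrinks like $n^{-1/2}$ while the empirical second moment stays pinned at $1$.

```python
import numpy as np

# Monte Carlo sketch of the counterexample:
# Y_n = sqrt(n) * X_n with X_n ~ Bernoulli(1/n), so E|Y_n| = n^{-1/2} -> 0
# while E[Y_n^2] = 1 for every n.
rng = np.random.default_rng(0)

def estimate_moments(n, samples=1_000_000):
    x = rng.random(samples) < 1.0 / n        # Bernoulli(1/n) draws
    y = np.sqrt(n) * x
    return y.mean(), (y ** 2).mean()         # estimates of E|Y_n|, E[Y_n^2]

for n in (10, 100, 1000):
    m1, m2 = estimate_moments(n)
    print(f"n={n}: E|Y_n| ~ {m1:.3f}, E[Y_n^2] ~ {m2:.3f}")
```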



As a side note, this example shows that $L_1$ convergence is not preserved by continuous transformations, i.e. if $g : \mathbb R \to \mathbb R$ is a continuous function then $$ X_n \overset{L_1}{\to} X \quad \not \Rightarrow \quad g(X_n) \overset{L_1}{\to} g(X)$$

Convergence in distribution, convergence in probability, and almost sure convergence are all preserved by continuous transformations (the continuous mapping theorem).

Manuel
  • Could you also give $f_n = n^{-1}1_{[0,n]}$? That way it converges to $||f_n||_{1}=1$ but when you take the square it does not converge at all. – Ariel May 31 '21 at 15:32
  • @Ariel it is not clear to me how is $f_n$ defined as a random variable. – Manuel May 31 '21 at 15:59
  • Since we just need it to be a $\mathcal{F}$-measurable function with some work I think we could think of it as r.v.s for given $n$s. Mostly though I just wanted to check my intuition. I think your earlier comment is correct though and $f_n$ does not work because it does not converge in L1! (I should have drawn a picture :) ) – Ariel May 31 '21 at 16:31
  • Thank you! this is what I was looking for – Rayan Mezher Jun 01 '21 at 07:50
  • @RayanMezher you can accept the answer if you see fit. – Manuel Jun 01 '21 at 18:23
4

In the answer to the linked question over on Math.SE, and in a comment on this page, it is suggested to take $$ f_n = n^{-1} \mathbf 1_{[0,n]}.$$ This does not work, for two reasons: it solves the converse problem ($L^2$ convergence without $L^1$ convergence), and it lives on a space that is not a probability space ($\mathbb R$ with Lebesgue measure).

Note that $\|f_n\|_{L^1} \equiv 1$, so the $L^1$ norms converge; but this is not convergence in $L^1$ norm. The almost everywhere limit is $f = 0$, and if $f_n$ converged in $L^1$ it would have to converge to that a.e. limit, i.e. $\|f_n - 0\|_{L^1} \to 0$, which fails. The $L^2$ norm, however, does converge to $0$, proving convergence in $L^2$ norm to zero.
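To make the two norms concrete, here is a small numerical sketch (an assumption: a midpoint Riemann sum on a fine enough grid stands in for the Lebesgue integral). The $L^1$ distance of $f_n$ to $0$ stays at $1$ for every $n$, while the squared $L^2$ norm is $1/n$.

```python
import numpy as np

# Norms of f_n = (1/n) * 1_{[0,n]} on R with Lebesgue measure,
# via midpoint Riemann sums over the support [0, n]:
# ||f_n - 0||_{L1} = 1 for every n, while ||f_n||_{L2}^2 = 1/n -> 0.
def midpoint_integral(g, a, b, k=100_000):
    x = a + (np.arange(k) + 0.5) * (b - a) / k
    return g(x).sum() * (b - a) / k

for n in (10, 100, 1000):
    f = lambda x, n=n: np.full_like(x, 1.0 / n)          # f_n on [0, n]
    l1 = midpoint_integral(f, 0, n)                      # -> 1, constant in n
    l2_sq = midpoint_integral(lambda x: f(x) ** 2, 0, n) # -> 1/n
    print(f"n={n}: L1 = {l1:.4f}, L2^2 = {l2_sq:.6f}")
```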

(In fact there is a slight discrepancy in the question: the squared $L^2$ distance can be written $\|f_n - f\|^2_{L^2} = \|(f_n-f)^2\|_{L^1}$, whereas convergence of the squares in $L^1$ means $\|f_n^2 - f^2\|_{L^1} \to 0$. Thankfully this nonlinearity issue disappears when $f$ is zero.)

A correct example was given in the Math.SE question's body: $\sqrt n\,\mathbf 1_{[0,1/n]}$ (with the uniform probability on $[0,1]$). Another, perhaps more trivial (and perhaps not in the spirit of the question) example can be given via constant (in $n$) sequences, simply because there are functions in $L^1$ whose squares are not in $L^1$, so the question of their convergence is not even meaningful. Any random variable with a finite mean but no finite second moment will do; an example using the same uniform probability on $[0,1]$ is $x \mapsto 1/\sqrt x$.
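The correct example $g_n = \sqrt n\,\mathbf 1_{[0,1/n]}$ can be checked numerically as well. Here expectations under the uniform probability on $[0,1]$ are taken as midpoint averages over a grid (an assumption: the grid is fine enough to resolve the interval $[0,1/n]$); $E|g_n| = n^{-1/2} \to 0$ while $E[g_n^2] = 1$ for every $n$.

```python
import numpy as np

# g_n = sqrt(n) * 1_{[0,1/n]} under the uniform probability on [0,1]:
# E|g_n| = n^{-1/2} -> 0 (L1 convergence to 0), but E[g_n^2] = 1 for all n.
def expectation(g, k=1_000_000):
    u = (np.arange(k) + 0.5) / k      # grid standing in for U ~ Uniform[0,1]
    return g(u).mean()

for n in (4, 100, 10_000):
    g = lambda u, n=n: np.sqrt(n) * (u <= 1.0 / n)
    print(f"n={n}: E|g_n| = {expectation(g):.4f}, "
          f"E[g_n^2] = {expectation(lambda u: g(u) ** 2):.4f}")
```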

PS One can prove partial results. For example, if $f,f_n$ are almost surely bounded uniformly in $n$, say $\|f\|_{L^\infty}, \|f_n\|_{L^\infty} \le M$, then observe $$\|f_n^2 - f^2\|_{L^1} = \|(f_n - f)(f_n+f)\|_{L^1}\le 2M\| f_n - f\|_{L^1} \to 0.$$ This is of course consistent with Manuel's example, as his $Y_n$ is not a.s. bounded uniformly in $n$ ($\|Y_n\|_{L^\infty} = \sqrt n$).
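The partial result can be illustrated with a hypothetical bounded sequence (my choice, not from the question): $h_n = \mathbf 1_{[0,1/n]}$ on $[0,1]$ satisfies $\|h_n\|_{L^\infty} \le 1 = M$, and indeed both $\|h_n - 0\|_{L^1}$ and $\|h_n^2 - 0\|_{L^1}$ vanish, as the inequality predicts.

```python
import numpy as np

# Hypothetical bounded example h_n = 1_{[0,1/n]} on [0,1], |h_n| <= M = 1:
# here E|h_n| = E[h_n^2] = 1/n, so squaring preserves the L1 convergence,
# consistent with ||h_n^2 - 0||_{L1} <= 2M ||h_n - 0||_{L1}.
def expectation(g, k=1_000_000):
    u = (np.arange(k) + 0.5) / k      # grid standing in for Uniform[0,1]
    return g(u).mean()

for n in (10, 100, 1000):
    h = lambda u, n=n: (u <= 1.0 / n).astype(float)
    print(f"n={n}: E|h_n| = {expectation(h):.4f}, "
          f"E[h_n^2] = {expectation(lambda u: h(u) ** 2):.4f}")
```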