
I am reading "Asymptotic Statistics" by A.W. van der Vaart and I am stuck on an exercise from Chapter 2. Here is the question: for each $n \in \mathbb{N}$, let $U_n$ be uniformly distributed on the unit sphere $S^{n-1} \subseteq \mathbb{R}^n$. Show that the random vectors $\sqrt{n}(U_{n,1},U_{n,2})$ converge in distribution to a pair of independent standard normal variables.

Maybe the solution is very simple, but I don't know where to start. Could you provide me with a hint?

Also, I am sorry if the solution is already available on the internet; I couldn't find it.

  • Are $U_{n,1}$ and $U_{n,2}$ the first and second coordinates of $U_n$? – fblundun May 08 '21 at 20:55
  • @fblundun yes exactly. Sorry for the imprecision – Akurishen May 09 '21 at 06:14
  • Although I wouldn't recommend it -- I like solutions that require the least work possible because they tend to be the most insightful -- it is possible to obtain an explicit formula for the distribution of $(U_{n,1},U_{n,2}):$ see https://stats.stackexchange.com/a/520811/919. You can then obtain the limit easily. – whuber May 11 '21 at 17:18
  • @whuber oh thanks for sharing, appreciate it. I was thinking about polar coordinates but couldn't write the argument. Thanks a lot for your answer. – Akurishen May 11 '21 at 18:24

2 Answers


This answer is essentially the same as @Thomas Lumley's, but hopefully adds more clarity by explicitly justifying some key steps.

Let $X_n = (X_{n, 1}, \ldots, X_{n, n}) \sim N_n(0, I_{(n)})$ (i.e., $X_{n, 1}, \ldots, X_{n, n} \text{ i.i.d.} \sim N(0, 1)$). It then follows from a property of spherical distributions (see, e.g., Theorem 1.5.6 in Aspects of Multivariate Statistical Theory by Robb J. Muirhead) that $X_n/\|X_n\| \sim \text{Uniform}(S^{n - 1})$, hence \begin{align} \sqrt{n}(U_{n, 1}, U_{n, 2}) \overset{d}{=} \frac{\sqrt{n}}{\|X_n\|}(X_{n, 1}, X_{n, 2}) = \frac{1}{\sqrt{\frac{X_{n, 1}^2 + \cdots + X_{n, n}^2}{n}}}(X_{n, 1}, X_{n, 2}). \tag{1} \end{align}

By the weak law of large numbers, $\frac{X_{n, 1}^2 + \cdots + X_{n, n}^2}{n}$ converges to $E[Z^2] = 1$ in probability, where $Z \sim N(0, 1)$, whence $\frac{1}{\sqrt{\frac{X_{n, 1}^2 + \cdots + X_{n, n}^2}{n}}}$ converges to $1$ in probability by the continuous mapping theorem. On the other hand, $(X_{n, 1}, X_{n, 2}) \sim N_2(0, I_{(2)})$ for every $n$. It thus follows by Slutsky's theorem that
\begin{align} \frac{1}{\sqrt{\frac{X_{n, 1}^2 + \cdots + X_{n, n}^2}{n}}}(X_{n, 1}, X_{n, 2}) \to_d N_2(0, I_{(2)}). \tag{2} \end{align}

Combining $(1)$ and $(2)$ gives $\sqrt{n}(U_{n, 1}, U_{n, 2}) \to_d N_2(0, I_{(2)})$.
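
As an informal numerical check (not part of the original argument), the sketch below uses the same representation $U_n \overset{d}{=} X_n/\|X_n\|$ to simulate $\sqrt{n}(U_{n,1}, U_{n,2})$; NumPy and the helper name `sample_scaled_coords` are my own choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_scaled_coords(n, reps=50_000):
    """Draw `reps` copies of sqrt(n) * (U_{n,1}, U_{n,2}) via the
    representation U_n = X_n / ||X_n|| with X_n ~ N_n(0, I)."""
    x = rng.standard_normal((reps, n))
    u = x / np.linalg.norm(x, axis=1, keepdims=True)
    return np.sqrt(n) * u[:, :2]

for n in (5, 50, 500):
    s = sample_scaled_coords(n)
    # In the N_2(0, I_{(2)}) limit: means ~ 0, variances ~ 1, correlation ~ 0.
    print(n, s.mean(axis=0).round(3), s.var(axis=0).round(3),
          round(float(np.corrcoef(s.T)[0, 1]), 3))
```

For $n = 500$ the sample moments should already be very close to those of $N_2(0, I_{(2)})$.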

Zhanxiong

In outline: one approach is to think of generating $U_n$ by generating $n$ iid standard Normals $Z_{n,1},\ldots,Z_{n,n}$ and defining $$U_{n,i}=\frac{Z_{n,i}}{\sqrt{\sum_j Z_{n,j}^2}}$$

As $n\to\infty$, the denominator divided by $\sqrt{n}$ converges in probability to $1$ (e.g., by Chebyshev's inequality applied to $\frac{1}{n}\sum_j Z_{n,j}^2$), so the denominator can be treated as the constant $\sqrt{n}$. Rescaling any finite set of the $U_{n,i}$ by $\sqrt{n}$ therefore asymptotically gives independent standard Gaussians, namely just the corresponding $Z_{n,i}$.
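
A minimal simulation of this step (my own illustration, not part of the answer): the denominator divided by $\sqrt{n}$ should concentrate around $1$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)

# Check that sqrt(sum_j Z_{n,j}^2) / sqrt(n) concentrates around 1,
# so dividing by the denominator barely perturbs Z_{n,1}, ..., Z_{n,k}.
for n in (10, 100, 1000):
    z = rng.standard_normal((20_000, n))
    ratio = np.linalg.norm(z, axis=1) / np.sqrt(n)
    print(n, round(float(ratio.mean()), 4), round(float(ratio.std()), 4))
```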

Update: the result is fairly straightforward, but the implications are non-intuitive. Since $U_{n,1} = O_p(n^{-1/2})$ for $U_n$ uniformly distributed on $S^{n-1}$, nearly all of the surface area of $S^{n-1}$ lies within $O(n^{-1/2})$ of the equator for large $n$ (!!).
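
To put a number on that (again just an illustrative simulation, not from the original answer): by the result above, the fraction of the sphere with $|U_{n,1}| \le 2/\sqrt{n}$ should approach $P(|Z| \le 2) \approx 0.954$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Estimate P(|U_{n,1}| <= 2 / sqrt(n)) for U_n uniform on S^{n-1};
# it should approach P(|Z| <= 2) ~ 0.954 as n grows.
for n in (10, 100, 1000):
    x = rng.standard_normal((20_000, n))
    u1 = x[:, 0] / np.linalg.norm(x, axis=1)
    print(n, round(float(np.mean(np.abs(u1) <= 2 / np.sqrt(n))), 3))
```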

Thomas Lumley