Let me start off by saying that I already know all the rigorous formulas, but let me explain why I still feel like something is missing in my understanding. There is no need for an answer going over, e.g., the variance calculation for sums of i.i.d. random variables! That is not what this question is about.
Basically, it comes down to this question in @whuber's amazing answer https://stats.stackexchange.com/a/3904/264044:
> What is special about the sum? Why don't we have central limit theorems for other mathematical combinations of numbers such as their product or their maximum? (It turns out we do, but they are not quite so general nor do they always have such a clean, simple conclusion unless they can be reduced to the CLT.) The sequences of $m_n$ and $s_n$ are not unique but they're almost unique in the sense that eventually they have to approximate the expectation of the sum of $n$ tickets and the standard deviation of the sum, respectively ...

> The standard deviation is one measure of the spread of values, but it is by no means the only one nor is it the most "natural," either historically or for many applications. (Many people would choose something like a median absolute deviation from the median, for instance.)
Why does the SD appear in such an essential way?
More rigorously: let $S_n = X_1 + \dots + X_n$ be a sum of i.i.d. random variables. For example, we may plot the binomial coefficients (coming from sums of i.i.d. Bernoullis/Rademachers) and notice they seem to have the "same shape" --- in fact, more and more nearly the same shape once one rescales to $\tilde S_n:=\frac{S_n-m_n}{s_n}$. Finding $m_n$ is easy: it's a linear shift, and expectation being linear, everything's fine. Let's just assume the $X_i$ are shifted to have mean $0$.
But now we have to figure out which $s_n$ "keeps the shape". Our idea that the shape "approaches some limiting shape" is of course a statement about convergence of distributions, and convergence of distributions means in particular that all moments should converge/stabilize to some value (modulo integrability issues), and more generally that all $\mathbb E[f(\tilde S_n)]$ should converge/stabilize to some value for "nice" $f$.
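For instance, taking $f(x)=|x|$ and Rademacher ($\pm 1$) summands, so that $s_n = \sqrt n$, one can compute $\mathbb E[f(\tilde S_n)]$ exactly and watch it stabilize; a quick sketch (function name mine):

```python
import math

# S_n = sum of n independent +/-1 signs (Rademacher), so s_n = sqrt(n).
# Writing S_n = 2*K - n with K ~ Binomial(n, 1/2), we can compute
# E|S_n / sqrt(n)| exactly from the binomial pmf.
def mean_abs_standardized(n):
    total = sum(abs(2 * k - n) * math.comb(n, k) for k in range(n + 1))
    return total / 2**n / math.sqrt(n)

for n in (10, 100, 1000):
    print(n, mean_abs_standardized(n))
```

The values settle near $\sqrt{2/\pi}\approx 0.7979$, the first absolute moment of a standard normal; any normalization growing faster or slower than $\sqrt n$ would send this quantity to $0$ or $\infty$ instead.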
And somehow, out of all these choices of $f$, we choose $f(x)=x^2$, and due to some algebraic miracle, things work out perfectly and we see that $s_n$ must be (or at least be asymptotic to, as $n\to\infty$) some constant multiple of $\sqrt n$.
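To spell out the "miracle" I mean for $f(x)=x^2$ (this is the standard computation, included only so the question is self-contained): with the $X_i$ mean-zero of variance $\sigma^2$,

$$\mathbb E[S_n^2] = \mathbb E\Big[\Big(\sum_{i=1}^n X_i\Big)^{2}\Big] = \sum_{i=1}^n \mathbb E[X_i^2] + \sum_{i \neq j} \mathbb E[X_i]\,\mathbb E[X_j] = n\sigma^2 + 0,$$

since independence makes every cross term factor and then vanish. Hence $\mathbb E[\tilde S_n^2] = \mathbb E[S_n^2]/s_n^2$ stabilizes exactly when $s_n \asymp \sqrt n$. The "miracle" is that $x \mapsto x^2$ expands $f\big(\sum_i X_i\big)$ into terms whose expectations can be evaluated one by one.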
So basically, from this perspective, we see that $s_n$ is in a sense very fundamental/universal (it is the sequence that makes $\mathbb E[f(\tilde S_n)]$ stabilize for all "nice" $f$), but simply due to some algebraic miracle, it is only for $f(x)=x^2$ that we can compute the correct asymptotics of $s_n$ explicitly. So it almost seems like the "$2$" in $s_n \asymp n^{1/2}$ is an accident, but the universality of $s_n$ tells us, on the other hand, that there is nothing accidental about it --- it was completely inevitable.
Question: Can someone shed some light on a deeper reason why $s_n \asymp n^{1/2}$? Perhaps even if one can do the calculation with a different $f$, and see that $n^{1/2}$ still pops out, that would be insightful.
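For concreteness, here is how far I get with $f(x)=x^4$: expanding the fourth power and using independence plus mean zero, the only surviving terms are the $n$ diagonal ones and the $3n(n-1)$ paired ones, so

$$\mathbb E[S_n^4] = n\,\mathbb E[X^4] + 3n(n-1)\,\sigma^4 \sim 3\sigma^4 n^2,$$

and $\mathbb E[\tilde S_n^4] = \mathbb E[S_n^4]/s_n^4$ has a finite nonzero limit precisely when $s_n \asymp n^{1/2}$. The same exponent pops out, but the leading term is built from the second moment $\sigma^2$ again, so this does not feel like a genuinely independent explanation.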