
[Image: statement of the theorem — if $Y \sim N_n(\mu, \Sigma)$, then $(Y-\mu)'\Sigma^{-1}(Y-\mu) \sim \chi^{2}_{n}$.]

I am wondering what is the implication of the above relation/theorem. I know how to prove this by "sphering $Y$", but I am failing to get an intuitive understanding of the theorem. What does it mean for $(Y-\mu)'\Sigma^{-1}(Y-\mu)$ to be distributed as $\chi^{2}_{n}$? What is the implication?

hmi2015
  • For a univariate variable you know that $\frac{y-\mu}{\sigma} \sim N(0,1)$ and that the square of a standard normal is $\chi^2(1)$.

    Now you can extend this to the multivariate case; remember that $(Y-\mu)'\Sigma^{-1}(Y-\mu)$ is a quadratic form.

    – Deep North Nov 03 '15 at 05:41
  • So, my question is, what's the intuitive implication of the quadratic form $(Y-\mu)'\Sigma^{-1}(Y-\mu)$ having the same distribution as $\chi^{2}_{n}$? – hmi2015 Nov 03 '15 at 08:20
  • What do you mean by "intuitive implication"? In some settings, it is a useful result. Are you asking for examples of its uses? – user603 Nov 03 '15 at 09:49
  • The final picture in my explanation of Mahalanobis distance illustrates the sense in which $(Y-\mu)^\prime\Sigma^{-1}(Y-\mu)$ is a sum of squares of standard Normal variables and therefore has a $\chi^2_n$ distribution (by definition). Maybe that's sufficiently intuitive? – whuber Jul 12 '22 at 15:03
  • Would you mind telling me where you have seen this distribution? I want to cite it, thanks. – julia Dec 01 '22 at 12:58
  • @julia Linear Models Theory texts are a good thing to check out for this, such as Graybill's (I believe his quad form results are in Chapter 2? Don't have it on me though). – John Madden Jan 06 '23 at 21:04

2 Answers

1

If by "intuitive understanding" you mean how you can see this result instantly in your mind's eye:

  • subtracting $\mu$ centers $Y$ at zero mean

  • rewrite the quadratic form as $ \left[\Sigma^{-1/2}(Y-\mu)\right]^T \left[\Sigma^{-1/2}(Y-\mu)\right]$; calculate the covariance matrix of the bracketed term and you will find it is the identity matrix.

  • This shows that the quadratic form has the same distribution as the sum of squares of $n$ iid standard normal random variables, which is $\chi^2_n$ by definition (see the simulation sketch just after this list).
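
Here is a short simulation sketch of that argument (my addition, not part of the original answer); the particular $\mu$ and $\Sigma$ are arbitrary illustrative choices. It whitens draws from $N_3(\mu, \Sigma)$ with the symmetric inverse square root $\Sigma^{-1/2}$, checks that the whitened vector has identity covariance, and compares the quadratic form with $\chi^2_3$.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 3                                    # dimension, so the target is chi^2_3
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])      # any positive-definite matrix works

Y = rng.multivariate_normal(mu, Sigma, size=100_000)

# Symmetric inverse square root Sigma^{-1/2} via eigendecomposition
vals, vecs = np.linalg.eigh(Sigma)
Sigma_inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T

# Whitened ("sphered") variable Z = Sigma^{-1/2}(Y - mu): identity covariance
Z = (Y - mu) @ Sigma_inv_sqrt.T
print(np.round(np.cov(Z, rowvar=False), 2))    # approximately the identity matrix

# The quadratic form equals |Z|^2, a sum of n squared iid standard normals
q = np.sum(Z**2, axis=1)
print(stats.kstest(q, stats.chi2(df=n).cdf))   # KS statistic should be tiny: matches chi^2_3
```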

It is not clear what you mean by asking

> I am wondering what is the implication of the above relation/theorem

The only implication is that you now know the distribution of the quadratic form, and that might be useful.
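
As one concrete illustration of "might be useful" (an addition of mine in the spirit of whuber's Mahalanobis-distance comment, not something stated in this answer): because the quadratic form is exactly $\chi^2_n$, a $\chi^2_n$ quantile gives an exact probability ellipsoid, i.e. an outlier cutoff for the squared Mahalanobis distance when $\mu$ and $\Sigma$ are known. The values below are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 2
mu = np.zeros(n)
Sigma = np.array([[1.0, 0.8],
                  [0.8, 2.0]])           # arbitrary positive-definite example

# 95% quantile of chi^2_2: squared Mahalanobis distances exceed it only 5% of the time
cutoff = stats.chi2(df=n).ppf(0.95)

Y = rng.multivariate_normal(mu, Sigma, size=50_000)
d2 = np.einsum('ij,jk,ik->i', Y - mu, np.linalg.inv(Sigma), Y - mu)
print(np.mean(d2 <= cutoff))             # close to 0.95, as the theorem predicts
```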

1

What it says to me is that every chi-squared distribution can be thought of as describing the variance of a symmetric multivariate normal distribution (smnd), which might be a very easy way to visualize the kind of question the chi-squared distribution is asking. This relates to https://en.wikipedia.org/wiki/Multivariate_normal_distribution#Interval, https://en.wikipedia.org/wiki/Chi-squared_distribution#History, and the last two sentences just above https://en.wikipedia.org/wiki/Chi-squared_distribution#Probability_density_function.

So a chi-squared test is asking how well your data correspond to a multivariate normal distribution of the appropriate dimensionality (which you specify via the degrees of freedom); this measures whether your data are likely to have arisen randomly under multinomial assumptions. Agree?

  • Do you really mean "multinomial" and not "multinormal"? This is a common typographical error. Not all chi-squared tests are tests of multinomial distributions and in such cases the chi-squared distribution is just an approximation, anyway, suggesting the reference (if it is intended) is at best tangential to the current question. – whuber Aug 30 '22 at 15:16
  • I say multinomial based on the link to Wikipedia: "...Pearson showed that the chi-squared distribution arose from such a multivariate normal approximation to the multinomial distribution, taking careful account of the statistical dependence..." I don't grok middle terms as well as geometrical visualizations, so I may have made an error. – Mark Dranias Aug 30 '22 at 15:22