
I am computing the Frobenius norm of the difference between two covariance matrices, \begin{align} \|\mathbf{C}-\mathbf{C}'\|_F=\sqrt{\sum_{i,j}\left(c_{ij}-c'_{ij}\right)^2}. \end{align}
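For concreteness, here is a minimal sketch of this computation in NumPy (the two matrices are arbitrary placeholders, not from the problem); `np.linalg.norm(..., ord='fro')` and the explicit element-wise sum agree:

```python
import numpy as np

# Hypothetical covariance matrices for illustration only
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
C = A @ A.T                      # a valid (PSD) covariance matrix
B = rng.standard_normal((3, 3))
Cp = B @ B.T

# Frobenius norm of the difference, two equivalent ways
d_builtin = np.linalg.norm(C - Cp, ord='fro')
d_manual = np.sqrt(((C - Cp) ** 2).sum())
print(np.isclose(d_builtin, d_manual))
```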

Each of these covariance matrices is a sample estimate constructed from a finite number $N_s$ of samples of a multivariate jointly Gaussian random vector,

\begin{align} \mathbf{C} = \frac{1}{N_s-1}\sum_{k=1}^{N_s}\mathbf{x}_k\mathbf{x}_k^\top \\ \mathbf{C}' = \frac{1}{N_s-1}\sum_{k=1}^{N_s}\mathbf{x}'_k\mathbf{x}'^\top_k \end{align}

Because these covariances are estimates, they will have errors about the true values. I'm trying to propagate this uncertainty through to the Frobenius distance. In synthetic experiments (repeating the estimate with fresh draws of the random vectors), the resulting distributions of the distance look chi-squared, but I'm not sure that makes sense in this context. Is there a closed-form expression for the PDF of this norm (or its square)?
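The synthetic experiment can be sketched as follows; this is a Monte Carlo draft, not my exact setup, and the dimension `p`, sample count `N_s`, trial count, and identity true covariance are placeholder assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
p, N_s, n_trials = 4, 50, 2000        # assumed dimension, sample count, trials
Sigma = np.eye(p)                     # assumed common true covariance

dists = np.empty(n_trials)
for t in range(n_trials):
    # Two independent sample sets from the same Gaussian
    X = rng.multivariate_normal(np.zeros(p), Sigma, size=N_s)
    Xp = rng.multivariate_normal(np.zeros(p), Sigma, size=N_s)
    # Zero-mean sample covariances, as in the formulas above
    C = X.T @ X / (N_s - 1)
    Cp = Xp.T @ Xp / (N_s - 1)
    dists[t] = np.linalg.norm(C - Cp, ord='fro')

# Empirical summaries of the Frobenius distance's distribution
print(dists.mean(), dists.std())
```

Histogramming `dists` produces the right-skewed, chi-squared-looking shape described above.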

gKhagb
  • The distribution should be closer to that of the root of the fourth power of a Normal distribution. This norm is a little unnatural in the context, because variances are better compared using ratios. For both these reasons I (strongly) suspect there will not be any nice analytic ("closed form") expression for the distribution of this norm. – whuber Jul 22 '22 at 16:22
  • Thanks @whuber. But am I right to think that the difference of covariances is a matrix of chi-squared variables, so that the norm is the square root of the sum of squared chi-square variables, similar to this question? – gKhagb Jul 22 '22 at 18:48
  • That question has the right flavor. But when you subtract one chi-squared variable from another (not independent!) chi-squared variable and square the result, you get something sort of like a square of a chi-squared variable. Upon taking the square root we would expect the result to be qualitatively like a chi-squared variable, but not exactly. Its distribution will be very difficult to express analytically. – whuber Jul 22 '22 at 19:03

0 Answers