I am trying to implement Linear Discriminant Analysis (LDA) for face recognition. I have 3 classes, and each class has 10 images. The data matrix for each of classes A, B, and C is 10*500, so each row represents one image.
When I compute the mean of each class (summing the rows and dividing by 10), I get a 1*500 vector. The global mean over all classes is also 1*500.
Within-class scatter matrix Sw: the matrix I get is 10*10.
Between-class scatter matrix Sb: the matrix I get is 1*1.
The next step is to compute Inverse(Sw)*Sb, but the matrix dimensions are completely different. I know I am making a mistake somewhere, but I don't know where.
Could you please help me to solve this problem?
Can you please tell me what the dimensions of these matrices should be?
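For reference, here is a minimal NumPy sketch of the conventional scatter-matrix construction for the setup described above (3 classes, 10 images per class, 500 features per image). The random data and variable names are only placeholders; the point is the resulting shapes, which come out as 500*500 rather than 10*10 or 1*1:

```python
import numpy as np

# Hypothetical data matching the question: 3 classes, 10 images per class,
# 500 features per image, so each class matrix is 10 x 500.
rng = np.random.default_rng(0)
classes = [rng.normal(size=(10, 500)) for _ in range(3)]
X = np.vstack(classes)                            # 30 x 500: all images stacked

class_means = [c.mean(axis=0) for c in classes]   # each has shape (500,), i.e. 1 x 500
global_mean = X.mean(axis=0)                      # shape (500,)

# Within-class scatter: sum over images of outer products of (image - class mean).
# Each outer product is 500 x 500, so Sw is 500 x 500.
Sw = np.zeros((500, 500))
for c, m in zip(classes, class_means):
    d = c - m                                     # 10 x 500
    Sw += d.T @ d                                 # 500 x 500

# Between-class scatter: sum over classes of n_i * outer(class mean - global mean).
Sb = np.zeros((500, 500))
for c, m in zip(classes, class_means):
    d = (m - global_mean).reshape(-1, 1)          # 500 x 1 column vector
    Sb += c.shape[0] * (d @ d.T)                  # 500 x 500

print(Sw.shape, Sb.shape)                         # (500, 500) (500, 500)

# With both matrices 500 x 500, Inverse(Sw)*Sb is well defined.  In practice Sw is
# singular here (30 images, 500 features), so a pseudo-inverse or regularization
# is used instead of a plain inverse.
M = np.linalg.pinv(Sw) @ Sb
print(M.shape)                                    # (500, 500)
```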
B should be the same size as the within-matrix W, because B = T - W, where T is the total scatter matrix (the scatter matrix for the whole sample). – ttnphns Apr 01 '14 at 21:09
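A quick numerical check of the identity stated in the comment, B = T - W, under the same assumed setup as the sketch above:

```python
import numpy as np

# Same hypothetical setup as before: 3 classes, 10 images each, 500 features.
rng = np.random.default_rng(0)
classes = [rng.normal(size=(10, 500)) for _ in range(3)]
X = np.vstack(classes)
global_mean = X.mean(axis=0)

# Total scatter T: deviations of every image from the global mean.
D = X - global_mean
T = D.T @ D                                   # 500 x 500

# Within-class scatter W and between-class scatter B, built as in the sketch above.
W = sum((c - c.mean(axis=0)).T @ (c - c.mean(axis=0)) for c in classes)
B = sum(c.shape[0] * np.outer(c.mean(axis=0) - global_mean,
                              c.mean(axis=0) - global_mean) for c in classes)

print(np.allclose(T, W + B))                  # True, so B = T - W and B is also 500 x 500
```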