Suppose that we have an $n \times p$ matrix $\mathbf{M}$. Different transformations using different column-wise operators can lead to a new $p \times p$ symmetric matrix $\mathbf{S}$. For example, the covariance matrix $\mathbf{C}$ can be computed using the dot product operator, whereby each entry of the covariance matrix is the dot product of two columns of the original matrix $\mathbf{M}$ (divided by $n-1$):
$\mathbf{C} = { 1 \over {n-1} } \mathbf{M}^{T} \cdot \mathbf{M}$
Similarly, the correlation matrix $\mathbf{P}$ can be defined by $\mathbf{P}_{ij} = \mathrm{corr}(\mathbf{M}_i, \mathbf{M}_j)$, where $\mathbf{M}_i$ and $\mathbf{M}_j$ are columns of $\mathbf{M}$ and $\mathrm{corr}$ is a measure of correlation such as the Pearson product-moment coefficient. In this case, the operator is the bivariate correlation coefficient.
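To make the two constructions concrete, here is a minimal NumPy sketch. Note that the formula $\mathbf{C} = \frac{1}{n-1}\mathbf{M}^{T}\mathbf{M}$ assumes the columns of $\mathbf{M}$ have already been centered; the sketch centers them explicitly and checks both results against NumPy's built-ins (the data matrix here is an arbitrary random example, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 4
M = rng.normal(size=(n, p))  # hypothetical n x p data matrix

# Covariance: center the columns, then C = Mc^T Mc / (n - 1)
Mc = M - M.mean(axis=0)
C = Mc.T @ Mc / (n - 1)
assert np.allclose(C, np.cov(M, rowvar=False))

# Correlation: the same cross-product, but on standardized columns
Ms = Mc / M.std(axis=0, ddof=1)
P = Ms.T @ Ms / (n - 1)
assert np.allclose(P, np.corrcoef(M, rowvar=False))
```

Both matrices arise from the same column-pairwise operator pattern; only the preprocessing of the columns (centering vs. standardizing) differs.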
Does this operator-dependent transformation have a name?
X'X matrix. For further scholastic interest: Zegers & ten Berge (Psychometrika, 1985) unite four coefficients under a single general formula: the identity coefficient, the additivity coefficient (based on the covariance), the cosine, and the Pearson correlation. – ttnphns Mar 29 '16 at 13:14

X'X matrix: there are many synonymous or equivalent names, coming from people of different fields and backgrounds. Multivariate statistical data analysis is one of the oldest branches; there it is called the Sums-of-Squares-and-Cross-Products (SSCP) matrix, or simply the cross-product matrix. I don't much recommend using the words Gramian or Gram, for the reasons pointed out in my comment above. – ttnphns Mar 29 '16 at 18:04
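The SSCP naming can be illustrated directly: the raw cross-product matrix is just $\mathbf{X}^{T}\mathbf{X}$, and the covariance matrix is the deviation (column-centered) SSCP divided by $n-1$. A small sketch, using an arbitrary random matrix as the example data:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))  # hypothetical data matrix

# Raw SSCP (cross-product) matrix: X'X
sscp = X.T @ X

# Centering the columns first gives the deviation SSCP;
# dividing by n - 1 recovers the covariance matrix.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (X.shape[0] - 1)
assert np.allclose(cov, np.cov(X, rowvar=False))
```

So the covariance and correlation matrices of the question are both scaled variants of an X'X matrix computed on suitably preprocessed columns.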