The question "Why are all regression predictors in a balanced factorial ANOVA orthogonal?" asks why all predictors in a balanced ANOVA setting are orthogonal. My question is: in what sense are the predictors orthogonal?
The model $$y_{ijk} = \mu + \alpha_i + \eta_j + \gamma_{ij} + \epsilon_{ijk}$$ can be written as a linear regression model, $$y = X\beta + \epsilon$$ in matrix notation (see Example 7.2.1 in the other question). I know that the OLS estimate is $$\hat\beta = (X'X)^{-1}X'y$$ and I also know that the underlying theory of the statistical tests rests on the assumption that $y$ is normally distributed with mean $X\beta$ and variance-covariance matrix $\sigma^2 I$, where $I$ denotes the identity matrix of the appropriate dimension and $\sigma^2$ is a constant. The properties of the normal distribution then imply that $\hat\beta$ is also normally distributed, with mean $\beta$ and variance-covariance matrix $\sigma^2(X'X)^{-1}$.
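For concreteness, this is how I picture these quantities numerically. This is only my own sketch with a made-up design matrix and simulated data, not the design from Example 7.2.1:

```python
# Minimal sketch (my own illustration, hypothetical data): for a generic
# design matrix X and response y, compute the OLS estimate and the
# variance-covariance matrix of beta-hat under the normal model.
import numpy as np

rng = np.random.default_rng(0)

# hypothetical small design: intercept plus two numeric predictors
X = np.column_stack([np.ones(6), rng.normal(size=6), rng.normal(size=6)])
beta_true = np.array([1.0, 2.0, -0.5])
sigma2 = 1.0
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=6)

XtX = X.T @ X
beta_hat = np.linalg.solve(XtX, X.T @ y)    # (X'X)^{-1} X'y
cov_beta_hat = sigma2 * np.linalg.inv(XtX)  # sigma^2 (X'X)^{-1}

print(beta_hat)
print(cov_beta_hat)
```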
If all predictors were orthogonal, I would expect $X'X$ to be a diagonal matrix. Otherwise the estimated coefficients are correlated, which (to my understanding) means the predictors cannot be orthogonal. However, a quick check on the design matrix given in the example shows that $X'X$ is not diagonal there, which confuses me.
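To make my check concrete: if I build the design matrix for a hypothetical balanced 2×2 design (two replicates per cell, the usual treatment/dummy coding, not the coding used in Example 7.2.1), the resulting $X'X$ is clearly not diagonal:

```python
# Hypothetical balanced 2x2 design: 2 levels of A, 2 levels of B,
# 2 replicates per cell, dummy-coded columns for intercept, A, B, A:B.
import itertools
import numpy as np

rows = []
for a, b in itertools.product([0, 1], repeat=2):  # the four cells
    for _ in range(2):                            # 2 replicates per cell
        rows.append([1, a, b, a * b])             # intercept, A, B, A:B
X = np.array(rows, dtype=float)

print(X.T @ X)
# [[8. 4. 4. 2.]
#  [4. 4. 2. 2.]
#  [4. 2. 4. 2.]
#  [2. 2. 2. 2.]]
```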