In a simple regression context: $$ y = \alpha + \beta x + e $$ We can estimate beta from: $$ \hat{\beta} = \frac{\operatorname{Cov}(x,y)}{\operatorname{Var}(x)} = \rho_{xy} \frac{\sigma_y}{\sigma_x} $$ This last decomposition is useful in empirical work because it lets one estimate the correlation separately from the standard deviations. In my use case, for example, it is recommended to use a longer window to estimate the correlation and a shorter window to estimate the standard deviations, since correlations are more stable over longer periods of time.
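To make this concrete, here is a small NumPy sketch checking that the two expressions for the slope agree, and then illustrating the mixed-window variant (the window lengths 500 and 60 are just placeholders for my actual settings):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 0.5 + 2.0 * x + rng.normal(size=1000)

# Slope as covariance over variance (matching ddof so the ratio is exact)
beta_cov = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# Equivalent: correlation times the ratio of standard deviations
rho = np.corrcoef(x, y)[0, 1]
beta_rho = rho * np.std(y, ddof=1) / np.std(x, ddof=1)

print(np.isclose(beta_cov, beta_rho))  # the two forms coincide

# Mixed-window estimate: long window for correlation, short for volatilities
rho_long = np.corrcoef(x[-500:], y[-500:])[0, 1]
beta_mixed = rho_long * np.std(y[-60:], ddof=1) / np.std(x[-60:], ddof=1)
```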
In the multiple regression case, we use: $$ \hat{\beta} = (X'X)^{-1}X'Y $$ I am not able to separate correlations from standard deviations in this formula. It seems it should be possible to do something similar, but I am getting a bit lost in the matrix algebra. If anyone can give me some intuition on where to go, or point me to a book to look at, I would be very happy.
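To show what I am after, here is a numerical sketch of the separation I am guessing at, where $R$ is the correlation matrix of the regressors, $r$ the vector of correlations of each regressor with $y$, and $S$ the diagonal matrix of regressor standard deviations (these names and the candidate identity $\hat{\beta} = S^{-1} R^{-1} r \, \sigma_y$ are my own guess, not something I have found in a reference):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
X = rng.normal(size=(n, 3))
X[:, 1] += 0.5 * X[:, 0]          # make the regressors correlated
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=n)

# OLS on centered data (centering absorbs the intercept)
Xc = X - X.mean(axis=0)
yc = y - y.mean()
beta_ols = np.linalg.solve(Xc.T @ Xc, Xc.T @ yc)

# Candidate separation: beta = S^{-1} R^{-1} r * sigma_y
R = np.corrcoef(X, rowvar=False)                       # correlations among regressors
r = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
S_inv = np.diag(1.0 / X.std(axis=0, ddof=1))
beta_sep = S_inv @ np.linalg.solve(R, r) * y.std(ddof=1)

print(np.allclose(beta_ols, beta_sep))
```

If this identity is right, it would let me plug in correlations and standard deviations estimated over different windows, just as in the univariate case.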