EDITED TO POSE DIFFERENT QUESTIONS THAN THE DUPLICATE:
This question asks:
Can standardized coefficients become greater than |1|? If yes, what does that mean and should they be excluded from the model? If yes, why?
The accepted answer can be summarized as:
Standardized coefficients can be greater than 1.00... They are a sign that you have some pretty serious collinearity.
The second answer there nitpicks on the meaning of standardization, and states that:
[when you fit a] regression [with only a single predictor], then it is mathematically impossible to see a coefficient outside of the -1 to 1 range since the slope will be the same as the correlation
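To make sure I am reading that correctly: with both variables converted to $z$-scores, the OLS slope works out to the Pearson correlation (this is my own restatement, not a quote from that answer):

$$
\hat{\beta}_{\text{std}}
= \frac{\operatorname{cov}(z_x, z_y)}{\operatorname{var}(z_x)}
= \frac{\operatorname{cov}(x, y)}{s_x\, s_y}
= r_{xy},
$$

which, if I have it right, is what pins the coefficient to the $[-1, 1]$ range in the single-predictor case.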
The two answers do not agree on what to do with such coefficients. The first says:
Whether they should be excluded depends on why they happened - but probably not.
While the second says:
some sources suggest dropping those variables.
To summarize both answers: yes, it can happen, but only under collinearity and never in simple regression.
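To convince myself the collinearity part is real, I put together a small simulation (my own sketch in Python/NumPy, not code from either answer; the effect sizes 2.0 and -1.5 and the 0.95 correlation are arbitrary choices of mine):

```python
# Sketch: two highly correlated predictors with opposite-signed effects.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

x1 = rng.normal(size=n)
x2 = 0.95 * x1 + np.sqrt(1 - 0.95**2) * rng.normal(size=n)  # corr(x1, x2) ~ 0.95
y = 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

def z(v):
    # convert to z-scores
    return (v - v.mean()) / v.std(ddof=0)

# OLS on the z-scored variables (no intercept needed: everything is centred)
Z = np.column_stack([z(x1), z(x2)])
beta_std, *_ = np.linalg.lstsq(Z, z(y), rcond=None)
print(beta_std)  # roughly [ 1.6, -1.2 ]: both outside the (-1, 1) range
```

So a standardized coefficient well above 1 does show up once the predictors are strongly correlated, which matches the accepted answer.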
Given the above, my questions are:
(1) Is multicollinearity the only situation in which a standardized coefficient can exceed 1 in absolute value?
(2) Why can't a standardized coefficient in simple regression exceed 1 in absolute value? What is the logical explanation? From my own (very non-mathematically oriented) point of view: why can't one standard deviation in one variable contribute more than one standard deviation to a second variable? Suppose we are predicting wages (y) from school years (x), and wages has points at +6 s.d. (people who make a lot of money), yet school years spans only about 3 s.d. of data. Then why can't the standardized coefficient for school years be greater than 1? (A toy version of this scenario is sketched below.)
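For what it is worth, here is my own toy version of that wages example (all numbers invented). The fitted standardized slope comes out identical to the correlation and stays well inside (-1, 1) even though the wage variable has points several s.d. out, and that is exactly the part I would like an intuitive explanation for:

```python
# Toy version of the wages-vs-school-years scenario (all numbers invented).
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

years = rng.integers(8, 21, size=n).astype(float)                # whole range covers only ~3 s.d.
wages = 2_000 * years + rng.lognormal(mean=9, sigma=1.0, size=n)  # heavy right tail

def z(v):
    return (v - v.mean()) / v.std(ddof=0)

slope_std, *_ = np.linalg.lstsq(z(years).reshape(-1, 1), z(wages), rcond=None)
print(slope_std, np.corrcoef(years, wages)[0, 1])  # same number, well inside (-1, 1)
print(int((z(wages) > 4).sum()))                   # yet some wage points sit > 4 s.d. above the mean
```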