Wikipedia says:
Multicollinearity does not reduce the predictive power or reliability of the model
as a whole, at least within the sample data set; it only affects calculations
regarding individual predictors.
Does the phrase "Multicollinearity does not reduce the predictive power" mean that multicollinearity does not change the predictive power when one knows the regression coefficients?
On the same page it is written:
So long as the underlying specification is correct, multicollinearity does not
actually bias results; it just produces large standard errors in the related
independent variables. More importantly, the usual use of regression is to take
coefficients from the model and then apply them to other data. Since
multicollinearity causes imprecise estimates of coefficient values, the
resulting out-of-sample predictions will also be imprecise.
First of all, these statements seem contradictory to me. How can it say that multicollinearity does not reduce predictive power, while at the same time saying that it produces large standard errors in the coefficients, which lead to imprecise out-of-sample predictions?
Second, what does "large standard errors in the related independent variables" mean? Does this refer to the standard errors of the coefficient estimates?
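To make the question concrete, here is a small numpy simulation I sketched (it is not from the Wikipedia article; the sample size, noise levels, and true coefficients are all choices I made for illustration). It builds two nearly collinear predictors and fits OLS repeatedly on fresh noise: if I understand the quotes correctly, the individual slope estimates should be unstable across replications, while the in-sample fitted values should stay stable.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 200, 500

# Fixed design: x2 is almost a copy of x1, so the two predictors
# are nearly collinear (their correlation is close to 1).
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])

coefs, preds = [], []
for _ in range(reps):
    # True model: y = x1 + x2 + noise (both true slopes equal 1).
    y = x1 + x2 + rng.normal(size=n)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    coefs.append(beta[1:])   # the two slope estimates
    preds.append(X @ beta)   # in-sample fitted values

coef_sd = np.std(coefs, axis=0)          # spread of individual slopes
pred_sd = np.std(preds, axis=0).mean()   # average spread of fitted values

print("sd of slope estimates:", coef_sd)       # large: slopes are unstable
print("mean sd of fitted values:", pred_sd)    # small: predictions are stable
```

If the quotes are right, this would show exactly the asymmetry I am asking about: huge variability in each slope (their sum is well determined, but how it splits between x1 and x2 is not), yet almost no variability in the predictions on the sample data.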