My guess is that when the classical conditions hold ($y_i = \beta_0 + \beta_1 x_i + \epsilon_i$, with the $\epsilon_i$ independent and normally distributed), OLS is the UMVUE, so any remaining variation in $y_i$ is pure noise and bagging shouldn't improve inference or predictions.
But under other conditions (non-normal $\epsilon_i$, presence of unmeasured covariates, etc.), I would expect bagging to improve inference and predictions over plain OLS.
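As a quick sanity check of this intuition, here is a minimal simulation sketch (my own illustration, not taken from the linked paper) that compares plain OLS with bagged OLS on a simple linear model, once with normal errors and once with heavy-tailed $t_2$ errors. The sample size, number of bootstrap replicates, and error distributions are all assumptions chosen just for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_fit(X, y):
    # Append an intercept column and solve least squares.
    Xd = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    return beta

def ols_predict(beta, X):
    return np.column_stack([np.ones(len(X)), X]) @ beta

def bagged_predict(X, y, X_test, n_boot=200):
    # Average OLS predictions over bootstrap resamples of the training data.
    n = len(X)
    preds = np.zeros(len(X_test))
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        preds += ols_predict(ols_fit(X[idx], y[idx]), X_test)
    return preds / n_boot

def experiment(error_sampler, n=50, n_test=1000, reps=200):
    # Repeatedly simulate training data, fit both predictors, and
    # measure test MSE against the noiseless regression function.
    mse_ols, mse_bag = [], []
    for _ in range(reps):
        X = rng.uniform(-2, 2, size=n)
        X_test = rng.uniform(-2, 2, size=n_test)
        f = lambda x: 1.0 + 2.0 * x          # true regression function (assumed)
        y = f(X) + error_sampler(n)
        y_test = f(X_test)
        mse_ols.append(np.mean((ols_predict(ols_fit(X, y), X_test) - y_test) ** 2))
        mse_bag.append(np.mean((bagged_predict(X, y, X_test) - y_test) ** 2))
    return np.mean(mse_ols), np.mean(mse_bag)

print("normal errors (OLS, bagged):", experiment(lambda n: rng.normal(0, 1, n)))
print("t(2) errors   (OLS, bagged):", experiment(lambda n: rng.standard_t(2, n)))
```

Running something like this lets you see directly whether bagging helps or hurts in each scenario, rather than relying on the intuition alone.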
More information about this situation is in this paper: http://www.math.univ-toulouse.fr/~agarivie/Telecom/apprentissage/articles/BaggingML.pdf
A surprising by-product is that bagging is harmful for ordinary least squares linear regression involving all variables. Breiman explains this failure of bagging by the stability of ordinary least squares: for stable procedures, the average of predictors trained on several independent datasets is better approximated by the original predictor (a single fit to one dataset drawn from the data distribution) than by the bagged predictor (an average over several bootstrap samples). This acknowledges that the variance-reduction argument reaches its limits when bagging fails.