I understand that the standard error of the regression estimate is reported instead of the residual variance because it is in the same units as the response $Y$, but I'm wondering whether there is a more intuitive statistic.
When I'm doing regression analysis, I don't really have a good sense of what an acceptable standard error is, other than that it should be low relative to the average value of $Y$. If instead of
$$s_{est}=\sqrt{\frac{\sum (Y-Y')^2}{N-p}},$$
we used:
$$\frac{\sum \sqrt{(Y-Y')^2}}{N-p} = \frac{\sum |Y-Y'|}{N-p},$$
then the interpretation would simply be the average absolute difference between $Y$ and $Y'$ (up to the $N-p$ degrees-of-freedom adjustment). Is there an easy way to interpret the standard error of the estimate? If not, why don't we use the average absolute deviation instead?
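To make the comparison concrete, here is a rough NumPy sketch (the simulated data and variable names are my own) that computes both quantities for a one-predictor regression, so $p = 2$:

```python
import numpy as np

# Toy data: simple linear regression with one predictor (p = 2 parameters).
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 3.0 + 2.0 * x + rng.normal(scale=4.0, size=x.size)

# Fit by least squares and compute the fitted values Y'.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

residuals = y - y_hat
N, p = len(y), X.shape[1]

# Standard error of the estimate: root mean squared residual with N - p in the denominator.
s_est = np.sqrt(np.sum(residuals**2) / (N - p))

# The proposed alternative: sum of absolute residuals over N - p.
mean_abs_dev = np.sum(np.abs(residuals)) / (N - p)

print(f"s_est         = {s_est:.3f}")
print(f"mean |Y - Y'| = {mean_abs_dev:.3f}")
```

On data like this the two numbers come out in the same ballpark, which is part of why I wonder whether the absolute-value version wouldn't be the more interpretable choice.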