I view "overfitting" as an example of a bias-variance trade-off (I wouldn't call it a "dilemma") that has gone too far toward the variance end.
The usual example I have in mind for the bias-variance trade-off is a linear model in which all covariates have some effect, but where it would be best to drop those whose effects are negligible, incurring some bias in order to reduce the variance.
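A small simulation sketch makes that trade-off concrete (this is my own illustration, not from any particular source; the sample size, coefficient values, and noise level are all assumptions chosen to make the effect visible):

```python
# Sketch: a linear model where every covariate has some effect, but most are
# negligible. Fitting only the strong covariates introduces a small bias yet
# cuts the variance, often lowering out-of-sample prediction error.
# All numbers here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 20                              # few observations, many covariates
beta = np.zeros(p)
beta[:2] = [3.0, -2.0]                     # two strong effects
beta[2:] = rng.normal(0, 0.05, p - 2)      # the rest are negligible but nonzero

def fit_ols(X, y):
    # ordinary least squares via the pseudo-inverse
    return np.linalg.pinv(X) @ y

full_err, reduced_err = [], []
X_test = rng.normal(size=(10_000, p))
y_test_mean = X_test @ beta                # noiseless test targets

for _ in range(200):                       # repeat over simulated datasets
    X = rng.normal(size=(n, p))
    y = X @ beta + rng.normal(0, 1.0, n)
    b_full = fit_ols(X, y)                 # unbiased, higher variance
    b_red = np.zeros(p)
    b_red[:2] = fit_ols(X[:, :2], y)       # biased (drops 18 terms), lower variance
    full_err.append(np.mean((X_test @ b_full - y_test_mean) ** 2))
    reduced_err.append(np.mean((X_test @ b_red - y_test_mean) ** 2))

print(f"full model    mean prediction MSE: {np.mean(full_err):.3f}")
print(f"reduced model mean prediction MSE: {np.mean(reduced_err):.3f}")
```

With this setup the deliberately biased reduced model typically predicts better, because the bias it incurs is small relative to the variance it removes.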
The usual example I have in mind for "overfitting" is to imagine that the truth is contained within the class of models being considered, so that the fit could be unbiased. The class of models is then expanded through added parameters, making it overly flexible, so that the fitted model resembles the observed data quite closely but not necessarily the underlying population or process.
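Here is a minimal sketch of that story (again my own illustration, with an assumed quadratic truth and assumed degrees): the degree-2 fit contains the truth and could be unbiased, while expanding the class to higher degrees tracks the observed sample ever more closely and drifts from the underlying process:

```python
# Sketch: the true process is a quadratic, so a degree-2 polynomial fit
# contains the truth. Adding parameters (higher degrees) drives the error
# against the observed data down while the error against the true process
# goes up. All numbers here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 30
x = np.linspace(-1, 1, n)
f = lambda t: 1.0 + 2.0 * t - 3.0 * t ** 2     # the true quadratic process
y = f(x) + rng.normal(0, 0.5, n)               # one observed sample

x_new = np.linspace(-1, 1, 1000)
for degree in (2, 5, 15):
    coefs = np.polyfit(x, y, degree)           # least-squares polynomial fit
    fit_train = np.polyval(coefs, x)
    fit_pop = np.polyval(coefs, x_new)
    print(f"degree {degree:2d}: "
          f"MSE vs observed data {np.mean((fit_train - y) ** 2):.3f}, "
          f"MSE vs true process  {np.mean((fit_pop - f(x_new)) ** 2):.3f}")
```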
I actually quite dislike the term "overfitting". I'd rather say "fitting too complex a model".