In the (mostly introductory) statistics courses I've taken, once I have a model I would run hypothesis tests to reduce it to its simplest form and be effectively done. My understanding is that this is not the case with predictive models, since the goal isn't model reduction but the predictive performance of the model, assessed with metrics like R^2, the Brier score, ROC/AUC, etc.
In my particular situation I want to fit a logistic regression to my data set. If I go by the rule of thumb of 15 observations of the least frequent outcome per predictor, then I have about 3 times as many predictors as I should to avoid overfitting (a rough sketch of that check is below). So I quite obviously need to reduce the number of predictors in my model.
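To show what I mean by the rule of thumb, here is a minimal sketch of the check in Python. The numbers and the outcome column are made up for illustration, not my actual data:

```python
import numpy as np
import pandas as pd

# Hypothetical binary outcome (my real data has many more predictors than this allows)
y = pd.Series(np.random.binomial(1, 0.2, size=500))

n_least_frequent = y.value_counts().min()  # count of the rarer outcome
epv = 15                                   # rule-of-thumb events per predictor
max_predictors = n_least_frequent // epv   # how many predictors the rule allows

print(f"Rarer outcome count: {n_least_frequent}")
print(f"Max predictors under the 15-EPV rule: {max_predictors}")
```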
So the question is: how does one proceed from the full model? From what I've read there are three approaches (are there more?): feature selection, data (dimension?) reduction, and shrinkage. How does one choose between them?
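To make the terms concrete, here is roughly what I understand each approach to mean, sketched with scikit-learn (X and y are a hypothetical feature matrix and binary outcome; the specific estimators and parameter values are just placeholders, not a recommendation):

```python
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression, LogisticRegressionCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# 1. Feature selection: keep a subset of the original predictors
select_model = make_pipeline(
    SelectKBest(f_classif, k=10),
    LogisticRegression(max_iter=1000),
)

# 2. Data (dimension) reduction: combine predictors into fewer components
pca_model = make_pipeline(
    StandardScaler(),
    PCA(n_components=10),
    LogisticRegression(max_iter=1000),
)

# 3. Shrinkage: keep all predictors but penalize the coefficients
ridge_model = make_pipeline(
    StandardScaler(),
    LogisticRegressionCV(penalty="l2", Cs=10, cv=5, max_iter=1000),
)
```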