The problem with your proposed approach is that a predictor may not correlate with the outcome on its own, while an interaction between predictors does. Or you might have a curvilinear relationship between a predictor and the outcome. (For instance, very low and very high BMIs are associated with higher morbidity than moderate BMIs.)
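To make that concrete, here is a minimal sketch (entirely made-up data and variable names) of how a quadratic term and an interaction can enter a logistic regression through the model formula in statsmodels:

```python
# Sketch with fabricated data: a quadratic BMI term and an age-by-smoker
# interaction in a logistic regression, fitted via a statsmodels formula.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "bmi": rng.normal(27, 5, n),
    "age": rng.normal(50, 12, n),
    "smoker": rng.integers(0, 2, n),
})
# Simulate a U-shaped BMI effect plus an age*smoker interaction
true_logit = 0.02 * (df["bmi"] - 27) ** 2 + 0.03 * df["age"] * df["smoker"] - 2
df["outcome"] = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

# I(bmi**2) adds the curvilinear term; age:smoker adds the interaction
fit = smf.logit("outcome ~ bmi + I(bmi**2) + age + smoker + age:smoker",
                data=df).fit(disp=0)
print(fit.summary())
```

Neither `bmi` nor `age` needs to show a marginal association for the squared and interaction terms to matter, which is exactly why screening predictors one at a time can mislead you.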
This is really little different from model selection for OLS or pretty much any other model. You should always start with domain knowledge and not just feed your data into some model selection algorithm, because the latter approach is pretty much guaranteed to have you chasing noise. Reliably finding the relevant variables in a large pool requires enormous amounts of data.
So it's best to first pare down your candidate predictors. Possibly think about transformations like splines to model potential nonlinearities, or about interactions. (Note that this suddenly leaves you with lots of predictors again, so the caveats above apply once more.) Then you might want to look at automatic model selection tools, like stepwise regression based on information criteria or statistical tests. This is highly problematic if you want to do inference (p values), but it can be defended if your goal is prediction. Just don't go overboard, and don't trust an automatic tool too much, because it will not save you from overfitting. Absolutely have a look at Algorithms for automatic model selection.
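As a rough sketch of the spline-plus-information-criterion idea (again with made-up data; patsy's `bs()` supplies the spline basis inside the formula, and AIC judges whether the nonlinearity earns its keep):

```python
# Sketch (made-up data): one predictor has a curvilinear effect; compare the
# AIC of a purely linear logistic regression against one with a spline for x2.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({"x1": rng.normal(size=n), "x2": rng.normal(size=n)})
df["y"] = rng.binomial(1, 1 / (1 + np.exp(-(df["x1"] + 0.5 * df["x2"] ** 2))))

linear = smf.logit("y ~ x1 + x2", data=df).fit(disp=0)
spline = smf.logit("y ~ x1 + bs(x2, df=4)", data=df).fit(disp=0)
print(f"AIC, linear in x2: {linear.aic:.1f}")
print(f"AIC, spline in x2: {spline.aic:.1f}")
```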
Ideally, bootstrap your model selection to get a feeling for how variable it is. Are some predictors always selected? Are others sometimes selected and sometimes not? One key outcome of your exercise should be a lot of humility as to whether you really found the "best" model.
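One way to get at that, sketched with fabricated data similar to the example above plus a few pure-noise predictors (`forward_select` is just an illustrative greedy forward selection on AIC, not a canned routine):

```python
# Sketch: rerun the whole selection procedure on bootstrap resamples and tally
# how often each candidate term is picked.
from collections import Counter

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({f"x{i}": rng.normal(size=n) for i in range(1, 6)})
df["y"] = rng.binomial(1, 1 / (1 + np.exp(-(df["x1"] + 0.5 * df["x2"] ** 2))))

def forward_select(data, candidates):
    """Greedy forward selection on AIC; returns the list of chosen terms."""
    candidates = list(candidates)
    selected = []
    current_aic = smf.logit("y ~ 1", data=data).fit(disp=0).aic
    while candidates:
        aics = {t: smf.logit("y ~ " + " + ".join(selected + [t]),
                             data=data).fit(disp=0).aic
                for t in candidates}
        best = min(aics, key=aics.get)
        if aics[best] >= current_aic:
            break
        selected.append(best)
        candidates.remove(best)
        current_aic = aics[best]
    return selected

counts = Counter()
n_boot = 100
for b in range(n_boot):
    boot = df.sample(n=len(df), replace=True, random_state=b)
    counts.update(forward_select(boot, ["x1", "bs(x2, df=4)", "x3", "x4", "x5"]))

for term, k in counts.most_common():
    print(f"{term}: selected in {k}/{n_boot} resamples")
```

If the noise predictors get picked in a nontrivial fraction of resamples, that is the humility lesson in numbers.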
Also, I would recommend you keep a holdout set, or wrap all of this in a cross-validation setup. Assess the probabilistic predictions from your logistic regression using proper scoring rules, such as the log or Brier score. Compare the performance of your selected model to that of an extremely simple model with only a few predictors chosen by domain knowledge - chances are that the very simple model will be quite hard for the more complex one to beat.
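For instance, a sketch with scikit-learn (fabricated data; `x0` and `x1` stand in for the handful of predictors your domain knowledge would pick), comparing cross-validated log loss and Brier score for all predictors versus just two:

```python
# Sketch (fabricated data): cross-validated proper scores for a kitchen-sink
# logistic regression versus a deliberately simple two-predictor one.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n, p = 500, 20
X = pd.DataFrame(rng.normal(size=(n, p)), columns=[f"x{i}" for i in range(p)])
y = rng.binomial(1, 1 / (1 + np.exp(-(X["x0"] + 0.5 * X["x1"]))))

model = LogisticRegression(max_iter=1000)
simple_cols = ["x0", "x1"]  # stand-in for predictors chosen by domain knowledge

for scoring in ("neg_log_loss", "neg_brier_score"):
    full_score = cross_val_score(model, X, y, cv=10, scoring=scoring).mean()
    simple_score = cross_val_score(model, X[simple_cols], y, cv=10,
                                   scoring=scoring).mean()
    print(f"{scoring}: all predictors {full_score:.3f}, "
          f"two predictors {simple_score:.3f} (closer to 0 is better)")
```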