I would like to do model selection using a backward stepwise procedure with cross-validation. https://www.otexts.org/fpp/5/3
I have used stepAIC in the MASS package for predictor selection, and I would like to see whether cross-validation is a better criterion. In other words, I am trying to use cross-validation instead of AIC as the criterion in a backward selection procedure.
The procedure should start with the full model, try dropping one predictor at a time, remove a predictor only if dropping it improves the CV error, and continue until no further improvement can be made.
I think that the rfe function in the caret package may be able to do the job. I would also like to use bootstrapping if possible; if so, how should I specify it? Or are there any functions that can do this? I have tried writing my own function, but it seems to be beyond my ability. Many thanks.
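For concreteness, here is a rough sketch in base R of the sort of function I mean: backward elimination using 10-fold CV RMSE as the criterion. The helper names (cv_rmse, backward_cv) and the mtcars example are only illustrative, and it assumes a numeric response fitted with lm.

```r
## Minimal sketch: backward elimination with 10-fold CV RMSE as the criterion.
## Assumptions: numeric response, lm models; function names are placeholders.
set.seed(1)

cv_rmse <- function(form, data, k = 10) {
  folds <- sample(rep(seq_len(k), length.out = nrow(data)))
  errs <- sapply(seq_len(k), function(i) {
    fit  <- lm(form, data = data[folds != i, ])
    pred <- predict(fit, newdata = data[folds == i, ])
    obs  <- data[folds == i, all.vars(form)[1]]
    sqrt(mean((obs - pred)^2))
  })
  mean(errs)
}

backward_cv <- function(response, predictors, data, k = 10) {
  current <- predictors
  best    <- cv_rmse(reformulate(current, response), data, k)
  repeat {
    if (length(current) == 1) break
    ## CV error of each candidate model with one predictor dropped
    scores <- sapply(current, function(p)
      cv_rmse(reformulate(setdiff(current, p), response), data, k))
    if (min(scores) >= best) break   # no drop improves CV: stop
    best    <- min(scores)
    current <- setdiff(current, names(which.min(scores)))
  }
  reformulate(current, response)
}

## Illustration on a built-in data set
backward_cv("mpg", c("cyl", "disp", "hp", "wt", "qsec"), mtcars)
```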
validate in the rms package has the option of allowing backwards selection based on AIC. Do read some of the many posts here on why stepwise methods aren't so good & consider whether (a) you need to do post-hoc model selection, & (b) you ought to apply shrinkage, e.g. the lasso. – Scortchi - Reinstate Monica Feb 21 '14 at 12:03

caret does this via train. Using method = 'lmStepAIC' or method = 'glmStepAIC' will resample that entire process (if you are interested in that sort of thing) – topepo Feb 24 '14 at 15:54
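A sketch of the caret approach mentioned in the comments, which resamples the entire stepAIC selection process inside cross-validation; the mtcars data set is used only for illustration, and trainControl(method = "boot") could be substituted if bootstrapping is preferred:

```r
library(caret)
library(MASS)   # supplies stepAIC, used internally by method = "lmStepAIC"

set.seed(1)
ctrl <- trainControl(method = "cv", number = 10)   # or method = "boot" for bootstrapping
fit  <- train(mpg ~ ., data = mtcars,
              method    = "lmStepAIC",
              trControl = ctrl,
              trace     = FALSE)   # passed through to stepAIC to suppress printing
fit             # resampled performance of the whole select-then-fit process
fit$finalModel  # the model stepAIC chose on the full training set
```

Note that the resampled RMSE here measures the performance of the complete procedure (stepwise selection plus fitting), not just the final model, which is usually the honest way to evaluate a selection strategy.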