I understand bootstrapping in the statistical context.
Example: we have a sample of 1000 people and we want to estimate their mean. We draw 5 people at random (with replacement) 20 times and compute the mean of the 5 people for each draw. We end up with 20 means, which give us an estimate of the true mean and of its error. Is this correct? A minimal sketch of the procedure I have in mind (assuming NumPy and made-up data for the 1000 people):
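```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up measurements for a sample of 1000 people.
sample = rng.normal(loc=170, scale=10, size=1000)

# Draw 5 people at random (with replacement), 20 times,
# and record the mean of each draw.
n_draws, draw_size = 20, 5
means = np.array([
    rng.choice(sample, size=draw_size, replace=True).mean()
    for _ in range(n_draws)
])

# Estimate of the true mean and of its error.
print("estimated mean:", means.mean())
print("standard error:", means.std(ddof=1))
```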
I was following an online course on machine learning and neural networks, and the professor briefly mentioned bootstrapping. How would you use it in the context of machine learning?
I know how bagging uses it (ensembles), but I am not sure how plain bootstrapping would be used. For instance, this is roughly what I have in mind as bagging (a sketch assuming scikit-learn; the dataset and parameters are just placeholders):
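```python
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Bagging: each tree is trained on a bootstrap resample of the
# training data, and predictions are aggregated over the ensemble.
bag = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=50,
    bootstrap=True,
    random_state=0,
)
bag.fit(X, y)
print(bag.score(X, y))
```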