
I've read this regarding the difference between epoch and mini-batch.

To clarify:

  • With an epoch value of 1000 and a batch size of 50, does that mean the model will use each data point exactly 1000 times, in a random order where each iteration uses only 50 data points for the optimization step (i.e., a total of 50*1000 calculations)?
  • Is every data point used exactly 1000 times?

1 Answer


Yes, each iteration uses one mini-batch of points (here 50), so with 1000 training samples there are 1000 / 50 = 20 iterations per epoch. An epoch ends when we have gone over all the training samples exactly once, so with 1000 epochs every training point is used exactly 1000 times.
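
To make the counting concrete, here is a minimal sketch of a shuffled mini-batch loop. The toy data and variable names are my own, and it assumes a dataset of 1000 points (as implied by the 20 iterations above); the gradient/update step itself is elided.

```python
import numpy as np

# Assumed sizes: 1000 training points, batch size 50, 1000 epochs
n_samples, batch_size, n_epochs = 1000, 50, 1000

X = np.random.randn(n_samples, 3)           # toy features (only used by the elided update step)
y = np.random.randn(n_samples)              # toy targets

iters_per_epoch = n_samples // batch_size   # 1000 / 50 = 20 iterations per epoch
total_updates = n_epochs * iters_per_epoch  # 1000 * 20 = 20,000 parameter updates in total

usage_count = np.zeros(n_samples, dtype=int)

for epoch in range(n_epochs):
    order = np.random.permutation(n_samples)            # new random order each epoch
    for i in range(iters_per_epoch):
        batch_idx = order[i * batch_size:(i + 1) * batch_size]
        usage_count[batch_idx] += 1
        # ... compute the gradient on X[batch_idx], y[batch_idx] and update the weights ...

# Every point appears exactly once per epoch, hence n_epochs times in total.
assert (usage_count == n_epochs).all()
print(iters_per_epoch, total_updates)       # 20 20000
```

The assertion at the end is the point of the sketch: each data point is visited exactly once per epoch, so over 1000 epochs it is used exactly 1000 times.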

gunes
  • I see, so the number of iterations is epochs / batch-size, and the total number of calculations is epochs * number-of-points (since each point is used to compute the rate of change of the error as many times as there are epochs)? – alexandrosangeli Feb 02 '21 at 17:35
  • If by calculation you mean a training sample going through the algorithm, yes, that's correct. – gunes Feb 02 '21 at 17:37
  • Yes, that's what I mean. Thank you. – alexandrosangeli Feb 02 '21 at 17:40
  • That's not true. One epoch means that each data point will be seen exactly once (unless there are duplicates in the dataset), so 50 epochs means that each data point is seen 50 times. – Daniel Wiczew Dec 14 '22 at 15:19