I have seen in certain implementations that the network is tested and trained at the same time: the model is trained for one epoch, then tested with those weights, then trained for another epoch, and so on. Is the reason for this so that we can know when the model has learned the best parameters?
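For concreteness, here is a minimal sketch of the pattern I mean, using a scikit-learn `SGDClassifier` as a stand-in model; the synthetic data, `patience` value, and epoch count are illustrative, not from any particular implementation:

```python
# One pass of partial_fit over the training set stands in for "one epoch";
# after each epoch the model is evaluated on held-out data with the current
# weights, and training stops once validation accuracy stalls (early stopping).
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 20))
y = (X[:, 0] - X[:, 2] > 0).astype(int)   # simple linear target
X_train, y_train = X[:1500], y[:1500]
X_val, y_val = X[1500:], y[1500:]

clf = SGDClassifier(loss="log_loss", random_state=0)
best_acc, best_epoch, patience = 0.0, 0, 5

for epoch in range(100):
    clf.partial_fit(X_train, y_train, classes=np.array([0, 1]))  # train one epoch
    acc = clf.score(X_val, y_val)            # test with those weights
    if acc > best_acc:
        best_acc, best_epoch = acc, epoch    # remember the best epoch so far
    elif epoch - best_epoch >= patience:     # stop when validation stops improving
        break

print(f"best validation accuracy {best_acc:.3f} at epoch {best_epoch}")
```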
It sounds like you're describing early stopping. Is this correct? Or are you asking something else? https://stats.stackexchange.com/questions/231061/how-to-use-early-stopping-properly-for-training-deep-neural-network – Sycorax Jul 01 '20 at 23:18
1 Answer
You can test and train at the same time as long as you test on each data point first (forward validation) and only then use that data point to train.
To see how this compares with testing separately from training, consider the batch size with which you're testing and training. With a large batch size, the procedure behaves more like separate testing and training; with a small batch size, it behaves more like the interleaved scheme you describe.
The benefit of a large batch size is resistance to noise in the training signal, which speeds learning. The benefit of a small batch size is that you learn from important examples sooner, which also speeds learning.
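As an illustration, here is a minimal sketch of forward validation ("test, then train") on a stream of examples, again using a scikit-learn `SGDClassifier`; the synthetic data and the batch size of one are illustrative assumptions:

```python
# Prequential evaluation: each example is first used to test the current
# model, and only afterwards used to update (train) it, so no example is
# ever predicted by a model that has already seen it.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
n_samples, n_features = 1000, 20
X = rng.normal(size=(n_samples, n_features))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # simple linear target

clf = SGDClassifier(loss="log_loss", random_state=0)
classes = np.array([0, 1])

correct = 0
for i in range(n_samples):
    x_i, y_i = X[i : i + 1], y[i : i + 1]
    if i > 0:  # test on the point BEFORE training on it
        correct += int(clf.predict(x_i)[0] == y_i[0])
    clf.partial_fit(x_i, y_i, classes=classes)  # then train on that same point

print(f"prequential accuracy: {correct / (n_samples - 1):.3f}")
```

With a batch size this small, every update is available immediately for the next prediction; batching more examples per update would make the loop resemble the usual separate train/test phases.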
– Neil G