
I created a neural network and I am having trouble getting it to train. I followed all the advice in this post: What should I do when my neural network doesn't learn? but have had no success so far.

Then I decided to create a very simple dataset to check whether the model is able to learn at all. I expected the model to learn this dataset perfectly and overfit. Instead, it failed to learn: the loss stops decreasing quite early on, and the model cannot replicate even this tiny dataset.

I increased the complexity of the model iteratively, with no visible effect. Assume the model has enough capacity to learn the patterns, but not so much that optimization becomes hard.

So maybe I am making the wrong assumption.

If you train a neural network on, say, 30 examples, shouldn't you expect it to learn those 30 examples well enough to replicate them?

Or are there situations where the 30 examples are too different from one another for the model to find a pattern, even assuming all 30 are generated by the same process/function?
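For concreteness, the sanity check I have in mind looks roughly like this: a small network trained by plain gradient descent on 30 examples should be able to drive the training loss near zero. This is a minimal sketch in numpy with a hypothetical random dataset (all names and sizes here are illustrative, not my actual model):

```python
import numpy as np

# Hypothetical toy check: can a small MLP memorize 30 random (x, y) pairs?
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))          # 30 examples, 5 features
y = rng.normal(size=(30, 1))          # arbitrary targets, no real structure

# One hidden layer with tanh activation, trained with full-batch
# gradient descent on mean squared error.
W1 = rng.normal(scale=0.5, size=(5, 64)); b1 = np.zeros(64)
W2 = rng.normal(scale=0.5, size=(64, 1)); b2 = np.zeros(1)
lr = 0.05

for step in range(5000):
    h = np.tanh(X @ W1 + b1)          # forward pass
    pred = h @ W2 + b2
    err = pred - y
    loss = (err ** 2).mean()
    # backward pass (gradients of the MSE)
    g_pred = 2 * err / len(X)
    gW2 = h.T @ g_pred; gb2 = g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ g_h; gb1 = g_h.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(f"final training MSE: {loss:.6f}")
```

With 64 hidden units against only 30 points, this overparameterized network memorizes even pure noise, which is what makes me suspect my assumption that "a big enough network always fits a tiny dataset" should hold in my case too.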

  • It's difficult to answer this question because it's a bit broad: considering the total space of all neural networks and all datasets, could this happen? It's a bit like whether there can ever be a cloud with a particular shape. – Arya McCarthy Aug 02 '22 at 20:05
  • @arya-mccarthy My intuition is that it is a very fundamental question about all neural networks. Am I mistaken? I suspect that my assumption is wrong. With a tiny dataset, either the network is big enough to memorize all individual examples, or the examples need to be similar enough for the network to learn iteratively over them, or else it will never get the right pattern. – berrygreen Aug 02 '22 at 20:32
  • Then what's your question? – Arya McCarthy Aug 02 '22 at 20:33
  • @arya-mccarthy Well I was hoping to get other people's opinion on whether my thinking is in the right direction. Seems like you believe one cannot say for sure either way? – berrygreen Aug 02 '22 at 20:52

0 Answers