A gradient of zero does not necessarily indicate a problem with the network. The gradient vanishes at minima, maxima, and saddle points alike, so it's possible that your network has simply arrived at a local minimum (or, less happily, a saddle point or maximum). Determining which is the case requires additional information, such as perturbing the weights and re-evaluating the loss.
A less likely corner case is that some of your ReLU units have "died," outputting 0 for every input in your data set. Dead units pass zero gradient backward, so they stop learning entirely.
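One way to check for this directly is to run your data through the layer and flag units that never produce a positive activation. Below is a minimal NumPy sketch of the idea; the function name `find_dead_relu_units` and the toy weights are illustrative assumptions, not part of any particular framework.

```python
import numpy as np

def find_dead_relu_units(X, W, b):
    """Return a boolean mask of units whose ReLU output is 0 for every input.

    X: (n_samples, n_features) input data
    W: (n_features, n_units) weight matrix
    b: (n_units,) bias vector
    (hypothetical helper for illustration)
    """
    activations = np.maximum(0, X @ W + b)   # ReLU layer output
    return np.all(activations == 0, axis=0)  # True where a unit never fires

# Toy example: unit 1 has a large negative bias, so it is "dead" on this data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
W = rng.normal(size=(3, 2))
b = np.array([0.0, -100.0])
dead = find_dead_relu_units(X, W, b)  # unit 0 alive, unit 1 dead
```

In a real framework you would inspect the layer's recorded activations over a validation batch the same way: a unit that is zero across the entire batch is a candidate dead unit.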
On the other hand, if your code contains bugs or mistakes, the model's results cannot be meaningfully interpreted. Always check for bugs first.