
Good day to everybody.

My name is Brayan. I am a mechanical engineering student, and at the moment I have a problem with my first neural network.

The problem is:

I found a dataset on Kaggle (50 samples) about 3D printing parameter settings in the FDM process and their relationship with mechanical properties. My objective is to predict the mechanical properties, so I have a regression problem.

I have used hyperparameter tuning with tools like HParams and Keras Tuner. The optimal architecture I found for my neural network is the following:

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras import regularizers

lr_schedule = tf.keras.optimizers.schedules.InverseTimeDecay(
    0.01,
    decay_steps=500,
    decay_rate=1,
    staircase=False)

def get_optimizer():
    return tf.keras.optimizers.SGD(lr_schedule)

model = Sequential()

# Layer creation
model.add(Dense(380, input_dim=9, kernel_regularizer=regularizers.l2(0.01), activation='sigmoid'))
model.add(Dropout(0.64))
model.add(Dense(3, kernel_regularizer=regularizers.l2(0.016), activation='relu'))

opt = get_optimizer()
model.compile(optimizer=opt, loss='mean_squared_error', metrics=["mae"])

# Training
model_history = model.fit(x_train_norm, y_train,
                          validation_data=(x_test_norm, y_test),
                          batch_size=4, epochs=20000, verbose=1)
print("Training finished")
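
For reference, a Keras Tuner search over this kind of architecture can be set up roughly as in the sketch below; the build_model helper and its search ranges (units, dropout, L2, learning rate) are illustrative assumptions, not my exact tuning code:

import keras_tuner as kt
import tensorflow as tf

def build_model(hp):
    # Hypothetical search space; the ranges below are assumptions for illustration
    model = tf.keras.Sequential()
    model.add(tf.keras.Input(shape=(9,)))
    model.add(tf.keras.layers.Dense(
        hp.Int('units', min_value=32, max_value=512, step=32),
        activation='sigmoid',
        kernel_regularizer=tf.keras.regularizers.l2(hp.Float('l2', 1e-3, 1e-1, sampling='log'))))
    model.add(tf.keras.layers.Dropout(hp.Float('dropout', 0.0, 0.7, step=0.1)))
    model.add(tf.keras.layers.Dense(3, activation='relu'))
    model.compile(optimizer=tf.keras.optimizers.SGD(hp.Float('lr', 1e-4, 1e-1, sampling='log')),
                  loss='mean_squared_error', metrics=['mae'])
    return model

tuner = kt.RandomSearch(build_model, objective='val_loss', max_trials=20,
                        overwrite=True, directory='tuning', project_name='fdm')
tuner.search(x_train_norm, y_train, epochs=200, batch_size=4,
             validation_data=(x_test_norm, y_test), verbose=0)
best_model = tuner.get_best_models(num_models=1)[0]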

The model converges fast, but after a few epochs none of the metrics change over time (loss, val_loss, MAE, val_MAE).

The last values are: loss: 261.76, mae: 7.70, val_loss: 550.52, val_mae: 10.68.

[Plots of training and validation loss / MAE over epochs]

I have no idea how I can improve the performance of my neural network. I have tried adding L2 regularization and a dropout layer, but it is not sufficient.

Can somebody help me, please?

Thank you so much!

  • "My Neural Network is not learning" It's hard to tell from the figures but isn't the validation and training errors (MAE) going down? It looks like validation error starts at 25 and and goes down below 15. Either way, it's difficult to say based on this why the model isn't performing better as it could be so many reasons. Is the data such that you can expect very good performance? Or is it very complex, noisy and not very big (in which case poor performance might be expected). Good first step is to forget about validation for now and first make sure your model can overfit training data. – Lulu May 31 '22 at 10:11
  • p.s. it's completely normal that the rate of improvement slows after a while so there is nothing unusual about that. If it feels like it stops very quickly (hard to say from this graph) then your learning rate could be too big initially. or maybe you decay it too fast. – Lulu May 31 '22 at 10:13
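
A minimal sketch of the overfit check suggested in the comments, training on x_train_norm / y_train only with dropout and L2 removed; the stripped-down model below is an assumption for the test, not my tuned architecture:

import tensorflow as tf

# Same basic architecture but without dropout/L2, trained only on the training set.
# If this cannot drive the training loss close to zero, the limitation is likely in
# the data or model rather than in the regularization settings.
check_model = tf.keras.Sequential([
    tf.keras.Input(shape=(9,)),
    tf.keras.layers.Dense(380, activation='sigmoid'),
    tf.keras.layers.Dense(3, activation='relu'),
])
check_model.compile(optimizer=tf.keras.optimizers.SGD(0.01),
                    loss='mean_squared_error', metrics=['mae'])
history = check_model.fit(x_train_norm, y_train, batch_size=4,
                          epochs=2000, verbose=0)
print('final training loss:', history.history['loss'][-1])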

0 Answers