
I am using a multilayer perceptron model to predict urban temperatures. I standardized the independent variables before training the model, but I did not standardize the dependent variable. I chose this in order to keep the model results interpretable in the original units, but I am not sure whether this is the right approach or whether I am making a numerical mistake.

What is recommended in your opinion?

  • I think a relevant discussion can be found here:

    https://stats.stackexchange.com/questions/111467/is-it-necessary-to-scale-the-target-value-in-addition-to-scaling-features-for-re

    – Janosch May 18 '22 at 09:11
  • It does not fully answer my question. The discussion in that thread does not reach agreement on best practices when implementing ANNs. @Janosch – Monica Pena May 18 '22 at 12:33
  • 1
    Here is a link mentioned in the discussion. https://machinelearningmastery.com/how-to-improve-neural-network-stability-and-modeling-performance-with-data-scaling/

    To be fair, it's not the best experiment in my opinion, but I think the message still comes across. Scaling the target variable is useful when training gradient-based models like ANNs, because the scale of the target affects the size of the error. Too-large errors can lead to unstable training, as weights receive overly strong updates relative to their own scale.

    – Janosch May 18 '22 at 12:56
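    One common pattern that reconciles both concerns is to standardize the target for training and invert the transform when reporting predictions, so results stay in the original units. A minimal NumPy sketch (the temperature values are made up for illustration, and the "model output" is a stand-in, not a real MLP):

    ```python
    import numpy as np

    # Hypothetical temperature targets in °C (illustrative values only).
    y = np.array([12.5, 30.1, 18.7, 25.3, 9.8])

    # Standardize the target so gradient updates stay well-scaled during training.
    y_mean, y_std = y.mean(), y.std()
    y_scaled = (y - y_mean) / y_std

    # ... train the MLP on y_scaled instead of y ...
    # Stand-in for the model's predictions in the scaled space:
    pred_scaled = y_scaled.copy()

    # Invert the transform so predictions are interpretable in °C again.
    pred = pred_scaled * y_std + y_mean
    ```

    The key point is that training on the scaled target and inverse-transforming afterwards loses nothing in interpretability; only the optimization sees the standardized values.
    
    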

0 Answers