
I'm training a neural network. The inputs and outputs (training data) are normalized to the scale [0, 1] using the min and max.

I'm applying the backpropagation learning algorithm, and I need to compute the error, i.e. error = actual output $-$ network output.

How do I scale my output from [0, 1] back to actual real values, such as those in the zero-to-thousands range?
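
For reference, a minimal sketch of the scaling I'm currently doing (NumPy; the numbers and variable names are just for illustration):

```python
import numpy as np

# Min-max scale the training targets to [0, 1].
y = np.array([15.0, 740.0, 2980.0])   # targets in the zero-to-thousands range
y_min, y_max = y.min(), y.max()
y01 = (y - y_min) / (y_max - y_min)   # what the network is trained against
```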


2 Answers


Because your output is in [0, 1], I guess you used an output function intended for classification, such as the sigmoid. However, your loss function is not a classification loss. Thus, I suggest either using a classification loss (such as sigmoid cross-entropy) with target values in [0, 1], or using a regression loss (such as squared error) with a linear output function. Then you don't have to scale the training targets or the network outputs at all.
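
A minimal NumPy sketch contrasting the two consistent setups described above (the arrays and names are illustrative, not from the question):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([0.2, -1.3, 2.0])  # raw (pre-activation) network outputs

# Option 1: sigmoid output in [0, 1], scored with cross-entropy
# against targets that are themselves scaled to [0, 1].
t01 = np.array([0.1, 0.0, 0.9])
p = sigmoid(z)
cross_entropy = -np.mean(t01 * np.log(p) + (1 - t01) * np.log(1 - p))

# Option 2: linear output (no squashing), scored with squared error
# against targets on their original, unscaled range.
t_real = np.array([120.0, 850.0, 3200.0])
squared_error = np.mean((z - t_real) ** 2)

print(cross_entropy, squared_error)
```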

Nevertheless, you can convert from [0, 1] back to [a, b] using the following equation:

$$\text{output}_{[a,b]} = \text{output}_{[0,1]} \times (b - a) + a,$$

or generally from [x, y] to [a, b] using the following equation:

$$\text{value}_{[a,b]} = \frac{\text{value}_{[x,y]} - x}{y - x} \times (b - a) + a.$$
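
As a sketch, both formulas in Python (the function names are mine):

```python
import numpy as np

def from_01(output_01, a, b):
    # Map a value (or array) from [0, 1] back to [a, b].
    return output_01 * (b - a) + a

def rescale(value, x, y, a, b):
    # General form: map a value (or array) from [x, y] to [a, b].
    return ((value - x) / (y - x)) * (b - a) + a

preds_01 = np.array([0.0, 0.25, 0.9])
print(from_01(preds_01, a=0.0, b=3000.0))  # [   0.  750. 2700.]
```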

THN

Invert the same formula that you used to standardize to [0, 1], plugging in the true min and max. Most commonly the forward formula is $X_i' = (X_i - X_{\min})/(X_{\max} - X_{\min})$; solving for $X_i$ gives $X_i = X_i'\,(X_{\max} - X_{\min}) + X_{\min}$, which maps the network output back to the true range.
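
A quick round-trip check in NumPy (illustrative data, assuming min-max scaling as above):

```python
import numpy as np

X = np.array([12.0, 480.0, 2975.0])   # original targets, zero to thousands
Xmin, Xmax = X.min(), X.max()

X01 = (X - Xmin) / (Xmax - Xmin)      # forward: standardize to [0, 1]
X_back = X01 * (Xmax - Xmin) + Xmin   # inverse: recover the true range

assert np.allclose(X, X_back)
```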

katya