I have written a binary cross-entropy function like this:

import numpy as np

def BinaryCrossEntropy(y_true, y_pred):
    # Clip predictions away from exact 0 and 1 so log() stays finite
    y_pred = np.clip(y_pred, 1e-7, 1 - 1e-7)
    term_0 = (1 - y_true) * np.log(1 - y_pred + 1e-7)
    term_1 = y_true * np.log(y_pred + 1e-7)
    loss = -1.0 * np.mean(term_0 + term_1, axis=0)  ## PROBLEM IS HERE
    return loss

In the second-to-last line I multiply the result by -1.0, as the formula requires, but when my y_true and y_pred are exactly equal it gives me something ambiguous...

y_true = np.array([1.0, 1.0, 0.0, 0.0])
y_pred = np.array([1.0, 1.0, 0.0, 0.0])
print(BinaryCrossEntropy(y_true, y_pred))

Output:

>>> -0.0

Is there a way to get rid of that minus sign when the answer is 0.0? In the other case, when y_true and y_pred differ, it works perfectly fine.
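
For reference, the -0.0 comes from IEEE 754 signed zero: multiplying 0.0 by -1.0 flips the sign bit, and NumPy prints that sign. A minimal sketch of one way to normalize it (adding 0.0 maps -0.0 to 0.0 and leaves every other value unchanged):

loss = -1.0 * 0.0   # IEEE 754 negative zero
print(loss)         # -0.0
print(loss == 0.0)  # True: -0.0 compares equal to 0.0
print(loss + 0.0)   # 0.0, adding positive zero normalizes the sign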

  • The issue you describe will never happen in practice; for the computation of CE, we use the probabilistic predictions (which will never be the same as the true labels), not the hard labels `0/1` as you show here. – desertnaut May 27 '21 at 07:35
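
For instance, with probabilistic predictions (illustrative numbers) the function above returns a strictly positive loss:

y_true = np.array([1.0, 1.0, 0.0, 0.0])
y_pred = np.array([0.9, 0.8, 0.2, 0.1])   # probabilities rather than hard labels
print(BinaryCrossEntropy(y_true, y_pred))  # ~0.164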
