I have written a binary cross-entropy function like this:

import numpy as np

def BinaryCrossEntropy(y_true, y_pred):
    # clip predictions to avoid log(0)
    y_pred = np.clip(y_pred, 1e-7, 1 - 1e-7)
    term_0 = (1 - y_true) * np.log(1 - y_pred + 1e-7)
    term_1 = y_true * np.log(y_pred + 1e-7)
    loss = -1.0 * np.mean(term_0 + term_1, axis=0)  ## PROBLEM IS HERE
    return loss
In the line marked above, I multiply the mean by -1.0, as the formula requires, but when y_true and y_pred are exactly equal the result looks odd:
y_true = np.array([1.0, 1.0, 0.0, 0.0])
y_pred = np.array([1.0, 1.0, 0.0, 0.0])
print(BinaryCrossEntropy(y_true, y_pred))
Output:
>>> -0.0
Is there any way to get rid of that minus sign when the answer is 0.0? In every other case, when y_true and y_pred differ, the function works perfectly fine.
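For context, here is a minimal sketch (my own illustration, not part of the function above) showing that the -0.0 comes from IEEE-754 signed zero: multiplying 0.0 by -1.0 flips the sign bit, and the two zeros still compare equal. Adding 0.0 is one possible way to normalize the sign:

```python
import numpy as np

# Multiplying a positive zero by -1.0 flips its sign bit.
print(-1.0 * 0.0)    # prints -0.0

# Signed zeros compare equal, so the value itself is not wrong.
print(-0.0 == 0.0)   # prints True

# One possible workaround: adding 0.0 normalizes -0.0 to +0.0.
loss = -1.0 * np.mean(np.zeros(4))
print(loss + 0.0)    # prints 0.0
```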