Specifically, I'm wondering about this statement:
Future major versions of TensorFlow will allow gradients to flow into the labels input on backprop by default.
This warning is shown when I use tf.nn.softmax_cross_entropy_with_logits. In the same message it urges me to have a look at tf.nn.softmax_cross_entropy_with_logits_v2. I looked through the documentation, but for tf.nn.softmax_cross_entropy_with_logits_v2 it only states:
Backpropagation will happen into both logits and labels. To disallow backpropagation into labels, pass label tensors through tf.stop_gradient before feeding it to this function.
as opposed to tf.nn.softmax_cross_entropy_with_logits's:
Backpropagation will happen only into logits.
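If I understand the v2 docs correctly, passing the labels through tf.stop_gradient should make it behave like the original function. A minimal TF 1.x graph-mode sketch (the placeholder names and the 10-class shape are just my illustration):

```python
import tensorflow as tf

# Illustrative placeholders for a 10-class problem.
logits = tf.placeholder(tf.float32, shape=[None, 10])
labels = tf.placeholder(tf.float32, shape=[None, 10])

# Blocking the gradient on the labels should reproduce the old behaviour:
loss = tf.nn.softmax_cross_entropy_with_logits_v2(
    labels=tf.stop_gradient(labels), logits=logits)
```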
Being very new to the subject (I'm trying to make my way through some basic tutorials), I don't find those statements very clear. I have a shallow understanding of backpropagation, but what does the statement above actually mean? How are backpropagation and the labels connected? And how does this change how I work with tf.nn.softmax_cross_entropy_with_logits_v2 as opposed to the original?
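To make my question concrete, here is a tiny graph-mode sketch (the constant values are arbitrary) of how I would check whether a gradient actually flows into the labels:

```python
import tensorflow as tf

logits = tf.constant([[1.0, 2.0, 3.0]])
# Labels that are themselves computed, e.g. by another network;
# here just a softmax over arbitrary constants.
labels = tf.nn.softmax(tf.constant([[0.5, 1.5, 0.1]]))

loss_v1 = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
loss_v2 = tf.nn.softmax_cross_entropy_with_logits_v2(labels=labels, logits=logits)

# tf.gradients returns None when there is no gradient path to a tensor.
print(tf.gradients(loss_v1, [labels]))  # [None]: labels are treated as constant
print(tf.gradients(loss_v2, [labels]))  # [<tf.Tensor ...>]: gradient flows into labels
```

Is that the right way to think about the difference between the two functions?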
So softmax_..._with_logits_v2 will work as softmax_with_logits? (Or I might use tf.stop_gradient on the labels variable.) – Christian Eriksson Feb 07 '18 at 22:04