Hello there!
Your code is pretty clear and helpful to others, thank you for your effort!
I'd like to raise a question about 'softmax'. You used the tf.nn.softmax_cross_entropy_with_logits() function, which, as I understand it, internally applies softmax to normalize the data and then calculates the cross entropy. But it is applied to finaloutput, which has already been passed through 'softmax'. So wouldn't it be more reasonable to use:
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=output_fc8, labels=y))
?
That is to say, use the raw output_fc8 result (before softmax) to calculate the cross entropy, so softmax is applied only once, inside the loss function.
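If it helps, here is a small NumPy sketch of the issue (the names output_fc8, finaloutput, and y follow the snippet above; the softmax and loss functions are my own approximation of what tf.nn.softmax_cross_entropy_with_logits does internally). Feeding already-normalized probabilities into a *_with_logits loss applies softmax twice and distorts the loss value:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def softmax_cross_entropy_with_logits(logits, labels):
    # mimics tf.nn.softmax_cross_entropy_with_logits:
    # softmax is applied internally, so `logits` must be raw scores
    return -(labels * np.log(softmax(logits))).sum(axis=-1)

# hypothetical raw scores (output_fc8) and a one-hot label
output_fc8 = np.array([[2.0, 1.0, 0.1]])
y = np.array([[1.0, 0.0, 0.0]])

finaloutput = softmax(output_fc8)  # probabilities, already normalized

loss_correct = softmax_cross_entropy_with_logits(output_fc8, y)
loss_double = softmax_cross_entropy_with_logits(finaloutput, y)

print(loss_correct)  # intended cross-entropy loss
print(loss_double)   # distorted: softmax was applied twice
```

With these numbers the double-softmax loss comes out noticeably larger than the correct one, and in general the gradients it produces no longer match the true cross-entropy gradients.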