What’s the difference between sparse_softmax_cross_entropy_with_logits and softmax_cross_entropy_with_logits?

Having two different functions is a convenience; they produce the same result when the dense labels are the one-hot encoding of the sparse ones. The difference is in the expected label format. For sparse_softmax_cross_entropy_with_logits, labels must have the shape [batch_size] and dtype int32 or int64; each label is an int in the range [0, num_classes-1]. For softmax_cross_entropy_with_logits, labels must have the shape [batch_size, num_classes] and dtype float32 or float64; each row must be a valid probability distribution over the classes (most commonly a one-hot vector).
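
Here is a minimal sketch (assuming TensorFlow 2.x with eager execution) that shows both calls producing the same per-example loss when the dense labels are just the one-hot version of the sparse class indices:

```python
import tensorflow as tf

# Toy batch: 3 examples, 4 classes.
logits = tf.constant([[2.0, 1.0, 0.1, 0.5],
                      [0.3, 2.5, 0.2, 0.1],
                      [1.2, 0.4, 3.1, 0.0]])

# Sparse variant: labels are class indices, shape [batch_size], dtype int32/int64.
sparse_labels = tf.constant([0, 1, 2], dtype=tf.int64)
sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=sparse_labels, logits=logits)

# Dense variant: labels are per-class probabilities,
# shape [batch_size, num_classes], dtype float32/float64.
# Here we feed the one-hot encoding of the same indices.
dense_labels = tf.one_hot(sparse_labels, depth=4, dtype=tf.float32)
dense_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=dense_labels, logits=logits)

# With one-hot dense labels, the two losses match (up to float rounding).
print(sparse_loss.numpy())
print(dense_loss.numpy())
```

The practical upshot: if your labels are mutually exclusive class indices, the sparse variant saves you the tf.one_hot step; the dense variant is what you need when labels are soft (full probability distributions rather than a single class).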