ivy.neural_net_functional.losses.sparse_cross_entropy(true, pred, axis=-1, epsilon=1e-07)[source]

Computes sparse cross entropy between logits and labels.

Parameters:

  • true (array) – True labels, given as class indices.

  • pred (array) – Predicted class probabilities.

  • axis (int, optional) – The class dimension, default is -1.

  • epsilon (float, optional) – Small constant added inside the log to avoid log(0), default is 1e-7.


Returns: The sparse cross entropy loss.
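
A minimal NumPy sketch of the computation this function performs, assuming `pred` holds per-class probabilities and `true` holds integer class indices (the exact broadcasting behavior of the Ivy implementation may differ):

```python
import numpy as np

def sparse_cross_entropy(true, pred, axis=-1, epsilon=1e-7):
    # Clamp probabilities so the log never sees exactly 0 or 1.
    pred = np.clip(pred, epsilon, 1 - epsilon)
    # Gather the predicted probability assigned to each true class index.
    true_probs = np.take_along_axis(pred, np.expand_dims(true, axis), axis)
    # Negative log-likelihood of the true class.
    return -np.log(np.squeeze(true_probs, axis))

labels = np.array([0, 2])
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.1, 0.8]])
loss = sparse_cross_entropy(labels, probs)  # per-example losses
```

Each element of `loss` is `-log` of the probability the model assigned to the correct class, so confident correct predictions yield losses near zero.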

Supported Frameworks:

JAX, TensorFlow, PyTorch, MXNet, NumPy