cross_entropy

ivy.neural_net_functional.losses.cross_entropy(true, pred, axis=-1, epsilon=1e-07)[source]

Computes cross entropy between predicted and true discrete distributions.

Parameters
  • true (array) – True labels.

  • pred (array) – Predicted labels.

  • axis (int, optional) – The class dimension, default is -1.

  • epsilon (float, optional) – Small constant added inside the log functions, default is 1e-7.

Returns

The cross entropy loss.
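
For illustration, a minimal NumPy sketch of the quantity described above, assuming epsilon is simply added inside the logarithm and the result is summed over the class axis (the helper name cross_entropy_reference is hypothetical and not part of Ivy's API):

    import numpy as np

    def cross_entropy_reference(true, pred, axis=-1, epsilon=1e-7):
        # Offset predictions so the log is defined at zero, then reduce over the class axis.
        return -np.sum(true * np.log(pred + epsilon), axis=axis)

    # One-hot target over three classes and a predicted distribution.
    true = np.array([0.0, 1.0, 0.0])
    pred = np.array([0.1, 0.7, 0.2])
    print(cross_entropy_reference(true, pred))  # ~0.357, i.e. -log(0.7 + 1e-7)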


Supported Frameworks:

JAX, TensorFlow, PyTorch, MXNet, NumPy