Activations

Collection of Ivy activation functions.

ivy.neural_net_functional.activations.leaky_relu(x, alpha=0.2, f=None)[source]

Applies the leaky rectified linear unit function element-wise.

Parameters
  • x (array) – Input array.

  • alpha (float) – Negative slope applied to inputs below zero. Default is 0.2.

  • f (ml_framework, optional) – Machine learning framework. Inferred from inputs if None.

Returns

The input array with leaky ReLU applied element-wise.
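The documented call is ivy.neural_net_functional.activations.leaky_relu(x, alpha=0.2). Its element-wise behaviour can be sketched in plain NumPy (a stand-in for whichever backend framework Ivy dispatches to; the `leaky_relu` helper below is illustrative, not Ivy's implementation):

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    # Pass positive values through unchanged; scale negative values by alpha.
    return np.where(x > 0, x, alpha * x)

x = np.array([-1.0, 0.0, 2.0])
print(leaky_relu(x))  # -> [-0.2  0.   2. ]
```

Unlike plain ReLU, negative inputs keep a small gradient (alpha), which helps avoid "dead" units during training.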

ivy.neural_net_functional.activations.relu(x, f=None)[source]

Applies the rectified linear unit function element-wise.

Parameters
  • x (array) – Input array.

  • f (ml_framework, optional) – Machine learning framework. Inferred from inputs if None.

Returns

The input array with ReLU applied element-wise.
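The documented call is ivy.neural_net_functional.activations.relu(x). The element-wise rule is max(x, 0), sketched here in NumPy (a stand-in for the inferred backend, not Ivy's own code):

```python
import numpy as np

def relu(x):
    # Clamp all negative values to zero; leave non-negative values unchanged.
    return np.maximum(x, 0.0)

x = np.array([-1.0, 0.0, 2.0])
print(relu(x))  # -> [0. 0. 2.]
```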

ivy.neural_net_functional.activations.sigmoid(x, f=None)[source]

Applies the sigmoid function element-wise.

Parameters
  • x (array) – Input array.

  • f (ml_framework, optional) – Machine learning framework. Inferred from inputs if None.

Returns

The input array with sigmoid applied element-wise.
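The documented call is ivy.neural_net_functional.activations.sigmoid(x). The element-wise rule is 1 / (1 + e^(-x)), sketched in NumPy (illustrative only; Ivy dispatches to the backend framework's own implementation):

```python
import numpy as np

def sigmoid(x):
    # Squash each value into the open interval (0, 1); sigmoid(0) == 0.5.
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([0.0])))  # -> [0.5]
```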

ivy.neural_net_functional.activations.softmax(x, f=None)[source]

Applies the softmax function to the input, normalising values so they sum to one along the last dimension.

Parameters
  • x (array) – Input array.

  • f (ml_framework, optional) – Machine learning framework. Inferred from inputs if None.

Returns

The input array with softmax applied, with values summing to one along the normalised dimension.
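The documented call is ivy.neural_net_functional.activations.softmax(x). Note that softmax is not a purely element-wise map: each output depends on every value along the normalised axis. A numerically stable NumPy sketch (assuming normalisation over the last axis; Ivy's actual backend implementation may differ):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the per-axis max before exponentiating for numerical stability;
    # this shift does not change the result.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

print(softmax(np.array([1.0, 2.0, 3.0])))
```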

ivy.neural_net_functional.activations.softplus(x, f=None)[source]

Applies the softplus function element-wise.

Parameters
  • x (array) – Input array.

  • f (ml_framework, optional) – Machine learning framework. Inferred from inputs if None.

Returns

The input array with softplus applied element-wise.
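The documented call is ivy.neural_net_functional.activations.softplus(x). The element-wise rule is log(1 + e^x), a smooth approximation of ReLU, sketched in NumPy (illustrative, not Ivy's implementation):

```python
import numpy as np

def softplus(x):
    # Smooth, always-positive approximation of ReLU; softplus(0) == ln(2).
    return np.log1p(np.exp(x))

print(softplus(np.array([0.0])))
```

Using np.log1p(e^x) rather than np.log(1 + e^x) keeps precision when e^x is small.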

ivy.neural_net_functional.activations.tanh(x, f=None)[source]

Applies the hyperbolic tangent function element-wise.

Parameters
  • x (array) – Input array.

  • f (ml_framework, optional) – Machine learning framework. Inferred from inputs if None.

Returns

The input array with tanh applied element-wise.
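The documented call is ivy.neural_net_functional.activations.tanh(x). The element-wise rule is tanh(x) = (e^x - e^-x) / (e^x + e^-x), which maps inputs into (-1, 1). A NumPy sketch (illustrative; NumPy exposes the function directly):

```python
import numpy as np

def tanh(x):
    # Zero-centred squashing into (-1, 1); tanh(0) == 0.
    return np.tanh(x)

print(tanh(np.array([-1.0, 0.0, 1.0])))
```

Unlike sigmoid, tanh is zero-centred, which is often preferred for hidden-layer activations.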