gelu

ivy.neural_net_functional.activations.gelu(x, approximate=True, f=None)

Applies the Gaussian error linear unit (GELU) activation function.
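
For reference, the exact GELU is GELU(x) = x * Φ(x), where Φ is the cumulative distribution function of the standard normal distribution. When approximate is True, a faster closed-form approximation is typically used instead, such as the tanh-based one from Hendrycks & Gimpel (2016):

    GELU(x) ≈ 0.5 * x * (1 + tanh(sqrt(2/π) * (x + 0.044715 * x^3)))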

Parameters
  • x (array) – Input array.

  • approximate (bool, optional) – Whether to use a faster approximate formulation of GELU instead of the exact one. Default is True.

  • f (ml_framework, optional) – Machine learning framework. Inferred from inputs if None.

Returns

The input array with GELU applied element-wise.


Supported Frameworks:

JAX, TensorFlow, PyTorch, MXNet, NumPy
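

Example

A minimal usage sketch, assuming Ivy is installed with the NumPy backend available and that gelu is also re-exported at the top level as ivy.gelu (a common Ivy convention, not confirmed by the signature above):

    import numpy as np
    import ivy

    x = np.array([-1.0, 0.0, 1.0], dtype=np.float32)

    # f is left as None, so the framework is inferred from the NumPy input.
    # ivy.gelu as a top-level alias is an assumption; the fully qualified
    # path documented above can be used instead.
    y = ivy.gelu(x, approximate=True)

    # For comparison: the tanh approximation computed directly in NumPy.
    y_ref = 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x ** 3)))
    print(y)
    print(y_ref)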