LayerNorm

class ivy.neural_net_stateful.norms.LayerNorm(normalized_shape, epsilon=None, elementwise_affine=True, new_std=None, dev_str='cpu', v=None)[source]

Bases: ivy.neural_net_stateful.module.Module

__init__(normalized_shape, epsilon=None, elementwise_affine=True, new_std=None, dev_str='cpu', v=None)[source]

Class for applying Layer Normalization over a mini-batch of inputs.

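As a sketch of the computation (exactly how epsilon and new_std enter is inferred from the parameter descriptions below, not stated explicitly here):

    y = new_std * (x - mean(x)) / (std(x) + epsilon) * scale + offset

where the mean and standard deviation are taken over the trailing normalized_shape dimensions, and scale and offset are the learnable affine parameters created when elementwise_affine is True.
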
Parameters
  • normalized_shape (int or sequence of ints) – Trailing shape to apply the normalization to.

  • epsilon (float, optional) – Small constant added to the denominator for numerical stability; uses the global ivy._MIN_BASE by default.

  • elementwise_affine (bool, optional) – Whether to include learnable affine parameters, default is True.

  • new_std (float, optional) – The standard deviation of the newly normalized values. Default is 1.

  • dev_str (str, optional) – Device on which to create the layer’s variables, e.g. ‘cuda:0’, ‘cuda:1’ or ‘cpu’. Default is ‘cpu’.

  • v (ivy container of variables, optional) – The variables for the layer, constructed internally by default.


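Example:

A minimal usage sketch, assuming a backend framework has been selected first (the exact form of the ivy.set_framework call may vary between Ivy versions, and the sample data is illustrative):

    import ivy

    # Select a backend framework (assumed call; adjust to your Ivy version).
    ivy.set_framework('torch')

    # Normalize over a trailing dimension of size 4.
    layer = ivy.neural_net_stateful.norms.LayerNorm(4)

    x = ivy.array([[1., 2., 3., 4.],
                   [5., 6., 7., 8.]])

    # Calling the module applies the normalization; the output has the same
    # shape as x, with each row normalized over the last axis.
    y = layer(x)
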
Supported Frameworks:

JAX, TensorFlow, PyTorch, MXNet, NumPy