class ivy.neural_net_stateful.optimizers.Optimizer(lr, compile_step=False, inplace=True, stop_gradients=True)[source]

Bases: abc.ABC

__init__(lr, compile_step=False, inplace=True, stop_gradients=True)[source]

Construct a general Optimizer. This is an abstract class and must be derived from.

  • lr (function or float) – Learning rate.

  • compile_step (bool, optional) – Whether to compile the optimizer step, default is False.

  • inplace (bool, optional) – Whether to update the variables in-place, or to create new variable handles. This is only relevant for frameworks with stateful variables such as PyTorch. Default is True.

  • stop_gradients (bool, optional) – Whether to stop the gradients of the variables after each gradient step. Default is True.
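Because the class is abstract, a concrete optimizer must subclass it and supply the update rule. The sketch below mirrors the documented constructor signature without importing ivy itself; plain Python dicts stand in for Ivy containers, and the private `_step_fn` hook is an assumption about how derived classes plug in their update logic.

```python
from abc import ABC, abstractmethod


class Optimizer(ABC):
    """Sketch of the documented interface; dicts stand in for Ivy containers."""

    def __init__(self, lr, compile_step=False, inplace=True, stop_gradients=True):
        self._lr = lr
        self._compile_step = compile_step
        self._inplace = inplace
        self._stop_gradients = stop_gradients

    @abstractmethod
    def _step_fn(self, v, grads):
        """Return updated variables given variables v and gradients grads."""


class SGD(Optimizer):
    """Minimal derived optimizer: plain gradient descent."""

    def _step_fn(self, v, grads):
        # subtract lr-scaled gradient from each variable leaf
        return {k: v[k] - self._lr * grads[k] for k in v}
```

With `lr=0.1`, updating `{'w': 1.0}` against gradient `{'w': 0.5}` yields `{'w': 0.95}`.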

abstract set_state(state)[source]

Set state of the optimizer.


state (Ivy container of state tensors) – Nested state to update.
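For a stateful optimizer (for example one carrying momentum buffers), `set_state` replaces the nested state wholesale. A minimal sketch of what that implies, again with plain dicts standing in for Ivy containers (the class and attribute names here are illustrative, not ivy's):

```python
class MomentumSGD:
    """Sketch only: a stateful optimizer whose state can be swapped out."""

    def __init__(self, lr=0.1, beta=0.9):
        self._lr = lr
        self._beta = beta
        self._momentum = {}  # nested per-variable momentum buffers

    def set_state(self, state):
        # replace the entire nested optimizer state
        self._momentum = state

    def state(self):
        return self._momentum
```

This is useful, for example, when restoring a checkpointed training run: the saved state container is loaded and handed back to the optimizer in one call.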

step(v, grads, ignore_missing=False)[source]

Update the nested variables container v via the private self._step_fn, which may be compiled and is overridden by derived classes.

  • v (Ivy container of variables) – Nested variables to update.

  • grads (Ivy container of gradients) – Nested gradients with which to update the variables.

  • ignore_missing (bool, optional) – Whether to ignore keys missing from the gradients which exist in the variables. Default is False.


The updated variables, following the update step.
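The `ignore_missing` flag matters when only a subset of variables received gradients: matched leaves are updated, while variables absent from the gradient container are left unchanged instead of raising. A sketch of that behaviour with nested dicts in place of Ivy containers (a hypothetical helper, not ivy's implementation):

```python
def step(v, grads, lr=0.1, ignore_missing=False):
    """Recursively update nested variables v with nested gradients grads."""
    updated = {}
    for key, val in v.items():
        if key not in grads:
            if ignore_missing:
                updated[key] = val  # no gradient: keep the variable as-is
                continue
            raise KeyError(f"missing gradient for variable '{key}'")
        g = grads[key]
        if isinstance(val, dict):
            # recurse into nested sub-containers
            updated[key] = step(val, g, lr, ignore_missing)
        else:
            updated[key] = val - lr * g  # simple gradient-descent leaf update
    return updated
```

For example, with variables `{'layer': {'w': 1.0, 'b': 0.5}}` and gradients only for `w`, passing `ignore_missing=True` updates `w` and leaves `b` untouched.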

Supported Frameworks:

JAX, TensorFlow, PyTorch, MXNet, NumPy