SGD

class ivy.neural_net_stateful.optimizers.SGD(lr=1e-4, inplace=True, stop_gradients=True)[source]

Bases: ivy.neural_net_stateful.optimizers.Optimizer

__init__(lr=1e-4, inplace=True, stop_gradients=True)[source]

Construct a Stochastic-Gradient-Descent (SGD) optimizer.
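
At each step, the optimizer applies the standard stochastic gradient descent update, where lr is the learning rate documented below:

    \theta_{t+1} = \theta_t - \mathrm{lr} \cdot \nabla_\theta \mathcal{L}(\theta_t)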

Parameters
  • lr (float, optional) – Learning rate, default is 1e-4.

  • inplace (bool, optional) – Whether to update the variables in-place, or to create new variable handles. This is only relevant for frameworks with stateful variables such as PyTorch. Default is True.

  • stop_gradients (bool, optional) – Whether to stop the gradients of the variables after each gradient step. Default is True.
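A minimal usage sketch follows. The container layout, variable creation via ivy.variable, and the step(v, grads) call are assumptions drawn from the Optimizer base class's typical interface, not guarantees of this exact signature:

    import ivy
    from ivy.neural_net_stateful.optimizers import SGD

    # Construct the optimizer (lr, inplace and stop_gradients as documented above).
    optimizer = SGD(lr=1e-3, inplace=False, stop_gradients=True)

    # Trainable variables and their gradients, as nested Ivy containers.
    v = ivy.Container({'w': ivy.variable(ivy.array([1., 2., 3.]))})
    grads = ivy.Container({'w': ivy.array([0.1, 0.2, 0.3])})

    # One SGD update; with inplace=False, new variable handles are returned.
    v = optimizer.step(v, grads)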

set_state(state)[source]

Set the state of the optimizer.

Parameters

state (Ivy container of state tensors) – Nested state container to set as the optimizer's new state.
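
A short sketch of snapshotting and restoring optimizer state, assuming the state property returns the same nested container that set_state accepts (an assumption; vanilla SGD may carry little or no state, so this matters mostly for stateful optimizers):

    from ivy.neural_net_stateful.optimizers import SGD

    optimizer = SGD(lr=1e-3)
    snapshot = optimizer.state      # capture the current state container
    # ... training steps may modify the state in the meantime ...
    optimizer.set_state(snapshot)   # restore the earlier snapshot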

property state

The current state of the optimizer, as a nested Ivy container of state tensors (the same structure accepted by set_state).

Supported Frameworks:

JAX, TensorFlow, PyTorch, MXNet, NumPy