lars_update

ivy.lars_update(ws, dcdws, lr, decay_lambda=0, inplace=True, stop_gradients=True)[source]

Update the weights ws of some function, given the derivatives of some cost c with respect to ws, [dc/dw for w in ws], by applying the Layerwise Adaptive Rate Scaling (LARS) method.

Parameters
  • ws (Ivy container) – Weights of the function to be updated.

  • dcdws (Ivy container) – Derivatives of the cost c with respect to the weights ws, [dc/dw for w in ws].

  • lr (float) – Learning rate, the rate at which the weights should be updated relative to the gradient.

  • decay_lambda (float) – The factor used for weight decay. Default is zero.

  • inplace (bool, optional) – Whether to perform the operation inplace, for backends which support inplace variable updates and handle gradients behind the scenes (such as PyTorch). If the update step should form part of a computation graph (i.e. higher-order optimization), then this should be set to False. Default is True.

  • stop_gradients (bool, optional) – Whether to stop the gradients of the variables after each gradient step. Default is True.

Returns

The new function weights ws_new, following the LARS updates.

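Example

A minimal usage sketch, assuming an Ivy release whose API matches the signature above; the helpers used alongside it (ivy.set_framework, ivy.Container, ivy.variable, ivy.array) are taken from that same release and are illustrative only.

>>> import ivy
>>> ivy.set_framework('numpy')  # assumed backend selection; any supported backend works
>>> ws = ivy.Container({'w': ivy.variable(ivy.array([1., 2., 3.]))})
>>> dcdws = ivy.Container({'w': ivy.array([0.5, 0.2, 0.1])})
>>> # LARS rescales the learning rate per layer by ||w|| / (||dc/dw|| + decay_lambda * ||w||)
>>> # before applying the weight-decayed gradient step.
>>> ws_new = ivy.lars_update(ws, dcdws, lr=0.1, decay_lambda=0.01, inplace=False)

Setting inplace=False returns the updated container rather than mutating the variables, which is the appropriate choice when the update step must remain part of a computation graph.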

Supported Frameworks:

JAX, TensorFlow, PyTorch, MXNet, NumPy