gradient_descent_update

ivy.gradient_descent_update(ws, dcdws, lr, inplace=True, stop_gradients=True)

Update the weights ws of some function, given the derivatives of some cost c with respect to ws, [dc/dw for w in ws]. Each weight follows the standard gradient descent update rule w_new = w - lr * (dc/dw).

Parameters
  • ws (Ivy container) – Weights of the function to be updated.

  • dcdws (Ivy container) – Derivatives of the cost c with respect to the weights ws, [dc/dw for w in ws].

  • lr (float or container of layer-wise rates) – Learning rate(s), the rate(s) at which the weights should be updated relative to the gradient.

  • inplace (bool, optional) – Whether to perform the operation inplace, for backends which support inplace variable updates and handle gradients behind the scenes, such as PyTorch. If the update step should form part of a computation graph (e.g. for higher-order optimization), this should be set to False. Default is True.

  • stop_gradients (bool, optional) – Whether to stop the gradients of the variables after each gradient step. Default is True.

Returns

The new function weights ws_new, following the gradient descent updates.
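Example: a minimal usage sketch. The container keys and values below are illustrative, and the helper calls other than gradient_descent_update itself (ivy.set_framework, ivy.Container, ivy.variable, ivy.array) are assumed to be available in the same Ivy version as this function.

    import ivy

    ivy.set_framework('numpy')  # assumption: backend selection call; any supported backend works

    # illustrative weights and gradients; the container key 'w' is hypothetical
    ws = ivy.Container({'w': ivy.variable(ivy.array([1., 2., 3.]))})
    dcdws = ivy.Container({'w': ivy.array([0.5, 0.5, 0.5])})
    lr = 0.1

    # out-of-place update, so the step can form part of a computation graph
    ws_new = ivy.gradient_descent_update(ws, dcdws, lr, inplace=False)

    # each weight follows w_new = w - lr * dc/dw:
    # ws_new['w'] -> [0.95, 1.95, 2.95]

With inplace=True (the default), ws itself is updated on backends which support inplace variable updates, and no new container needs to be assigned.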


Supported Frameworks:

JAX, TensorFlow, PyTorch, MXNet, NumPy