Optimizers
apply_gradients
method
```python
Optimizer.apply_gradients(
    grads_and_vars, name=None, skip_gradients_aggregation=False, **kwargs
)
```
Apply gradients to variables.
Arguments
- grads_and_vars: List of `(gradient, variable)` pairs.
- name: string, defaults to None. The name of the name scope to use when creating variables. If None, `self.name` will be used.
- skip_gradients_aggregation: If True, gradient aggregation will not be performed inside the optimizer. This is usually set to True when you write custom code that aggregates gradients outside the optimizer.
- **kwargs: Keyword arguments only used for backward compatibility.
Returns
A `tf.Variable` representing the current iteration.
Raises
- TypeError: If `grads_and_vars` is malformed.
- RuntimeError: If called in a cross-replica context.
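A minimal sketch of the typical usage, assuming TensorFlow 2.x eager execution: compute gradients with `tf.GradientTape`, then pass `(gradient, variable)` pairs to `apply_gradients`. The toy variable, loss, and learning rate below are illustrative, not part of the API.

```python
import tensorflow as tf

# Illustrative setup: one trainable variable and a squared loss.
var = tf.Variable(2.0)
opt = tf.keras.optimizers.SGD(learning_rate=0.1)

with tf.GradientTape() as tape:
    loss = var ** 2  # d(loss)/d(var) = 2 * var = 4.0

grads = tape.gradient(loss, [var])

# apply_gradients takes an iterable of (gradient, variable) pairs.
iteration = opt.apply_gradients(zip(grads, [var]))

# SGD update: var <- var - lr * grad = 2.0 - 0.1 * 4.0 = 1.6
```

Each call performs one update step and advances the optimizer's iteration counter, which is what the returned `tf.Variable` tracks.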
variables
property
tf_keras.optimizers.Optimizer.variables
Returns variables of this optimizer.
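To make the property concrete, here is a small sketch, assuming TensorFlow 2.x: after one update step, `optimizer.variables` includes the iteration counter plus any slot variables the optimizer created (e.g. one momentum slot per trainable variable for SGD with momentum). The variable and hyperparameter values are illustrative.

```python
import tensorflow as tf

var = tf.Variable(1.0)
# momentum > 0 makes SGD allocate a momentum slot variable per trainable var.
opt = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)

with tf.GradientTape() as tape:
    loss = var * var
opt.apply_gradients(zip(tape.gradient(loss, [var]), [var]))

# The optimizer's own state: iteration counter + momentum slot(s).
for v in opt.variables:
    print(v.name, v.shape)
```

This is useful for checkpointing or inspecting optimizer state separately from model weights.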