Optimizers
[class] AbstractOptimizer
`AbstractOptimizer` is the base class for all optimizers. One `AbstractOptimizer` corresponds to one tensor created with `:requires-grad=T`. Use `(optimizer-param optimizer)` to obtain the corresponding tensor.
[generic] step-optimizer
A generic function that applies a single update step to the parameter held by the given optimizer. Each subclass of `AbstractOptimizer` specializes this generic with its own update rule.
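To make the relationship between `AbstractOptimizer`, `optimizer-param`, and `step-optimizer` concrete, here is a minimal sketch of a hand-written optimizer that does nothing but report which parameter it owns. It assumes `AbstractOptimizer` can be subclassed directly with CLOS and that `step-optimizer` takes the optimizer as its only argument; package qualifiers are omitted. These details are assumptions for illustration, not part of this section.

```lisp
;; Hypothetical sketch: a no-op optimizer.
;; Assumes STEP-OPTIMIZER takes the optimizer as its single argument.
(defclass null-optimizer (AbstractOptimizer) ())

(defmethod step-optimizer ((opt null-optimizer))
  ;; OPTIMIZER-PARAM returns the tensor this optimizer is responsible for.
  ;; A real optimizer would read that tensor's gradient and update it here.
  (format t "step on parameter: ~a~%" (optimizer-param opt)))
```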
[function] hook-optimizers
This function hooks an optimizer onto each of the parameters recognised in `avm-params-to-optimize`. `hooker` is a function that takes one argument, a tensor with `:requires-grad=T`, and returns an `AbstractOptimizer`.
A list of the created optimizers is returned.
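As a usage sketch (not taken from this section): assuming `hook-optimizers` receives the compiled model as its first argument and the `hooker` as its second, and assuming `SGD` can be constructed with `make-instance` and receives its parameter tensor through a hypothetical `:param` initarg, hooking SGD onto every trainable parameter might look like this. Only `:lr` is documented above; the rest of the call shape is an assumption.

```lisp
(defun hook-sgd (compiled-model)
  ;; Hypothetical call shape: the first argument to HOOK-OPTIMIZERS and
  ;; the :PARAM initarg are assumptions; only :LR is documented above.
  (hook-optimizers compiled-model
                   (lambda (param)
                     ;; PARAM is a tensor with :requires-grad=T.
                     (make-instance 'SGD :param param :lr 1e-3))))
```

The returned list of optimizers is what a training loop later iterates over.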
[function] zero-grad
Fills the gradient of the optimizer's parameter with zeros.
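Putting the pieces together, one training step typically clears the gradients, runs the forward and backward pass (library-specific, omitted here), and then calls `step-optimizer` on every optimizer. A minimal sketch, assuming `zero-grad` and `step-optimizer` each take the optimizer as their only argument:

```lisp
(defun train-step (optimizers)
  ;; Clear stale gradients before accumulating new ones.
  (dolist (opt optimizers)
    (zero-grad opt))
  ;; ... forward and backward pass go here (library-specific) ...
  ;; Apply one update to every hooked parameter.
  (dolist (opt optimizers)
    (step-optimizer opt)))
```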
[class] SGD

Implements the SGD optimizer:

```math
Param_{new} \gets Param - lr \times \nabla Param
```

where the initarg `:lr` is the learning rate.
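As a concrete (hypothetical) instance of the rule above: with `:lr` set to 0.1, a parameter value of 1.0 and a gradient of 0.5 are updated to 1.0 - 0.1 × 0.5 = 0.95 in a single step.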