Learning rate schedules API
ExponentialDecay
PiecewiseConstantDecay
PolynomialDecay
InverseTimeDecay
CosineDecay
CosineDecayRestarts
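
Each of these schedules is a callable object that maps the current optimizer step to a learning rate, and it can be passed directly as the `learning_rate` argument of a Keras 2 optimizer. Below is a minimal usage sketch, assuming the `tf.keras.optimizers.schedules` namespace from TensorFlow's Keras 2 implementation; the specific hyperparameter values are illustrative only:

```python
import tensorflow as tf

# ExponentialDecay computes:
#   initial_learning_rate * decay_rate ** (step / decay_steps)
# so with decay_rate=0.5 the learning rate halves every 10,000 steps.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-2,
    decay_steps=10_000,
    decay_rate=0.5,
)

# The schedule object is passed in place of a fixed learning rate.
optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule)
```

The same pattern applies to the other schedules listed above, with each class taking its own decay parameters (for example, `CosineDecayRestarts` is configured with `first_decay_steps` rather than `decay_steps`).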