Learning rate schedules API
ExponentialDecay
PiecewiseConstantDecay
PolynomialDecay
InverseTimeDecay
CosineDecay
CosineDecayRestarts
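Each schedule above maps the current training step to a learning rate, and an instance of one can be passed to an optimizer in place of a fixed value. As a minimal sketch, the formula behind ExponentialDecay can be written as a standalone function (this helper is illustrative, not part of the Keras API):

```python
def exponential_decay(initial_lr, decay_rate, decay_steps, step, staircase=False):
    """Learning rate at `step` under exponential decay.

    Mirrors the ExponentialDecay formula:
        lr = initial_lr * decay_rate ** (step / decay_steps)
    With staircase=True the exponent is floored, so the rate drops in
    discrete jumps every `decay_steps` steps instead of continuously.
    """
    exponent = step / decay_steps
    if staircase:
        exponent = step // decay_steps
    return initial_lr * decay_rate ** exponent

# Continuous decay halving every 1000 steps:
print(exponential_decay(0.1, 0.5, 1000, 1000))  # 0.05
# Staircase mode holds the rate constant within each interval:
print(exponential_decay(0.1, 0.5, 1000, 500, staircase=True))  # 0.1
```

The other schedules follow the same pattern with different step-to-rate functions; for example, InverseTimeDecay divides the initial rate by `1 + decay_rate * step / decay_steps` rather than multiplying by a power of `decay_rate`.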