Learning rate schedules API
LearningRateSchedule
ExponentialDecay
PiecewiseConstantDecay
PolynomialDecay
InverseTimeDecay
CosineDecay
CosineDecayRestarts
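
Any of these schedule objects can be passed as the learning_rate argument of a Keras optimizer, which then queries the schedule at each optimization step. The snippet below is a minimal sketch of that pattern using ExponentialDecay; the hyperparameter values are illustrative, not recommendations.

```python
import keras

# ExponentialDecay multiplies the learning rate by `decay_rate`
# every `decay_steps` optimizer steps (values here are illustrative).
lr_schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1,
    decay_steps=10_000,
    decay_rate=0.96,
)

# A schedule object is accepted anywhere a fixed learning rate would be.
optimizer = keras.optimizers.SGD(learning_rate=lr_schedule)
```

LearningRateSchedule is the base class for custom schedules: a subclass implements __call__(self, step), which maps the current step to a learning rate value.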