Learning rate schedules API
ExponentialDecay
PiecewiseConstantDecay
PolynomialDecay
InverseTimeDecay
CosineDecay
CosineDecayRestarts
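Each schedule maps a training step to a learning rate. As an illustration, the formula behind `ExponentialDecay` can be sketched in plain Python; this is a minimal sketch of the documented decay rule (initial_learning_rate * decay_rate ** (step / decay_steps)), not the Keras class itself, and the default argument values here are illustrative:

```python
def exponential_decay(step, initial_lr=0.1, decay_steps=1000,
                      decay_rate=0.96, staircase=False):
    """Sketch of the rule keras.optimizers.schedules.ExponentialDecay applies."""
    exponent = step / decay_steps
    if staircase:
        # With staircase=True the decay happens in discrete intervals.
        exponent = step // decay_steps
    return initial_lr * decay_rate ** exponent

# At step 0 the schedule returns the initial learning rate;
# after decay_steps steps it has been multiplied by decay_rate once.
print(exponential_decay(0))
print(exponential_decay(1000))
```

In actual use, an instance of one of these schedule classes is passed directly as the `learning_rate` argument of an optimizer (e.g. `keras.optimizers.SGD(learning_rate=schedule)`), and Keras evaluates it at each step.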