Activation layers
ReLU layer
Softmax layer
LeakyReLU layer
PReLU layer
ELU layer
ThresholdedReLU layer
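Each of the layers above applies a fixed (or, for PReLU, learnable) elementwise activation function. As a rough sketch of the math these layers compute, here are plain-Python versions using the documented Keras default parameters (`alpha=0.3` for LeakyReLU, `alpha=1.0` for ELU, `theta=1.0` for ThresholdedReLU); this is an illustration of the formulas, not the Keras implementation itself, which operates on tensors and, for PReLU, trains its slope.

```python
import math

def relu(x, max_value=None, threshold=0.0):
    # ReLU: pass x through if above threshold, else 0; optionally cap at max_value.
    y = x if x > threshold else 0.0
    return min(y, max_value) if max_value is not None else y

def leaky_relu(x, alpha=0.3):
    # LeakyReLU: small non-zero slope alpha for negative inputs.
    return x if x >= 0 else alpha * x

def elu(x, alpha=1.0):
    # ELU: smooth exponential saturation toward -alpha for negative inputs.
    return x if x >= 0 else alpha * (math.exp(x) - 1.0)

def thresholded_relu(x, theta=1.0):
    # ThresholdedReLU: zero out anything at or below theta.
    return x if x > theta else 0.0

def softmax(xs):
    # Softmax over a list: exponentiate (shifted for stability) and normalize.
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    s = sum(exps)
    return [e / s for e in exps]

print(relu(-2.0), relu(5.0, max_value=3.0))  # → 0.0 3.0
print(thresholded_relu(0.5))                  # → 0.0
```

PReLU computes the same function as LeakyReLU, except that `alpha` is a trainable weight (per-channel by default) rather than a fixed hyperparameter.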