Activation layers
ReLU layer
Softmax layer
LeakyReLU layer
PReLU layer
ELU layer
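The layers listed above each apply an elementwise nonlinearity to their input. As a rough sketch of what they compute, here are NumPy versions of the underlying functions, assuming the usual Keras defaults (LeakyReLU with negative_slope 0.3, ELU with alpha 1.0, Softmax over the last axis); PReLU behaves like LeakyReLU except the negative slope is a learned parameter, so it is omitted here:

```python
import numpy as np

def relu(x):
    # ReLU: zero out negative values, pass positives through
    return np.maximum(x, 0.0)

def leaky_relu(x, negative_slope=0.3):
    # LeakyReLU: scale negative values by a small fixed slope
    return np.where(x >= 0, x, negative_slope * x)

def elu(x, alpha=1.0):
    # ELU: smooth exponential curve for negative values
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def softmax(x, axis=-1):
    # Softmax: exponentiate and normalize so outputs sum to 1;
    # subtracting the max first keeps the exponentials numerically stable
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(relu(x))           # negatives become 0.0
print(leaky_relu(x))     # negatives scaled by 0.3
print(elu(x))            # negatives mapped to (-1, 0)
print(softmax(x).sum())  # probabilities summing to 1
```

In Keras itself these are used as layers, e.g. `keras.layers.ReLU()` or `keras.layers.LeakyReLU(negative_slope=0.3)`, inserted into a model like any other layer.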