
Activation layers

  • ReLU layer
  • Softmax layer
  • LeakyReLU layer
  • PReLU layer
  • ELU layer
  • ThresholdedReLU layer
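Each of the layers above applies an element-wise (or, for Softmax, axis-wise) activation. As an illustrative sketch only, the underlying formulas can be written in plain Python; the defaults below mirror the Keras layer defaults (`alpha=0.3` for LeakyReLU, `alpha=1.0` for ELU, `theta=1.0` for ThresholdedReLU), but these functions are not the Keras implementations, which operate on tensors and, in PReLU's case, learn `alpha` during training:

```python
import math

def relu(x, negative_slope=0.0, max_value=None, threshold=0.0):
    # Keras ReLU: values below `threshold` are scaled by `negative_slope`,
    # and the output is optionally capped at `max_value`.
    y = x if x >= threshold else negative_slope * (x - threshold)
    if max_value is not None:
        y = min(y, max_value)
    return y

def leaky_relu(x, alpha=0.3):
    # LeakyReLU: small non-zero slope for negative inputs.
    # PReLU has the same form, but `alpha` is a learned parameter.
    return x if x > 0 else alpha * x

def elu(x, alpha=1.0):
    # ELU: smooth exponential saturation for negative inputs.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def thresholded_relu(x, theta=1.0):
    # ThresholdedReLU: zero below the threshold `theta`, identity above.
    return x if x > theta else 0.0

def softmax(xs):
    # Softmax over a list: exponentiate and normalize to sum to 1.
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

For example, `relu(-2.0)` gives `0.0`, `leaky_relu(-2.0)` gives `-0.6`, and `softmax([1.0, 2.0, 3.0])` returns three probabilities summing to 1.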