Usage of activations
Activations can either be used through an
Activation layer, or through the
activation argument supported by all forward layers:
```python
from keras.layers import Activation, Dense

model.add(Dense(64))
model.add(Activation('tanh'))
```
This is equivalent to:

```python
model.add(Dense(64, activation='tanh'))
```

You can also pass an element-wise TensorFlow/Theano function as an activation:

```python
from keras import backend as K

model.add(Dense(64, activation=K.tanh))
model.add(Activation(K.tanh))
```
softmax(x, axis=-1)

Softmax activation function.

Arguments

- x: Input tensor.
- axis: Integer, axis along which the softmax normalization is applied.

Returns

Tensor, output of the softmax transformation.

Raises

- ValueError: In case dim(x) == 1.
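To make the axis semantics concrete, here is a minimal NumPy sketch of what softmax computes (an illustration, not the backend implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the per-axis max for numerical stability; the result is
    # unchanged because softmax is invariant to shifting its input.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    # Normalize so values along `axis` sum to 1.
    return e / np.sum(e, axis=axis, keepdims=True)

scores = np.array([[1.0, 2.0, 3.0]])
probs = softmax(scores)  # probabilities along the last axis
```

Each row of the output sums to 1, and the ordering of the inputs is preserved.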
relu(x, alpha=0.0, max_value=None)

Rectified Linear Unit. With the default arguments it returns element-wise max(x, 0); alpha sets the slope of the negative section, and max_value, if given, saturates the output from above.
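A NumPy sketch of the behavior described by these parameters (illustrative only, not the backend code):

```python
import numpy as np

def relu(x, alpha=0.0, max_value=None):
    # alpha is the slope applied to the negative section (leaky relu
    # when alpha > 0); max_value caps the output from above.
    y = np.where(x >= 0, x, alpha * x)
    if max_value is not None:
        y = np.minimum(y, max_value)
    return y
```

With alpha=0.0 and max_value=None this reduces to element-wise max(x, 0).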
On "Advanced Activations"
Activations that are more complex than a simple TensorFlow/Theano function (e.g. learnable activations, which maintain a state) are available as Advanced Activation layers, and can be found in the module
keras.layers.advanced_activations. These include PReLU and LeakyReLU.
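To show why such activations must be layers rather than plain functions, here is a minimal PReLU-like sketch (hypothetical class name, not the Keras implementation): the negative slope is per-unit trainable state carried by the object.

```python
import numpy as np

class PReLULike:
    """Sketch of a learnable activation: like relu, but the negative
    slope `alpha` is a trainable per-unit parameter, i.e. state that a
    stateless activation function cannot hold."""

    def __init__(self, units, init_alpha=0.25):
        # One slope per unit; in a real layer this would be updated
        # by the optimizer during training.
        self.alpha = np.full(units, init_alpha)

    def __call__(self, x):
        return np.where(x >= 0, x, self.alpha * x)

act = PReLULike(3)
out = act(np.array([-1.0, 0.0, 2.0]))
```

A plain `activation=` function has no parameters to train, which is why these live in their own module as layers.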