
ThresholdedReLU layer


ThresholdedReLU class

tf_keras.layers.ThresholdedReLU(theta=1.0, **kwargs)

Thresholded Rectified Linear Unit.

It follows:

    f(x) = x for x > theta
    f(x) = 0 otherwise
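
A minimal sketch of this elementwise behavior, assuming the tf_keras package shown in the signature above is importable (NumPy is used only to build a test input):

    import numpy as np
    import tf_keras

    # theta=1.0 is the default; only values strictly greater than theta pass through.
    layer = tf_keras.layers.ThresholdedReLU(theta=1.0)
    x = np.array([[-2.0, 0.5, 1.0, 3.0]], dtype="float32")
    print(layer(x).numpy())  # [[0. 0. 0. 3.]]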

Input shape

Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape

Same shape as the input.
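
A sketch of the shape behavior, under the same tf_keras assumption: the layer is placed first in a Sequential model via the input_shape keyword, and the output shape matches the input shape.

    import tf_keras

    # input_shape excludes the samples (batch) axis.
    model = tf_keras.Sequential([
        tf_keras.layers.ThresholdedReLU(theta=1.0, input_shape=(4,)),
    ])
    model.summary()  # output shape is (None, 4), same as the input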

Arguments

  • theta: Float >= 0. Threshold location of activation.
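
As an illustrative sketch of how theta shifts the threshold (same imports as above): with theta=0.0 the layer behaves like a standard ReLU, while a larger theta also zeroes out small positive values.

    import numpy as np
    import tf_keras

    x = np.array([[0.3, 0.7, 2.0]], dtype="float32")
    print(tf_keras.layers.ThresholdedReLU(theta=0.0)(x).numpy())  # [[0.3 0.7 2. ]]
    print(tf_keras.layers.ThresholdedReLU(theta=0.5)(x).numpy())  # [[0.  0.7 2. ]]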