ELU layer

ELU class

tf_keras.layers.ELU(alpha=1.0, **kwargs)

Exponential Linear Unit.

It follows:

    f(x) = alpha * (exp(x) - 1)  for x < 0
    f(x) = x                     for x >= 0
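
A minimal numerical check of this definition (a sketch; it assumes the tf_keras package is installed and eager execution is enabled, so the result can be converted with .numpy()):

    import numpy as np
    import tf_keras

    # ELU layer with the default alpha=1.0.
    layer = tf_keras.layers.ELU(alpha=1.0)

    x = np.array([[-3.0, -1.0, 0.0, 2.0]], dtype="float32")
    y = layer(x).numpy()

    # Negative inputs follow alpha * (exp(x) - 1); non-negative inputs pass through.
    expected = np.where(x < 0, 1.0 * (np.exp(x) - 1.0), x)
    print(np.allclose(y, expected))  # True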

Input shape

Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape

Same shape as the input.
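
For example, a sketch of ELU used as the first layer of a Sequential model (assuming the tf_keras package is installed); the batch axis is reported as None and the remaining dimensions are unchanged:

    import tf_keras

    # input_shape excludes the samples (batch) axis.
    model = tf_keras.Sequential([
        tf_keras.layers.ELU(alpha=1.0, input_shape=(4,)),
    ])

    # Output shape equals the input shape: (None, 4).
    print(model.output_shape)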

Arguments

  • alpha: Scale for the negative part of the activation (applied to inputs x < 0). Defaults to 1.0 (see the sketch below).
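
A short sketch of how alpha changes the negative branch (again assuming tf_keras is installed); outputs approach -alpha for large negative inputs:

    import numpy as np
    import tf_keras

    x = np.array([[-5.0, -1.0, 2.0]], dtype="float32")

    # Larger alpha scales the negative branch; positive inputs are unaffected.
    for alpha in (0.5, 1.0, 2.0):
        print(alpha, tf_keras.layers.ELU(alpha=alpha)(x).numpy())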