tf.keras.layers.PReLU(alpha_initializer="zeros", alpha_regularizer=None, alpha_constraint=None, shared_axes=None, **kwargs)
Parametric Rectified Linear Unit.
f(x) = alpha * x  for x < 0
f(x) = x          for x >= 0
alpha is a learned array with the same shape as x.
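The piecewise formula above can be sketched in plain NumPy. The helper `prelu` below is a hypothetical stand-in for the layer's forward pass, with `alpha` fixed rather than learned:

```python
import numpy as np

def prelu(x, alpha):
    # f(x) = alpha * x for x < 0; f(x) = x for x >= 0
    return np.where(x >= 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
# In the layer, alpha is a learned array with the same shape as x;
# here it is fixed at 0.25 purely for illustration.
alpha = np.full_like(x, 0.25)
print(prelu(x, alpha))  # [-0.5   -0.125  0.     1.     3.   ]
```

Negative inputs are scaled by `alpha` while non-negative inputs pass through unchanged.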
Input shape:
Arbitrary. Use the keyword argument input_shape (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.
Output shape:
Same shape as the input.
shared_axes: The axes along which to share learnable parameters for the activation function. For example, if the incoming feature maps are from a 2D convolution with output shape (batch, height, width, channels), and you wish to share parameters across space so that each filter only has one set of parameters, set shared_axes=[1, 2].
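Sharing parameters across the spatial axes means alpha collapses to one value per channel and is broadcast over height and width. A minimal NumPy sketch of that broadcasting (the function name `prelu_shared` and the sample shapes are illustrative, not part of the API):

```python
import numpy as np

def prelu_shared(x, alpha):
    # alpha has shape (channels,) and broadcasts over
    # (batch, height, width, channels): parameters are shared
    # across the spatial axes 1 and 2, one per filter.
    return np.where(x >= 0, x, alpha * x)

x = np.random.randn(2, 4, 4, 3)   # (batch, height, width, channels)
alpha = np.full(3, 0.1)           # one learnable value per channel
y = prelu_shared(x, alpha)
print(y.shape)  # (2, 4, 4, 3)
```

This mirrors what shared_axes=[1, 2] does inside the layer: the weight array has size 3 instead of 4 * 4 * 3, while the output shape is unchanged.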