PReLU class

keras.layers.PReLU(
    alpha_initializer="Zeros",
    alpha_regularizer=None,
    alpha_constraint=None,
    shared_axes=None,
    **kwargs
)
Parametric Rectified Linear Unit activation layer.
Formula:
f(x) = alpha * x for x < 0
f(x) = x for x >= 0
where alpha is a learned array with the same shape as x.
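Because alpha is initialized with zeros by default, a freshly created PReLU layer initially behaves like ReLU; the slopes for negative inputs are then learned during training. A minimal sketch (the input values are arbitrary):

import numpy as np
from keras import layers

prelu = layers.PReLU()
x = np.array([[-2.0, -1.0, 0.0, 1.0, 2.0]], dtype="float32")
y = prelu(x)  # alpha starts at 0, so negative inputs map to 0 initially
print(y)      # [[0. 0. 0. 1. 2.]]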
Arguments

- alpha_initializer: Initializer function for the weights.
- alpha_regularizer: Regularizer for the weights.
- alpha_constraint: Constraint for the weights.
- shared_axes: The axes along which to share learnable parameters for the
  activation function. For example, if the incoming feature maps are from a
  2D convolution with output shape (batch, height, width, channels), and you
  wish to share parameters across space so that each filter only has one set
  of parameters, set shared_axes=[1, 2]. See the sketch after this list.
- **kwargs: Base layer keyword arguments, such as name and dtype.
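A hedged sketch of the shared_axes behavior, assuming a toy 32x32 RGB input: with shared_axes=[1, 2], alpha has shape (1, 1, channels) rather than (height, width, channels), i.e. one learned slope per filter.

from keras import layers, models

model = models.Sequential([
    layers.Input(shape=(32, 32, 3)),   # assumed toy input shape
    layers.Conv2D(16, 3, padding="same"),
    layers.PReLU(shared_axes=[1, 2]),  # one alpha per channel: 16 parameters
])
model.summary()  # the PReLU layer contributes 16 trainable parameters

Without shared_axes, the same PReLU layer would instead learn 32 * 32 * 16 separate alpha values, one per activation.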