AlphaDropout
keras.layers.AlphaDropout(rate, noise_shape=None, seed=None, **kwargs)
Applies Alpha Dropout to the input.
Alpha Dropout is a form of Dropout that keeps the mean and variance of its
inputs at their original values, in order to preserve the self-normalizing
property even after dropout is applied.
Alpha Dropout pairs well with Scaled Exponential Linear Units (SELU) because
it randomly sets activations to the negative saturation value.
Arguments

- rate: Float between 0 and 1. The multiplicative noise will have standard deviation sqrt(rate / (1 - rate)).
- noise_shape: 1D integer tensor representing the shape of the binary alpha dropout mask that will be multiplied with the input. For instance, if your inputs have shape (batch_size, timesteps, features) and you want the alpha dropout mask to be the same for all timesteps, you can use noise_shape=(batch_size, 1, features).
- seed: A Python integer to use as random seed.

Call arguments

- inputs: Input tensor (of any rank).
- training: Python boolean indicating whether the layer should behave in training mode (applying alpha dropout) or in inference mode (doing nothing).
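To make the mean/variance-preserving behavior concrete, here is a minimal NumPy sketch of the alpha-dropout transform, assuming the standard SELU constants (alpha ≈ 1.6733, scale ≈ 1.0507) from the "Self-Normalizing Neural Networks" paper; `alpha_dropout` is a hypothetical helper, not the Keras layer itself:

```python
import numpy as np

def alpha_dropout(x, rate, rng):
    # Standard SELU constants (assumed; see the SELU paper).
    alpha = 1.6732632423543772
    scale = 1.0507009873554805
    alpha_p = -alpha * scale  # negative saturation value of SELU

    # Drop units to alpha_p (not to 0) with probability `rate`.
    keep = rng.random(x.shape) >= rate
    x = np.where(keep, x, alpha_p)

    # Affine correction a*x + b so mean and variance are unchanged
    # in expectation for standardized inputs.
    a = ((1 - rate) * (1 + rate * alpha_p ** 2)) ** -0.5
    b = -a * alpha_p * rate
    return a * x + b

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
y = alpha_dropout(x, rate=0.2, rng=rng)
print(y.mean(), y.var())  # both remain close to 0 and 1
```

Unlike plain dropout, which zeroes activations and would shift the activation statistics away from the (0, 1) fixed point that SELU relies on, this transform keeps the post-dropout mean near 0 and variance near 1.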