keras.layers.Dropout(rate, noise_shape=None, seed=None, **kwargs)
Applies dropout to the input.
The Dropout layer randomly sets input units to 0 with a frequency of
rate at each step during training time, which helps prevent overfitting.
Inputs not set to 0 are scaled up by
1 / (1 - rate) such that the sum over
all inputs is unchanged.
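The scaling rule above can be sketched in plain Python. This is a minimal illustration of inverted dropout under the stated rule, not the Keras implementation:

```python
import random

def inverted_dropout(xs, rate, rng):
    # Zero each unit with probability `rate`; scale survivors by
    # 1 / (1 - rate) so the expected sum over all inputs is unchanged.
    scale = 1.0 / (1.0 - rate)
    return [x * scale if rng.random() >= rate else 0.0 for x in xs]

ys = inverted_dropout([1.0] * 1000, rate=0.25, rng=random.Random(0))
# Each surviving unit is 1 / (1 - 0.25) ~ 1.333; the rest are 0,
# so sum(ys) stays close to 1000 in expectation.
```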
Note that the Dropout layer only applies when training is set to True
in call(), such that no values are dropped during inference. When using
model.fit, training will be appropriately set to True
automatically. In other contexts, you can set the argument explicitly
to True when calling the layer.
(This is in contrast to setting trainable=False for a Dropout layer.
trainable does not affect the layer's behavior, as Dropout does
not have any variables/weights that can be frozen during training.)
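The training-flag behavior can be sketched with a hypothetical pure-Python stand-in (DropoutSketch is an illustrative name, not a Keras class): values are dropped only when training=True, and inference is the identity.

```python
import random

class DropoutSketch:
    # Hypothetical analogue of the training-flag dispatch: drop and
    # rescale only when training=True; otherwise pass inputs through.
    def __init__(self, rate, seed=None):
        self.rate = rate
        self.rng = random.Random(seed)

    def __call__(self, xs, training=False):
        if not training:
            return list(xs)  # inference: no values dropped
        scale = 1.0 / (1.0 - self.rate)
        return [x * scale if self.rng.random() >= self.rate else 0.0
                for x in xs]

layer = DropoutSketch(rate=0.5, seed=0)
inference_out = layer([1.0, 2.0, 3.0])                 # unchanged
training_out = layer([1.0, 2.0, 3.0], training=True)   # dropped/rescaled
```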
The noise_shape argument sets the shape of the binary dropout mask that
will be multiplied with the input. For instance, if your inputs have shape
(batch_size, timesteps, features) and you want the dropout mask to be the
same for all timesteps, you can use noise_shape=(batch_size, 1, features).
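That shared-mask behavior can be sketched as follows. This is a pure-Python analogue of noise_shape=(batch_size, 1, features) under the assumption of nested-list inputs, not the Keras code: one keep/drop decision is made per (sample, feature) pair and reused for every timestep.

```python
import random

def dropout_shared_mask(x, rate, rng):
    # x: nested lists of shape (batch, timesteps, features).
    # One mask per (sample, feature) pair, broadcast across timesteps,
    # mirroring noise_shape=(batch_size, 1, features).
    scale = 1.0 / (1.0 - rate)
    out = []
    for sample in x:
        features = len(sample[0])
        keep = [rng.random() >= rate for _ in range(features)]
        out.append([[v * scale if keep[j] else 0.0
                     for j, v in enumerate(step)]
                    for step in sample])
    return out

x = [[[1.0] * 4 for _ in range(3)] for _ in range(2)]
y = dropout_shared_mask(x, rate=0.5, rng=random.Random(1))
# Within each sample, every timestep has the same zero pattern.
```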