TimeDistributed class

tf_keras.layers.TimeDistributed(layer, **kwargs)
This wrapper allows you to apply a layer to every temporal slice of an input.
Every input should be at least 3D, and the dimension at index one of the first input will be considered to be the temporal dimension.
Consider a batch of 32 video samples, where each sample is a 128x128 RGB image with channels_last data format, across 10 timesteps. The batch input shape is (32, 10, 128, 128, 3).
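For example, a random batch with this layout could be sketched as follows (the data and variable name are purely illustrative):

>>> video_batch = tf.random.uniform((32, 10, 128, 128, 3))
>>> video_batch.shape
TensorShape([32, 10, 128, 128, 3])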
You can then use TimeDistributed to apply the same Conv2D layer to each of the 10 timesteps, independently:
>>> inputs = tf.keras.Input(shape=(10, 128, 128, 3))
>>> conv_2d_layer = tf.keras.layers.Conv2D(64, (3, 3))
>>> outputs = tf.keras.layers.TimeDistributed(conv_2d_layer)(inputs)
>>> outputs.shape
TensorShape([None, 10, 126, 126, 64])
Because TimeDistributed applies the same instance of Conv2D to each of the timesteps, the same set of weights is used at each timestep.
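You can check this sharing directly; a minimal sketch (the variable name td_layer is just for this example):

>>> td_layer = tf.keras.layers.TimeDistributed(tf.keras.layers.Conv2D(64, (3, 3)))
>>> _ = td_layer(tf.keras.Input(shape=(10, 128, 128, 3)))  # build the layer
>>> len(td_layer.trainable_weights)  # one kernel and one bias, shared by all 10 timesteps
2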
Arguments

- layer: a tf.keras.layers.Layer instance.

Call arguments

- inputs: Input tensor of shape (batch, time, ...), or nested tensors, each of which has shape (batch, time, ...).
- training: Python boolean indicating whether the layer should behave in training mode or in inference mode. This argument is passed to the wrapped layer (only if the layer supports this argument).
- mask: Binary tensor of shape (samples, timesteps) indicating whether a given timestep should be masked. This argument is passed to the wrapped layer (only if the layer supports this argument).

Raises

- ValueError: If not initialized with a tf.keras.layers.Layer instance.
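As a small sketch of the error case, wrapping an arbitrary non-Layer callable (here tf.square, purely for illustration) fails at construction time:

>>> try:
...     tf.keras.layers.TimeDistributed(tf.square)
... except ValueError:
...     print("TimeDistributed must wrap a Layer instance")
TimeDistributed must wrap a Layer instance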