TimeDistributed layer

TimeDistributed class

tf.keras.layers.TimeDistributed(layer, **kwargs)

This wrapper allows you to apply a layer to every temporal slice of an input.

The input should be at least 3D, and the dimension at index one (the second axis) is treated as the temporal dimension.
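
In the simplest 3D case, you can wrap a Dense layer so it is applied to every timestep of a (batch, timesteps, features) input. This is a minimal sketch with arbitrary sizes; for Dense specifically, the result matches applying the layer directly to the 3D input, since Dense already operates on the last axis:

>>> inputs = tf.keras.Input(shape=(10, 16))
>>> outputs = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(8))(inputs)
>>> outputs.shape
TensorShape([None, 10, 8])

The wrapper becomes essential for layers such as Conv2D, as in the video example that follows.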

Consider a batch of 32 video samples, where each sample is a sequence of 10 frames, each a 128x128 RGB image in channels_last data format. The batch input shape is (32, 10, 128, 128, 3).

You can then use TimeDistributed to apply a Conv2D layer to each of the 10 timesteps, independently:

>>> inputs = tf.keras.Input(shape=(10, 128, 128, 3))
>>> conv_2d_layer = tf.keras.layers.Conv2D(64, (3, 3))
>>> outputs = tf.keras.layers.TimeDistributed(conv_2d_layer)(inputs)
>>> outputs.shape
TensorShape([None, 10, 126, 126, 64])
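
Conceptually, for inputs with a static shape, TimeDistributed folds the time axis into the batch axis, applies the wrapped layer once, and unfolds the result. The following sketch (illustrative batch size, not the layer's actual implementation) makes that equivalence explicit:

>>> x = tf.random.normal((4, 10, 128, 128, 3))
>>> conv = tf.keras.layers.Conv2D(64, (3, 3))
>>> y_td = tf.keras.layers.TimeDistributed(conv)(x)
>>> # Fold time into the batch axis, apply the layer, unfold again:
>>> y_manual = conv(tf.reshape(x, (40, 128, 128, 3)))
>>> y_manual = tf.reshape(y_manual, (4, 10, 126, 126, 64))
>>> bool(tf.reduce_all(tf.abs(y_td - y_manual) < 1e-5))
True

Because the same Conv2D instance (and hence the same weights) is applied at every timestep, the number of parameters does not grow with the sequence length.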

Arguments

  • layer: a tf.keras.layers.Layer instance.

Call arguments

  • inputs: Input tensor.
  • training: Python boolean indicating whether the layer should behave in training mode or in inference mode. This argument is passed to the wrapped layer (only if the layer supports this argument).
  • mask: Binary tensor of shape (samples, timesteps) indicating whether a given timestep should be masked. This argument is passed to the wrapped layer (only if the layer supports this argument). Both call arguments are illustrated in the sketch after this list.
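
As a rough sketch (arbitrary shapes and rates, not from the official docs): training is forwarded to a wrapped Dropout layer, which is a no-op outside of training, while a mask is typically produced upstream, e.g. by a Masking layer, and flows through the wrapper:

>>> x = tf.ones((2, 5, 8))
>>> td_drop = tf.keras.layers.TimeDistributed(tf.keras.layers.Dropout(0.5))
>>> bool(tf.reduce_all(td_drop(x, training=False) == x))
True
>>> inputs = tf.keras.Input(shape=(5, 8))
>>> masked = tf.keras.layers.Masking(mask_value=0.0)(inputs)
>>> outputs = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(16))(masked)
>>> outputs.shape
TensorShape([None, 5, 16])

Note that Dense itself does not accept a mask argument, so here the mask is not consumed by the wrapped layer; it simply propagates past it to any downstream layers that do support masking.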

Raises

  • ValueError: If not initialized with a tf.keras.layers.Layer instance.
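
For illustration, a minimal sketch of the failure mode (the printed message below is our own, not the layer's actual error text):

>>> try:
...     tf.keras.layers.TimeDistributed(lambda t: t + 1)
... except ValueError:
...     print("layer must be a Layer instance")
layer must be a Layer instance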