This wrapper applies a layer to every temporal slice of an input.
Inherits From: Wrapper, Layer, Operation
tf.keras.layers.TimeDistributed(
layer, **kwargs
)
Every input should be at least 3D, and the dimension of index one of the first input will be considered to be the temporal dimension.
Consider a batch of 32 video samples, where each sample is a 128x128 RGB image with channels_last data format, across 10 timesteps. The batch input shape is (32, 10, 128, 128, 3).
You can then use TimeDistributed to apply the same Conv2D layer to each of the 10 timesteps, independently:
inputs = layers.Input(shape=(10, 128, 128, 3), batch_size=32)
conv_2d_layer = layers.Conv2D(64, (3, 3))
outputs = layers.TimeDistributed(conv_2d_layer)(inputs)
outputs.shape
(32, 10, 126, 126, 64)
Because TimeDistributed applies the same instance of Conv2D to each of the timesteps, the same set of weights is used at each timestep.
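The weight sharing can be verified directly: a minimal sketch showing that the wrapper adds no parameters of its own, so the wrapped model's parameter count equals that of the single underlying Conv2D.

```python
import tensorflow as tf

# One Conv2D instance, wrapped so it is applied to each timestep.
conv = tf.keras.layers.Conv2D(64, (3, 3))
td = tf.keras.layers.TimeDistributed(conv)

inputs = tf.keras.Input(shape=(10, 128, 128, 3))
outputs = td(inputs)

# The temporal axis is preserved; spatial dims shrink by the 3x3 kernel.
print(outputs.shape)  # (None, 10, 126, 126, 64)

# Parameter count equals a single Conv2D: 3*3*3*64 kernel weights + 64 biases.
print(td.count_params())  # 1792
```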
Args | |
---|---|
layer | a keras.layers.Layer instance. |
Methods
from_config
@classmethod
from_config( config, custom_objects=None )
Creates a layer from its config.
This method is the reverse of get_config, capable of instantiating the same layer from the config dictionary. It does not handle layer connectivity (handled by Network), nor weights (handled by set_weights).
Args | |
---|---|
config | A Python dictionary, typically the output of get_config. |
Returns | |
---|---|
A layer instance. |
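A minimal sketch of the round trip: get_config captures the architecture (including the wrapped layer's config), and from_config rebuilds an equivalent layer from that dictionary. Note that weights are not restored, only the configuration.

```python
import tensorflow as tf

# Serialize a TimeDistributed layer to its config dictionary...
original = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(8))
config = original.get_config()

# ...and reconstruct an equivalent (unbuilt) layer from it.
clone = tf.keras.layers.TimeDistributed.from_config(config)
print(clone.layer.units)  # 8
```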
symbolic_call
symbolic_call(
    *args, **kwargs
)