TokenAndPositionEmbedding class

keras_hub.layers.TokenAndPositionEmbedding(
    vocabulary_size,
    sequence_length,
    embedding_dim,
    tie_weights=True,
    embeddings_initializer="uniform",
    mask_zero=False,
    **kwargs
)
A layer which sums a token and position embedding.
Token and position embeddings are ways of representing words and their order in a sentence. This layer creates a keras.layers.Embedding token embedding and a keras_hub.layers.PositionEmbedding position embedding and sums their output when called. This layer assumes that the last dimension in the input corresponds to the sequence dimension.
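As a rough intuition for what the layer computes, here is a minimal sketch that builds the two sub-layers by hand and sums their outputs; the vocabulary size, sequence length, and embedding width are illustrative assumptions.

import numpy as np
import keras
import keras_hub

# Illustrative sizes (assumptions for this sketch).
vocab_size, seq_length, embed_dim = 10_000, 50, 128
token_ids = np.random.randint(vocab_size, size=(1, seq_length))

# Embed each token id into a dense vector.
token_embedding = keras.layers.Embedding(vocab_size, embed_dim)
# Embed each position index; the feature size is inferred from its input.
position_embedding = keras_hub.layers.PositionEmbedding(sequence_length=seq_length)

embedded_tokens = token_embedding(token_ids)              # (1, 50, 128)
embedded_positions = position_embedding(embedded_tokens)  # (1, 50, 128)
outputs = embedded_tokens + embedded_positions            # summed embedding

TokenAndPositionEmbedding bundles this pattern into a single layer.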
Arguments

vocabulary_size: The size of the vocabulary.
sequence_length: The maximum length of the input sequence.
embedding_dim: The output dimension of the embedding layer.
tie_weights: Boolean, whether or not the matrix for the token embedding and the matrix used for the reverse projection should share the same weights (see the sketch after this list).
embeddings_initializer: The initializer to use for the embedding layers.
mask_zero: Boolean, whether or not the input value 0 is a special "padding" value that should be masked out.
**kwargs: other keyword arguments passed to keras.layers.Layer, including name, trainable, dtype etc.
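The "reverse projection" mentioned under tie_weights maps hidden states back to vocabulary logits with the (optionally shared) embedding matrix. A minimal sketch of that idea, assuming keras_hub.layers.ReversibleEmbedding (which exposes such a reverse projection directly) and illustrative sizes:

import numpy as np
import keras_hub

vocab_size, seq_length, hidden_dim = 10_000, 50, 128

# tie_weights=True reuses the embedding matrix for the reverse projection.
embedding = keras_hub.layers.ReversibleEmbedding(
    vocab_size, hidden_dim, tie_weights=True
)

token_ids = np.random.randint(vocab_size, size=(1, seq_length))
# Forward: token ids -> embeddings, shape (1, 50, 128).
hidden_states = embedding(token_ids)
# Reverse: embeddings -> vocabulary logits, shape (1, 50, 10_000).
logits = embedding(hidden_states, reverse=True)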
Example

import numpy as np
import keras_hub

# A batch of one sequence of 50 token ids.
inputs = np.ones(shape=(1, 50), dtype="int32")
embedding_layer = keras_hub.layers.TokenAndPositionEmbedding(
    vocabulary_size=10_000,
    sequence_length=50,
    embedding_dim=128,
)
# Summed token and position embeddings, shape (1, 50, 128).
outputs = embedding_layer(inputs)
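In practice the layer usually sits at the front of a model that consumes integer token ids. A small hypothetical sketch, where the pooling head and output size are illustrative assumptions rather than part of this API:

import keras
import keras_hub

# Hypothetical classifier; only the embedding layer comes from this page.
token_ids = keras.Input(shape=(50,), dtype="int32")
x = keras_hub.layers.TokenAndPositionEmbedding(
    vocabulary_size=10_000,
    sequence_length=50,
    embedding_dim=128,
)(token_ids)
x = keras.layers.GlobalAveragePooling1D()(x)  # pool over the sequence dimension
outputs = keras.layers.Dense(2, activation="softmax")(x)
model = keras.Model(token_ids, outputs)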