KerasHub Modeling Layers

KerasHub modeling layers are keras.layers.Layer implementations of building blocks common to pretrained models. They can be used to build a new model from scratch, or to extend a pretrained model.
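
As a minimal sketch of the first use (building a model from scratch), the layers listed on this page compose with the Keras functional API. The hyperparameters and the classification head below are illustrative choices, not part of this page:

```python
import keras
import keras_hub

vocab_size = 20_000   # illustrative vocabulary size
seq_length = 128      # illustrative sequence length

# Token ids in, contextualized embeddings out.
token_ids = keras.Input(shape=(seq_length,), dtype="int32")
x = keras_hub.layers.TokenAndPositionEmbedding(
    vocabulary_size=vocab_size,
    sequence_length=seq_length,
    embedding_dim=64,
)(token_ids)
x = keras_hub.layers.TransformerEncoder(
    intermediate_dim=256,
    num_heads=4,
)(x)

# A small classification head on top of the pooled sequence.
x = keras.layers.GlobalAveragePooling1D()(x)
outputs = keras.layers.Dense(2, activation="softmax")(x)
model = keras.Model(token_ids, outputs)
```

A sketch of the second use, extending a pretrained model, follows the layer list below.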

TransformerEncoder layer

  • TransformerEncoder class
  • call method

TransformerDecoder layer

  • TransformerDecoder class
  • call method

FNetEncoder layer

  • FNetEncoder class

PositionEmbedding layer

  • PositionEmbedding class

RotaryEmbedding layer

  • RotaryEmbedding class

SinePositionEncoding layer

  • SinePositionEncoding class

ReversibleEmbedding layer

  • ReversibleEmbedding class

TokenAndPositionEmbedding layer

  • TokenAndPositionEmbedding class

AlibiBias layer

  • AlibiBias class

MaskedLMHead layer

  • MaskedLMHead class

CachedMultiHeadAttention layer

  • CachedMultiHeadAttention class
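
For the second use (extending a pretrained model), a minimal sketch might stack one of the layers on this page on top of a pretrained backbone. The bert_tiny_en_uncased preset, the extra encoder block, and the classification head are illustrative choices, not part of this page:

```python
import keras
import keras_hub

# Load a pretrained encoder (preset name is illustrative; see the
# pretrained models list for available presets).
backbone = keras_hub.models.BertBackbone.from_preset("bert_tiny_en_uncased")

inputs = backbone.input
sequence = backbone(inputs)["sequence_output"]

# Extend the pretrained encoder with one additional Transformer block
# before attaching a new task head.
sequence = keras_hub.layers.TransformerEncoder(
    intermediate_dim=512,
    num_heads=4,
)(sequence)
outputs = keras.layers.Dense(2, activation="softmax")(sequence[:, 0, :])
model = keras.Model(inputs, outputs)
```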