
KerasNLP

KerasNLP is a toolbox of modular building blocks, ranging from pretrained state-of-the-art models to low-level Transformer encoder layers. For an introduction to the library, see the KerasNLP home page. For a high-level introduction to the API, see our getting started guide.

Models

  • Albert
  • Bert
  • DebertaV3
  • DistilBert
  • GPT2
  • FNet
  • OPT
  • Roberta
  • XLMRoberta

Tokenizers

  • Tokenizer base class
  • WordPieceTokenizer
  • SentencePieceTokenizer
  • BytePairTokenizer
  • ByteTokenizer
  • UnicodeCodepointTokenizer
  • compute_word_piece_vocabulary function
  • compute_sentence_piece_proto function

Preprocessing Layers

  • StartEndPacker layer
  • MultiSegmentPacker layer
  • RandomSwap layer
  • RandomDeletion layer
  • MaskedLMMaskGenerator layer

Modeling Layers

  • TransformerEncoder layer
  • TransformerDecoder layer
  • FNetEncoder layer
  • PositionEmbedding layer
  • SinePositionEncoding layer
  • TokenAndPositionEmbedding layer
  • MaskedLMHead layer
  • CachedMultiHeadAttention layer

Samplers

  • Sampler base class
  • BeamSampler
  • ContrastiveSampler
  • GreedySampler
  • RandomSampler
  • TopKSampler
  • TopPSampler

Metrics

  • Perplexity metric
  • RougeL metric
  • RougeN metric
  • Bleu metric
  • EditDistance metric