
BERT

Models, tokenizers, and preprocessing layers for BERT, as described in "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding".

For a full list of available presets, see the models page.

BertTokenizer

  • BertTokenizer class
  • from_preset method

BertPreprocessor layer

  • BertPreprocessor class
  • from_preset method
  • tokenizer property

BertBackbone model

  • BertBackbone class
  • from_preset method
  • token_embedding property

BertClassifier model

  • BertClassifier class
  • from_preset method
  • backbone property
  • preprocessor property

BertMaskedLM model

  • BertMaskedLM class
  • from_preset method
  • backbone property
  • preprocessor property

BertMaskedLMPreprocessor layer

  • BertMaskedLMPreprocessor class
  • from_preset method
  • tokenizer property