
DebertaV3MaskedLM model


DebertaV3MaskedLM class

keras_nlp.models.DebertaV3MaskedLM(backbone, preprocessor=None, **kwargs)

An end-to-end DeBERTaV3 model for the masked language modeling task.

This model will train DeBERTaV3 on a masked language modeling task. The model will predict labels for a number of masked tokens in the input data. For usage of this model with pre-trained weights, see the from_preset() method.

This model can optionally be configured with a preprocessor layer, in which case inputs can be raw string features during fit(), predict(), and evaluate(). Inputs will be tokenized and dynamically masked during training and evaluation. This is done by default when creating the model with from_preset().

Disclaimer: Pre-trained models are provided on an "as is" basis, without warranties or conditions of any kind. The underlying model is provided by a third party and subject to a separate license, available here.

Arguments

  • backbone: A keras_nlp.models.DebertaV3Backbone instance.
  • preprocessor: A keras_nlp.models.DebertaV3MaskedLMPreprocessor or None. If None, this model will not apply preprocessing, and inputs should be preprocessed before calling the model.

Example usage:

Raw string data.

import keras
import keras_nlp

features = ["The quick brown fox jumped.", "I forgot my homework."]

# Pretrained language model.
masked_lm = keras_nlp.models.DebertaV3MaskedLM.from_preset(
    "deberta_v3_base_en",
)
masked_lm.fit(x=features, batch_size=2)

# Re-compile (e.g., with a new learning rate).
masked_lm.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer=keras.optimizers.Adam(5e-5),
    jit_compile=True,
)
# Access backbone programmatically (e.g., to change `trainable`).
masked_lm.backbone.trainable = False
# Fit again.
masked_lm.fit(x=features, batch_size=2)
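
The attached preprocessor also runs at inference time, so predict() accepts raw strings directly. A minimal sketch, reusing masked_lm and features from above: the output is a batch of vocabulary logits, one row per dynamically selected mask position (how many positions are selected depends on the preprocessor's mask selection settings).

# Predict vocabulary logits at the dynamically chosen mask positions.
# Output shape: (batch_size, num_mask_positions, vocabulary_size).
preds = masked_lm.predict(features, batch_size=2)
print(preds.shape)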

Preprocessed integer data.

import numpy as np

# Create preprocessed batch where 0 is the mask token.
features = {
    "token_ids": np.array([[1, 2, 0, 4, 0, 6, 7, 8]] * 2),
    "padding_mask": np.array([[1, 1, 1, 1, 1, 1, 1, 1]] * 2),
    "mask_positions": np.array([[2, 4]] * 2),
}
# Labels are the original masked values.
labels = [[3, 5]] * 2

masked_lm = keras_nlp.models.DebertaV3MaskedLM.from_preset(
    "deberta_v3_base_en",
    preprocessor=None,
)
masked_lm.fit(x=features, y=labels, batch_size=2)
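
The task can also be constructed directly from its backbone and preprocessor arguments. Below is a minimal sketch using a small, randomly initialized backbone; the configuration values are hypothetical, chosen only to fit the toy features above, and preprocessor=None is passed because the inputs are already tokenized and masked.

# Randomly initialized DeBERTaV3 encoder with a hypothetical tiny config.
backbone = keras_nlp.models.DebertaV3Backbone(
    vocabulary_size=10,
    num_layers=2,
    num_heads=2,
    hidden_dim=32,
    intermediate_dim=64,
    max_sequence_length=8,
    bucket_size=4,
)
# Skip preprocessing, since `features` and `labels` are already prepared.
masked_lm = keras_nlp.models.DebertaV3MaskedLM(
    backbone,
    preprocessor=None,
)
masked_lm.fit(x=features, y=labels, batch_size=2)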


from_preset method

DebertaV3MaskedLM.from_preset()

Instantiate DebertaV3MaskedLM model from preset architecture and weights.

Arguments

  • preset: string. Must be one of "deberta_v3_extra_small_en", "deberta_v3_small_en", "deberta_v3_base_en", "deberta_v3_large_en", "deberta_v3_base_multi".
  • load_weights: Whether to load pre-trained weights into model. Defaults to True.

Examples

# Load architecture and weights from preset
model = DebertaV3MaskedLM.from_preset("deberta_v3_extra_small_en")

# Load randomly initialized model from preset architecture
model = DebertaV3MaskedLM.from_preset(
    "deberta_v3_extra_small_en",
    load_weights=False
)

Preset name                 Parameters   Description
deberta_v3_extra_small_en   70.68M       12-layer DeBERTaV3 model where case is maintained. Trained on English Wikipedia, BookCorpus and OpenWebText.
deberta_v3_small_en         141.30M      6-layer DeBERTaV3 model where case is maintained. Trained on English Wikipedia, BookCorpus and OpenWebText.
deberta_v3_base_en          183.83M      12-layer DeBERTaV3 model where case is maintained. Trained on English Wikipedia, BookCorpus and OpenWebText.
deberta_v3_large_en         434.01M      24-layer DeBERTaV3 model where case is maintained. Trained on English Wikipedia, BookCorpus and OpenWebText.
deberta_v3_base_multi       278.22M      12-layer DeBERTaV3 model where case is maintained. Trained on the 2.5TB multilingual CC100 dataset.

backbone property

keras_nlp.models.DebertaV3MaskedLM.backbone

A keras.Model instance providing the backbone sub-model.


preprocessor property

keras_nlp.models.DebertaV3MaskedLM.preprocessor

A keras.layers.Layer instance used to preprocess inputs.
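
To precompute preprocessing (for example, inside a tf.data pipeline), the attached preprocessor can be accessed through this property and applied manually, while a second task instance is created with preprocessor=None to consume the preprocessed tensors. A minimal sketch, reusing the pretrained masked_lm from the examples above; a masked LM preprocessor maps raw strings to an (x, y, sample_weight) tuple, where x contains the token IDs, padding mask and mask positions.

# Grab the attached DebertaV3MaskedLMPreprocessor and run it by hand.
preprocessor = masked_lm.preprocessor
x, y, sample_weight = preprocessor(["The quick brown fox jumped."])

# A matching task with no attached preprocessing consumes the tensors as-is.
masked_lm_no_preprocessing = keras_nlp.models.DebertaV3MaskedLM.from_preset(
    "deberta_v3_base_en",
    preprocessor=None,
)
masked_lm_no_preprocessing.fit(
    x=x, y=y, sample_weight=sample_weight, batch_size=1
)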