
Knowledge distillation

  • Distiller model
  • Base distillation loss
  • Logits distillation loss
  • Feature distillation loss
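To make the concepts behind these pages concrete: logits distillation trains a student network to match a teacher's temperature-softened output distribution (typically via KL divergence, scaled by T² to keep gradient magnitudes comparable), while feature distillation matches intermediate activations. Below is a minimal, dependency-free sketch of the logits distillation loss; the function names and the `temperature` default are illustrative, not the actual Keras API.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature yields a
    # softer (more uniform) probability distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def logits_distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # multiplied by T**2 so gradient magnitudes stay comparable
    # across temperature settings (as in Hinton et al.'s formulation).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2
```

In practice this distillation term is combined with the ordinary cross-entropy loss on the hard labels, weighted by a mixing coefficient; the Distiller model linked above wraps that combination into a trainable Keras model.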