
Whole model saving & loading

[source]

save method

Model.save(filepath, overwrite=True, save_format=None, **kwargs)

Saves a model as a TensorFlow SavedModel or HDF5 file.

See the Serialization and Saving guide for details.

Arguments

  • filepath: str or pathlib.Path object. Path where to save the model.
  • overwrite: Whether we should overwrite any existing model at the target location, or instead ask the user via an interactive prompt.
  • save_format: Either "keras", "tf", "h5", indicating whether to save the model in the native TF-Keras format (.keras), in the TensorFlow SavedModel format (referred to as "SavedModel" below), or in the legacy HDF5 format (.h5). Defaults to "tf" in TF 2.X, and "h5" in TF 1.X.

SavedModel format arguments:

  • include_optimizer: Only applied to SavedModel and legacy HDF5 formats. If False, do not save the optimizer state. Defaults to True.
  • signatures: Only applies to SavedModel format. Signatures to save with the SavedModel. See the signatures argument in tf.saved_model.save for details.
  • options: Only applies to SavedModel format. tf.saved_model.SaveOptions object that specifies SavedModel saving options.
  • save_traces: Only applies to SavedModel format. When enabled, the SavedModel will store the function traces for each layer. This can be disabled, so that only the configs of each layer are stored. Defaults to True. Disabling this will decrease serialization time and reduce file size, but it requires that all custom layers/models implement a get_config() method.
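Before the basic example below, here is a minimal sketch of how these keyword arguments can be passed; the file and directory names are hypothetical:

import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(5, input_shape=(3,))])
model.compile(optimizer="adam", loss="mse")

# Native TF-Keras format, inferred from the .keras extension.
model.save("my_model.keras")

# TensorFlow SavedModel format, skipping optimizer state and layer traces.
model.save(
    "my_model_dir",
    save_format="tf",
    include_optimizer=False,
    save_traces=False,
)

# Legacy HDF5 format.
model.save("my_model.h5", save_format="h5")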

Example

import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(5, input_shape=(3,)),
    tf.keras.layers.Softmax()])
model.save("model.keras")
loaded_model = tf.keras.models.load_model("model.keras")
x = tf.random.uniform((10, 3))
assert np.allclose(model.predict(x), loaded_model.predict(x))

Note that model.save() is an alias for tf.keras.models.save_model().


[source]

save_model function

tf_keras.saving.save_model(
    model, filepath, overwrite=True, save_format=None, **kwargs
)

Saves a model as a TensorFlow SavedModel or HDF5 file.

See the Serialization and Saving guide for details.

Arguments

  • model: TF-Keras model instance to be saved.
  • filepath: str or pathlib.Path object. Path where to save the model.
  • overwrite: Whether we should overwrite any existing model at the target location, or instead ask the user via an interactive prompt.
  • save_format: Either "keras", "tf", "h5", indicating whether to save the model in the native TF-Keras format (.keras), in the TensorFlow SavedModel format (referred to as "SavedModel" below), or in the legacy HDF5 format (.h5). Defaults to "tf" in TF 2.X, and "h5" in TF 1.X.

SavedModel format arguments:

  • include_optimizer: Only applied to SavedModel and legacy HDF5 formats. If False, do not save the optimizer state. Defaults to True.
  • signatures: Only applies to SavedModel format. Signatures to save with the SavedModel. See the signatures argument in tf.saved_model.save for details.
  • options: Only applies to SavedModel format. tf.saved_model.SaveOptions object that specifies SavedModel saving options.
  • save_traces: Only applies to SavedModel format. When enabled, the SavedModel will store the function traces for each layer. This can be disabled, so that only the configs of each layer are stored. Defaults to True. Disabling this will decrease serialization time and reduce file size, but it requires that all custom layers/models implement a get_config() method.
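As a hedged sketch, the same keyword arguments can also be passed through the functional form; the directory name and the experimental_io_device value below are illustrative only:

import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(5, input_shape=(3,))])

# Equivalent to model.save(...), with explicit SavedModel saving options.
tf.keras.models.save_model(
    model,
    "saved_model_dir",
    save_format="tf",
    options=tf.saved_model.SaveOptions(experimental_io_device="/job:localhost"),
)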

Example

import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(5, input_shape=(3,)),
    tf.keras.layers.Softmax()])
model.save("model.keras")
loaded_model = tf.keras.saving.load_model("model.keras")
x = tf.random.uniform((10, 3))
assert np.allclose(model.predict(x), loaded_model.predict(x))

Note that model.save() is an alias for tf.keras.saving.save_model().

The SavedModel or HDF5 file contains:

  • The model's configuration (architecture)
  • The model's weights
  • The model's optimizer's state (if any)

Thus models can be reinstantiated in the exact same state, without any of the code used for model definition or training.
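For example, a compiled model saved with its optimizer state can resume training after loading. A minimal sketch, assuming the native .keras format and a hypothetical file name:

import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(3,))])
model.compile(optimizer="adam", loss="mse")

x, y = np.random.rand(32, 3), np.random.rand(32, 1)
model.fit(x, y, epochs=1, verbose=0)
model.save("trained_model.keras")

# The loaded model is already compiled and carries the optimizer state,
# so training can continue where it left off.
restored = tf.keras.models.load_model("trained_model.keras")
restored.fit(x, y, epochs=1, verbose=0)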

Note that the model weights may have different scoped names after being loaded. Scoped names include the model/layer names, such as "dense_1/kernel:0". It is recommended that you use the layer properties to access specific variables, e.g. model.get_layer("dense_1").kernel.

SavedModel serialization format

With save_format="tf", the model and all trackable objects attached to it (e.g. layers and variables) are saved as a TensorFlow SavedModel. The model config, weights, and optimizer are included in the SavedModel. Additionally, for every TF-Keras layer attached to the model, the SavedModel stores:

  • The config and metadata – e.g. name, dtype, trainable status
  • Traced call and loss functions, which are stored as TensorFlow subgraphs.

The traced functions allow the SavedModel format to save and load custom layers without the original class definition.

You can choose to not save the traced functions by disabling the save_traces option. This will decrease the time it takes to save the model and the amount of disk space occupied by the output SavedModel. If you disable this option, then you must provide all custom class definitions when loading the model. See the custom_objects argument in tf.keras.saving.load_model.
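As a minimal sketch of this workflow (the layer class, directory name, and units value are hypothetical), a custom layer that implements get_config() can be saved without traces and then loaded by passing the class via custom_objects:

import tensorflow as tf

class MyDense(tf.keras.layers.Layer):
    # Hypothetical custom layer used only to illustrate save_traces.
    def __init__(self, units=5, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.dense = tf.keras.layers.Dense(units)

    def call(self, inputs):
        return self.dense(inputs)

    def get_config(self):
        # Required when the traced functions are not saved.
        config = super().get_config()
        config.update({"units": self.units})
        return config

model = tf.keras.Sequential([tf.keras.Input(shape=(3,)), MyDense(5)])

# Skip the traced functions; only the layer configs are stored.
model.save("custom_model_dir", save_format="tf", save_traces=False)

# Without traces, the class definition must be supplied at load time.
restored = tf.keras.models.load_model(
    "custom_model_dir", custom_objects={"MyDense": MyDense}
)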


[source]

load_model function

tf_keras.saving.load_model(
    filepath, custom_objects=None, compile=True, safe_mode=True, **kwargs
)

Loads a model saved via model.save().

Arguments

  • filepath: str or pathlib.Path object, path to the saved model file.
  • custom_objects: Optional dictionary mapping names (strings) to custom classes or functions to be considered during deserialization.
  • compile: Boolean, whether to compile the model after loading.
  • safe_mode: Boolean, whether to disallow unsafe lambda deserialization. When safe_mode=False, loading an object has the potential to trigger arbitrary code execution. This argument is only applicable to the TF-Keras v3 model format. Defaults to True.

SavedModel format arguments:

  • options: Only applies to SavedModel format. Optional tf.saved_model.LoadOptions object that specifies SavedModel loading options.

Returns

A TF-Keras model instance. If the original model was compiled, and the argument compile=True is set, then the returned model will be compiled. Otherwise, the model will be left uncompiled.

Example

import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(5, input_shape=(3,)),
    tf.keras.layers.Softmax()])
model.save("model.keras")
loaded_model = tf.keras.saving.load_model("model.keras")
x = tf.random.uniform((10, 3))
assert np.allclose(model.predict(x), loaded_model.predict(x))
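The safe_mode argument matters when the saved model contains Python lambdas, e.g. in a Lambda layer. A minimal sketch, assuming the native .keras format and a hypothetical file name:

import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(5, input_shape=(3,)),
    tf.keras.layers.Lambda(lambda t: t * 2),
])
model.save("lambda_model.keras")

# safe_mode=True (the default) refuses to deserialize the lambda;
# safe_mode=False allows it -- only use this for files you trust.
restored = tf.keras.saving.load_model("lambda_model.keras", safe_mode=False)

x = tf.random.uniform((10, 3))
assert np.allclose(model.predict(x), restored.predict(x))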

Note that the model variables may have different name values (var.name property, e.g. "dense_1/kernel:0") after being reloaded. It is recommended that you use layer attributes to access specific variables, e.g. model.get_layer("dense_1").kernel.