tf.compat.v1.keras.experimental.export_saved_model
Exports a tf.keras.Model as a TensorFlow SavedModel.
tf.compat.v1.keras.experimental.export_saved_model(
model, saved_model_path, custom_objects=None, as_text=False,
input_signature=None, serving_only=False
)
Note that at this time, subclassed models can only be saved using serving_only=True.
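For example, a subclassed model can be exported in serving-only mode roughly as follows. This is a minimal sketch: the MyModel class, the /tmp path, and the [None, 10] input shape are illustrative assumptions, not part of the API.

import tensorflow as tf

# Illustrative subclassed model; any tf.keras.Model subclass works the same way.
class MyModel(tf.keras.Model):
  def __init__(self):
    super(MyModel, self).__init__()
    self.dense = tf.keras.layers.Dense(1)

  def call(self, inputs):
    return self.dense(inputs)

model = MyModel()
model(tf.zeros([1, 10]))  # Build the model by calling it on sample data.

# Subclassed models must be exported with serving_only=True; input_signature
# (assumed here to be a single [None, 10] float tensor) describes the serving inputs.
tf.compat.v1.keras.experimental.export_saved_model(
    model, '/tmp/subclassed_keras_model',
    input_signature=[tf.TensorSpec(shape=[None, 10], dtype=tf.float32)],
    serving_only=True)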
The exported SavedModel is a standalone serialization of TensorFlow objects, and is supported by TF language APIs and the TensorFlow Serving system. To load the model, use the function tf.keras.experimental.load_from_saved_model.
The SavedModel contains:

1. a checkpoint containing the model weights.
2. a SavedModel proto containing the TensorFlow backend graph. Separate graphs are saved for prediction (serving), training, and evaluation. If the model has not been compiled, then only the graph computing predictions will be exported (a sketch of this follows the list).
3. the model's JSON config. If the model is subclassed, this will only be included if the model's get_config() method is overridden.
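As a rough sketch of the compilation point above (the optimizer, loss, and output path are arbitrary choices for illustration):

import tensorflow as tf

# Compiling before export makes the training and evaluation graphs available;
# without compile(), only the prediction (serving) graph is saved.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=[10])])
model.compile(optimizer='sgd', loss='mse')

tf.compat.v1.keras.experimental.export_saved_model(
    model, '/tmp/compiled_keras_model')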
Example:
import tensorflow as tf
# Create a tf.keras model.
model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(1, input_shape=[10]))
model.summary()
# Save the tf.keras model in the SavedModel format.
path = '/tmp/simple_keras_model'
tf.keras.experimental.export_saved_model(model, path)
# Load the saved keras model back.
new_model = tf.keras.experimental.load_from_saved_model(path)
new_model.summary()
| Args | |
|---|---|
| model | A tf.keras.Model to be saved. If the model is subclassed, the flag serving_only must be set to True. |
| saved_model_path | A string specifying the path to the SavedModel directory. |
| custom_objects | Optional dictionary mapping string names to custom classes or functions (e.g. custom loss functions); see the example below. |
| as_text | bool, False by default. Whether to write the SavedModel proto in text format. Currently unavailable in serving-only mode. |
| input_signature | A possibly nested sequence of tf.TensorSpec objects, used to specify the expected model inputs. See tf.function for more details. |
| serving_only | bool, False by default. When this is True, only the prediction graph is saved. |
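For example, custom_objects can be used to round-trip a model compiled with a custom loss. This is a hedged sketch: my_loss and the output path are made-up names for illustration.

import tensorflow as tf

# Hypothetical custom loss; any custom class or function is handled the same way.
def my_loss(y_true, y_pred):
  return tf.reduce_mean(tf.square(y_true - y_pred))

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=[10])])
model.compile(optimizer='sgd', loss=my_loss)

path = '/tmp/custom_loss_keras_model'
tf.compat.v1.keras.experimental.export_saved_model(
    model, path, custom_objects={'my_loss': my_loss})

# Pass the same mapping when loading so the custom loss can be resolved.
restored = tf.compat.v1.keras.experimental.load_from_saved_model(
    path, custom_objects={'my_loss': my_loss})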
| Raises | |
|---|---|
| NotImplementedError | If the model is subclassed and serving_only is False. |
| ValueError | If the input signature cannot be inferred from the model. |
| AssertionError | If the SavedModel directory already exists and isn't empty. |
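Because the export asserts that the target directory is empty or absent, callers may want to check the path first. A purely illustrative guard (the path name is an assumption):

import os
import tensorflow as tf

path = '/tmp/simple_keras_model'
# Avoid the AssertionError by making sure the target directory is empty or missing.
if os.path.isdir(path) and os.listdir(path):
  raise RuntimeError('Choose an empty directory for the SavedModel export.')

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=[10])])
tf.compat.v1.keras.experimental.export_saved_model(model, path)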