Pipeline using model.fit to train a ranking tf.keras.Model.
Inherits From: AbstractPipeline
tfr.keras.pipeline.ModelFitPipeline(
    model_builder: tfr.keras.model.AbstractModelBuilder,
    dataset_builder: tfr.keras.pipeline.AbstractDatasetBuilder,
    hparams: tfr.keras.pipeline.PipelineHparams
)
The ModelFitPipeline class is an abstract class that inherits from AbstractPipeline to train and validate a ranking model with model.fit under the distributed strategy specified in hparams.
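For instance, a minimal sketch of selecting the strategy through the hparams (an illustration only; it assumes PipelineHparams exposes a strategy field that accepts a strategy name such as "MirroredStrategy", and the other fields mirror the train_and_validate example further below):

import tensorflow_ranking as tfr

# Sketch only: "strategy" is assumed here to accept a tf.distribute strategy
# name; the remaining fields are taken from the example later on this page.
hparams = tfr.keras.pipeline.PipelineHparams(
    model_dir="model/",
    num_epochs=2,
    steps_per_epoch=5,
    validation_steps=2,
    learning_rate=0.01,
    loss="softmax_loss",
    strategy="MirroredStrategy")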
To be implemented by subclasses:

* build_loss(): Contains the logic to build a tf.keras.losses.Loss or a dict or list of tf.keras.losses.Loss objects to be optimized in training.
* build_metrics(): Contains the logic to build a list or dict of tf.keras.metrics.Metric objects to monitor and evaluate the training.
* build_weighted_metrics(): Contains the logic to build a list or dict of tf.keras.metrics.Metric objects which will take the weights.
Example subclass implementation:
class BasicModelFitPipeline(ModelFitPipeline):

  def build_loss(self):
    return tfr.keras.losses.get('softmax_loss')

  def build_metrics(self):
    return [
        tfr.keras.metrics.get(
            'ndcg', topn=topn, name='ndcg_{}'.format(topn))
        for topn in [1, 5, 10]
    ]

  def build_weighted_metrics(self):
    return [
        tfr.keras.metrics.get(
            'ndcg', topn=topn, name='weighted_ndcg_{}'.format(topn))
        for topn in [1, 5, 10]
    ]
Methods
build_callbacks
build_callbacks() -> List[tf.keras.callbacks.Callback]
Sets up Callbacks.
Example usage:
model_builder = ModelBuilder(...)
dataset_builder = DatasetBuilder(...)
hparams = PipelineHparams(...)
pipeline = BasicModelFitPipeline(model_builder, dataset_builder, hparams)
callbacks = pipeline.build_callbacks()
Returns | |
---|---|
A list of tf.keras.callbacks.Callback or a tf.keras.callbacks.CallbackList for tensorboard and checkpoint. |
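These are the callbacks the pipeline itself passes to model.fit; they can also be reused in a hand-written fit call (a sketch only, assuming a compiled model and a train_dataset tf.data.Dataset already exist):

# Sketch: reuse the pipeline's callbacks in a manual model.fit call.
# `model` and `train_dataset` are assumed to be built elsewhere.
callbacks = pipeline.build_callbacks()
model.fit(train_dataset, epochs=2, callbacks=callbacks)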
build_loss
@abc.abstractmethod
build_loss() -> Any
Returns the loss for model.compile.
Example usage:
pipeline = BasicPipeline(model, train_data, valid_data)
loss = pipeline.build_loss()
Returns | |
---|---|
A tf.keras.losses.Loss or a dict or list of tf.keras.losses.Loss objects. |
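Because model.compile also accepts a dict of losses keyed by output name, a multi-task subclass could return something like the sketch below (the "click" and "purchase" output names are hypothetical, not part of this API):

# Sketch of a dict-valued build_loss, written as it would appear inside a
# ModelFitPipeline subclass with two named model outputs.
def build_loss(self):
  return {
      "click": tfr.keras.losses.get("softmax_loss"),
      "purchase": tfr.keras.losses.get("mean_squared_loss"),
  }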
build_metrics
@abc.abstractmethod
build_metrics() -> Any
Returns a list of ranking metrics for model.compile().
Example usage:
pipeline = BasicPipeline(model, train_data, valid_data)
metrics = pipeline.build_metrics()
Returns | |
---|---|
A list or a dict of tf.keras.metrics.Metric objects. |
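A subclass is not limited to a single metric family; for example, MRR and several NDCG cutoffs can go into one list (a sketch, assuming 'mrr' is a valid key for tfr.keras.metrics.get):

# Sketch: combine MRR with NDCG@1/5/10 in one metrics list, as it would
# appear inside a ModelFitPipeline subclass.
def build_metrics(self):
  return [tfr.keras.metrics.get("mrr", name="mrr")] + [
      tfr.keras.metrics.get("ndcg", topn=topn, name="ndcg_{}".format(topn))
      for topn in [1, 5, 10]
  ]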
build_weighted_metrics
@abc.abstractmethod
build_weighted_metrics() -> Any
Returns a list of weighted ranking metrics for model.compile.
Example usage:
pipeline = BasicPipeline(model, train_data, valid_data)
weighted_metrics = pipeline.build_weighted_metrics()
Returns | |
---|---|
A list or a dict of tf.keras.metrics.Metric objects. |
export_saved_model
export_saved_model(
    model: tf.keras.Model,
    export_to: str,
    checkpoint: Optional[tf.train.Checkpoint] = None
)
Exports the trained model with signatures.
Example usage:
model_builder = ModelBuilder(...)
dataset_builder = DatasetBuilder(...)
hparams = PipelineHparams(...)
pipeline = BasicModelFitPipeline(model_builder, dataset_builder, hparams)
pipeline.export_saved_model(model_builder.build(), 'saved_model/')
Args | |
---|---|
model | Model to be saved. |
export_to | Specifies the directory the model is to be exported to. |
checkpoint | If given, export the model with weights from this checkpoint. |
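The exported directory is a standard SavedModel, so it can be reloaded with stock TensorFlow APIs (a sketch; which signature names are available depends on the model and export settings):

import tensorflow as tf

# Sketch: reload the exported SavedModel and inspect its serving signatures.
loaded = tf.saved_model.load("saved_model/")
print(list(loaded.signatures.keys()))  # may include e.g. "serving_default"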
train_and_validate
train_and_validate(
    verbose=0
)
Main function to train and validate the model with the distributed strategy specified in hparams.
Example usage:
context_feature_spec = {}
example_feature_spec = {
    "example_feature_1": tf.io.FixedLenFeature(
        shape=(1,), dtype=tf.float32, default_value=0.0)
}
mask_feature_name = "list_mask"
label_spec = {
    "utility": tf.io.FixedLenFeature(
        shape=(1,), dtype=tf.float32, default_value=0.0)
}
dataset_hparams = DatasetHparams(
    train_input_pattern="train.dat",
    valid_input_pattern="valid.dat",
    train_batch_size=128,
    valid_batch_size=128)
pipeline_hparams = pipeline.PipelineHparams(
    model_dir="model/",
    num_epochs=2,
    steps_per_epoch=5,
    validation_steps=2,
    learning_rate=0.01,
    loss="softmax_loss")
model_builder = SimpleModelBuilder(
    context_feature_spec, example_feature_spec, mask_feature_name)
dataset_builder = SimpleDatasetBuilder(
    context_feature_spec,
    example_feature_spec,
    mask_feature_name,
    label_spec,
    dataset_hparams)
pipeline = BasicModelFitPipeline(
    model_builder, dataset_builder, pipeline_hparams)
pipeline.train_and_validate(verbose=1)
Args | |
---|---|
verbose | An int for the verbosity level. |