estimator = ...
# Hook to stop training if accuracy becomes higher than 0.9.
hook = tf.estimator.experimental.stop_if_higher_hook(estimator, "accuracy", 0.9)
train_spec = tf.estimator.TrainSpec(..., hooks=[hook])
tf.estimator.train_and_evaluate(estimator, train_spec, ...)
Caveat: The current implementation supports early stopping of both training and
evaluation in local mode. In distributed mode, training can be stopped, but
evaluation (which runs as a separate job) will wait indefinitely for new model
checkpoints to evaluate, so you will need other means to detect and stop it.
Early stopping of evaluation in distributed mode requires changes to the
train_and_evaluate API and will be addressed in a future revision.
eval_dir
If set, the directory containing summary files with eval metrics. By
default, estimator.eval_dir() will be used.
min_steps
An int; stopping is never requested if the global step is less than this
value. Defaults to 0.
run_every_secs
If specified, the stopping condition is checked every
run_every_secs seconds. Defaults to 60 seconds. Either this or
run_every_steps must be set.
run_every_steps
If specified, the stopping condition is checked every
run_every_steps steps. Either this or run_every_secs must be set
(see the example below).
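Putting these arguments together, here is a minimal sketch (assuming the public
tf.estimator.experimental.stop_if_higher_hook entry point; my_model_fn and the
values below are placeholders) that checks the metric every 1000 global steps
instead of on a 60-second timer and never requests a stop before step 5000:

import tensorflow as tf

estimator = tf.estimator.Estimator(model_fn=my_model_fn)  # my_model_fn is a placeholder

# Check eval summaries every 1000 global steps rather than on a timer;
# run_every_secs is set to None because only one of the two scheduling
# arguments should be set.
hook = tf.estimator.experimental.stop_if_higher_hook(
    estimator,
    metric_name="accuracy",
    threshold=0.9,
    eval_dir=None,          # defaults to estimator.eval_dir()
    min_steps=5000,         # never request a stop before global step 5000
    run_every_secs=None,
    run_every_steps=1000,
)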
Returns
An early-stopping hook of type SessionRunHook that periodically checks
whether the given metric is higher than the specified threshold and initiates
early stopping if it is.
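Because the returned object is an ordinary SessionRunHook, it is wired into
training like any other hook. A minimal end-to-end sketch in local mode
(train_input_fn and eval_input_fn are placeholder input functions):

hook = tf.estimator.experimental.stop_if_higher_hook(estimator, "accuracy", 0.9)

# The evaluator writes summaries to estimator.eval_dir(); the hook reads them
# during training and requests a stop once accuracy exceeds 0.9.
train_spec = tf.estimator.TrainSpec(input_fn=train_input_fn, hooks=[hook])
eval_spec = tf.estimator.EvalSpec(input_fn=eval_input_fn)
tf.estimator.train_and_evaluate(estimator, train_spec, eval_spec)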
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2023-10-06 UTC."],[],[]]