Warning: This project is deprecated. TensorFlow Addons has stopped development; the project will only provide minimal maintenance releases until May 2024. See the full announcement on GitHub.
TensorFlow Addons Callbacks: TimeStopping
Overview
This notebook demonstrates how to use the TimeStopping callback in TensorFlow Addons, which stops training once a given wall-clock time budget is exhausted.
Setup
pip install -U tensorflow-addons
import tensorflow_addons as tfa
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Flatten
Import and Normalize Data
# the data, split between train and test sets
(x_train, y_train), (x_test, y_test) = mnist.load_data()
# normalize data
x_train, x_test = x_train / 255.0, x_test / 255.0
Build Simple MNIST CNN Model
# build the model using the Sequential API
model = Sequential()
model.add(Flatten(input_shape=(28, 28)))
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.2))
model.add(Dense(10, activation='softmax'))
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
Simple TimeStopping Usage
# initialize TimeStopping callback
time_stopping_callback = tfa.callbacks.TimeStopping(seconds=5, verbose=1)
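The `seconds` argument is a total wall-clock budget for the whole `fit` call, not a per-epoch limit. Larger budgets can be expressed with simple arithmetic (a sketch; the constant names below are illustrative, not part of the TFA API):

```python
# express larger training budgets in seconds
MINUTES = 60
HOURS = 60 * MINUTES

budget_30_min = 30 * MINUTES  # 1800 seconds
budget_2_hours = 2 * HOURS    # 7200 seconds

# e.g. tfa.callbacks.TimeStopping(seconds=budget_30_min, verbose=1)
```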
# train the model with the TimeStopping callback;
# training will stop at the end of the epoch in which
# the 5-second budget is exhausted
model.fit(x_train, y_train,
          batch_size=64,
          epochs=100,
          callbacks=[time_stopping_callback],
          validation_data=(x_test, y_test))
Epoch 1/100
938/938 [==============================] - 3s 2ms/step - loss: 0.3420 - accuracy: 0.9022 - val_loss: 0.1628 - val_accuracy: 0.9522
Epoch 2/100
938/938 [==============================] - 2s 2ms/step - loss: 0.1636 - accuracy: 0.9514 - val_loss: 0.1122 - val_accuracy: 0.9656
Epoch 3/100
938/938 [==============================] - 2s 2ms/step - loss: 0.1213 - accuracy: 0.9642 - val_loss: 0.0932 - val_accuracy: 0.9711
Timed stopping at epoch 3 after training for 0:00:05
<keras.callbacks.History at 0x7fb5df60c130>
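Since TensorFlow Addons is deprecated, the same behavior can be reproduced with a small custom callback. The sketch below (a hypothetical stand-in, not the TFA implementation) shows the core logic: record a deadline when training begins and stop at the end of the first epoch that crosses it. A plain loop stands in for `model.fit` so the timing behavior can be seen directly:

```python
import time

class SimpleTimeStopping:
    """Minimal sketch of time-based stopping: end training once a
    wall-clock budget is exhausted, checked at the end of each epoch."""

    def __init__(self, seconds=5):
        self.seconds = seconds
        self.stopped_epoch = None

    def on_train_begin(self):
        # compute the deadline once, when training starts
        self.stopping_time = time.time() + self.seconds

    def on_epoch_end(self, epoch):
        # returns True when the budget is exhausted
        if time.time() >= self.stopping_time:
            self.stopped_epoch = epoch
            return True
        return False

# simulate a training loop where each "epoch" takes ~0.05 s
cb = SimpleTimeStopping(seconds=0.12)
cb.on_train_begin()
for epoch in range(100):
    time.sleep(0.05)  # stand-in for one epoch of training
    if cb.on_epoch_end(epoch):
        break
```

In a real Keras callback, `on_epoch_end` would set `self.model.stop_training = True` instead of returning a flag, which is how `tf.keras` callbacks request early termination.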
Last updated 2023-05-26 UTC.