tf.distribute.experimental.coordinator.RemoteValue
An asynchronously available value of a scheduled function.
This class is used as the return value of
tf.distribute.experimental.coordinator.ClusterCoordinator.schedule, where
the underlying value becomes available at a later time once the function has
been executed.

Using a tf.distribute.experimental.coordinator.RemoteValue as an input to a
subsequent function scheduled with
tf.distribute.experimental.coordinator.ClusterCoordinator.schedule is
currently not supported; a workaround is sketched after the example below.
Example:
strategy = tf.distribute.experimental.ParameterServerStrategy(
    cluster_resolver=...)
coordinator = (
    tf.distribute.experimental.coordinator.ClusterCoordinator(strategy))

with strategy.scope():
  v1 = tf.Variable(initial_value=0.0)
  v2 = tf.Variable(initial_value=1.0)

@tf.function
def worker_fn():
  v1.assign_add(0.1)
  v2.assign_sub(0.2)
  return v1.read_value() / v2.read_value()

result = coordinator.schedule(worker_fn)
# Note that `fetch()` gives the actual result instead of a `tf.Tensor`.
assert result.fetch() == 0.125

for _ in range(10):
  # `worker_fn` will be run on arbitrary workers that are available. The
  # `result` value will be available later.
  result = coordinator.schedule(worker_fn)
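Because a RemoteValue cannot be fed directly into another scheduled function,
one workaround is to fetch the concrete result first and pass it to the next
function through schedule's args parameter. A minimal sketch, reusing the
strategy, coordinator, and variables from the example above (scale_fn is a
hypothetical follow-up function, not part of this API):

@tf.function
def scale_fn(factor):
  # `factor` arrives here as a concrete value, not a `RemoteValue`.
  return v1.read_value() * factor

# Fetch the concrete result of the last scheduled `worker_fn` call...
ratio = result.fetch()
# ...then pass it to a subsequently scheduled function as a plain argument.
scaled = coordinator.schedule(scale_fn, args=(ratio,))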
Methods
fetch
fetch()
Wait for the result of RemoteValue to be ready and return the result.

This makes the value concrete by copying the remote value to the local host.
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2021-05-14 UTC."],[],[],null,["# tf.distribute.experimental.coordinator.RemoteValue\n\n\u003cbr /\u003e\n\n|---------------------------------------------------------------------------------------------------------------------------------------------------------|\n| [View source on GitHub](https://github.com/tensorflow/tensorflow/blob/v2.5.0/tensorflow/python/distribute/coordinator/cluster_coordinator.py#L103-L161) |\n\nAn asynchronously available value of a scheduled function.\n\nThis class is used as the return value of\n[`tf.distribute.experimental.coordinator.ClusterCoordinator.schedule`](../../../../tf/distribute/experimental/coordinator/ClusterCoordinator#schedule) where\nthe underlying value becomes available at a later time once the function has\nbeen executed.\n\nUsing [`tf.distribute.experimental.coordinator.RemoteValue`](../../../../tf/distribute/experimental/coordinator/RemoteValue) as an input to\na subsequent function scheduled with\n[`tf.distribute.experimental.coordinator.ClusterCoordinator.schedule`](../../../../tf/distribute/experimental/coordinator/ClusterCoordinator#schedule) is\ncurrently not supported.\n\n#### Example:\n\n strategy = tf.distribute.experimental.ParameterServerStrategy(\n cluster_resolver=...)\n coordinator = (\n tf.distribute.experimental.coordinator.ClusterCoordinator(strategy))\n\n with strategy.scope():\n v1 = tf.Variable(initial_value=0.0)\n v2 = tf.Variable(initial_value=1.0)\n\n @tf.function\n def worker_fn():\n v1.assign_add(0.1)\n v2.assign_sub(0.2)\n return v1.read_value() / v2.read_value()\n\n result = coordinator.schedule(worker_fn)\n # Note that `fetch()` gives the actual result instead of a `tf.Tensor`.\n assert result.fetch() == 0.125\n\n for _ in range(10):\n # `worker_fn` will be run on arbitrary workers that are available. The\n # `result` value will be available later.\n result = coordinator.schedule(worker_fn)\n\nMethods\n-------\n\n### `fetch`\n\n[View source](https://github.com/tensorflow/tensorflow/blob/v2.5.0/tensorflow/python/distribute/coordinator/cluster_coordinator.py#L145-L161) \n\n fetch()\n\nWait for the result of `RemoteValue` to be ready and return the result.\n\nThis makes the value concrete by copying the remote value to local.\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n| Returns ||\n|---|---|\n| The actual output of the [`tf.function`](../../../../tf/function) associated with this `RemoteValue`, previously by a [`tf.distribute.experimental.coordinator.ClusterCoordinator.schedule`](../../../../tf/distribute/experimental/coordinator/ClusterCoordinator#schedule) call. This can be a single value, or a structure of values, depending on the output of the [`tf.function`](../../../../tf/function). 
||\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n| Raises ||\n|---------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------|\n| [`tf.errors.CancelledError`](https://www.tensorflow.org/api_docs/python/tf/errors/CancelledError) | If the function that produces this `RemoteValue` is aborted or cancelled due to failure. |\n\n\u003cbr /\u003e"]]
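In practice, fetch() is often wrapped in error handling, since the scheduled
function may be aborted or cancelled when a worker fails. A minimal sketch,
assuming the coordinator, worker_fn, and result from the example above:

try:
  value = result.fetch()  # Blocks until the remote execution finishes.
except tf.errors.CancelledError:
  # The function that produced this `RemoteValue` was aborted or cancelled
  # due to failure; one option is simply to schedule it again.
  result = coordinator.schedule(worker_fn)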