tf.compat.v1.metrics.accuracy

View source on GitHub: https://github.com/tensorflow/tensorflow/blob/v2.13.1/tensorflow/python/ops/metrics_impl.py#L481-L634

Calculates how often predictions matches labels.

    tf.compat.v1.metrics.accuracy(
        labels,
        predictions,
        weights=None,
        metrics_collections=None,
        updates_collections=None,
        name=None
    )

Migrate to TF2

Caution: This API was designed for TensorFlow v1 and is not compatible with
eager execution or tf.function. Use tf.keras.metrics.Accuracy instead for TF2
migration. After instantiating a tf.keras.metrics.Accuracy object, call the
update_state() method to record predictions and labels, then call the
result() method to get the accuracy eagerly. You can also attach the metric to
a Keras model when calling the compile method. See the migration guide
(https://www.tensorflow.org/guide/migrate#new-style_metrics_and_losses)
for more details.
    # Used within Keras model
    model.compile(optimizer='sgd',
                  loss='mse',
                  metrics=[tf.keras.metrics.Accuracy()])
Description

The accuracy function creates two local variables, total and count, that are
used to compute the frequency with which predictions match labels. This
frequency is ultimately returned as accuracy: an idempotent operation that
simply divides total by count.
For estimation of the metric over a stream of data, the function creates an
update_op operation that updates these variables and returns the accuracy.
Internally, an is_correct operation computes a Tensor with elements 1.0
where the corresponding elements of predictions and labels match and 0.0
otherwise. Then update_op increments total with the reduced sum of the
product of weights and is_correct, and it increments count with the
reduced sum of weights.
If weights is None, weights default to 1. Use weights of 0 to mask values.
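The total/count bookkeeping described above can be sketched in plain Python. This is a toy model of the documented semantics, not TensorFlow's implementation, and `make_accuracy_metric` is a hypothetical name used only for illustration:

```python
def make_accuracy_metric():
    """Mimics the (total, count) local-variable state of the v1 metric."""
    state = {"total": 0.0, "count": 0.0}

    def update_op(labels, predictions, weights=None):
        # weights default to 1; a weight of 0 masks an element.
        if weights is None:
            weights = [1.0] * len(labels)
        for y, p, w in zip(labels, predictions, weights):
            is_correct = 1.0 if y == p else 0.0
            state["total"] += w * is_correct
            state["count"] += w
        return accuracy()  # update_op's value matches accuracy

    def accuracy():
        # Idempotent read: simply divides total by count, no state change.
        return state["total"] / state["count"] if state["count"] else 0.0

    return accuracy, update_op

accuracy, update_op = make_accuracy_metric()
update_op([0, 2, 3], [1, 2, 3])                # batch 1: 2 of 3 correct
update_op([4, 5], [4, 9], weights=[1.0, 0.0])  # batch 2: weight 0 masks the miss
print(accuracy())                              # -> 0.75
```

Note that repeated calls to `accuracy()` return the same value until `update_op` runs again, which is what "idempotent" means in the description above.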
Args

labels: The ground truth values, a Tensor whose shape matches predictions.
predictions: The predicted values, a Tensor of any shape.
weights: Optional Tensor whose rank is either 0, or the same rank as labels,
  and must be broadcastable to labels (i.e., all dimensions must be either 1,
  or the same as the corresponding labels dimension).
metrics_collections: An optional list of collections that accuracy should be
  added to.
updates_collections: An optional list of collections that update_op should be
  added to.
name: An optional variable_scope name.
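For the 1-D case, the weights rules can be sketched in plain Python. `weighted_accuracy` is a hypothetical helper illustrating the documented behavior, not a TF API, and the full rank-broadcasting rule is simplified to scalar-or-same-length:

```python
def weighted_accuracy(labels, predictions, weights=None):
    """1-D illustration of the documented weights handling."""
    n = len(labels)
    if len(predictions) != n:
        raise ValueError("predictions and labels have mismatched shapes")
    if weights is None:
        w = [1.0] * n                    # weights default to 1
    elif isinstance(weights, (int, float)):
        w = [float(weights)] * n         # rank-0 weight broadcasts to every element
    elif len(weights) == n:
        w = [float(x) for x in weights]  # same shape as labels
    else:
        raise ValueError("weights shape doesn't match predictions")
    total = sum(wi for yi, pi, wi in zip(labels, predictions, w) if yi == pi)
    count = sum(w)
    return total / count if count else 0.0

print(weighted_accuracy([0, 2, 3], [1, 2, 3]))             # -> 0.666...
print(weighted_accuracy([0, 2, 3], [1, 2, 3], 2.0))        # scalar broadcast: still 0.666...
print(weighted_accuracy([0, 2, 3], [1, 2, 3], [0, 1, 1]))  # weight 0 masks the miss: 1.0
```

A rank-0 weight scales total and count by the same factor, so it never changes the result; per-element weights of 0 drop elements from both sums, which is how masking works.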
Returns

accuracy: A Tensor representing the accuracy, the value of total divided by
  count.
update_op: An operation that increments the total and count variables
  appropriately and whose value matches accuracy.
Raises

ValueError: If predictions and labels have mismatched shapes, or if weights
  is not None and its shape doesn't match predictions, or if either
  metrics_collections or updates_collections are not a list or tuple.
RuntimeError: If eager execution is enabled.
Structural Mapping to Native TF2

Before:

    accuracy, update_op = tf.compat.v1.metrics.accuracy(
        labels=labels,
        predictions=predictions,
        weights=weights,
        metrics_collections=metrics_collections,
        updates_collections=updates_collections,
        name=name)

After:

    m = tf.keras.metrics.Accuracy(
        name=name,
        dtype=None)

    m.update_state(
        y_true=labels,
        y_pred=predictions,
        sample_weight=weights)

    accuracy = m.result()

How to Map Arguments

| TF1 Arg Name        | TF2 Arg Name  | Note                                                                                                              |
|---------------------|---------------|-------------------------------------------------------------------------------------------------------------------|
| labels              | y_true        | In update_state() method                                                                                            |
| predictions         | y_pred        | In update_state() method                                                                                            |
| weights             | sample_weight | In update_state() method                                                                                            |
| metrics_collections | Not supported | Metrics should be tracked explicitly or with Keras APIs, for example, add_metric, instead of via collections        |
| updates_collections | Not supported | -                                                                                                                   |
| name                | name          | In constructor                                                                                                      |

Before & After Usage Example

Before:

    g = tf.Graph()
    with g.as_default():
      logits = [1, 2, 3]
      labels = [0, 2, 3]
      acc, acc_op = tf.compat.v1.metrics.accuracy(logits, labels)
      global_init = tf.compat.v1.global_variables_initializer()
      local_init = tf.compat.v1.local_variables_initializer()
    sess = tf.compat.v1.Session(graph=g)
    sess.run([global_init, local_init])
    print(sess.run([acc, acc_op]))
    # [0.0, 0.66667]

After:

    m = tf.keras.metrics.Accuracy()
    m.update_state([1, 2, 3], [0, 2, 3])
    m.result().numpy()
    # 0.66667

Last updated 2023-10-06 UTC.