tf.contrib.metrics.precision_recall_at_equal_thresholds
A helper method for creating metrics related to precision-recall curves.
```python
tf.contrib.metrics.precision_recall_at_equal_thresholds(
    labels, predictions, weights=None, num_thresholds=None, use_locking=None,
    name=None
)
```
The computed values are true positives, false negatives, true negatives, false
positives, precision, and recall. This function returns a data structure that
contains ops for computing them.
Unlike `_streaming_confusion_matrix_at_thresholds` (which exhibits O(T * N)
space and run time), this op exhibits O(T + N) space and run time, where T is
the number of thresholds and N is the size of the predictions tensor. Hence,
it may be advantageous to use this function when `predictions` is big.

For instance, prefer this method for per-pixel classification tasks, for which
the predictions tensor may be very large.
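The O(T + N) behavior comes from bucketing each prediction once and then taking cumulative sums over the buckets. Here is a minimal pure-Python sketch of that idea (an illustration of the technique only, not the actual TensorFlow implementation, which also handles weights and streaming accumulation):

```python
def pr_at_equal_thresholds(labels, predictions, num_thresholds=5):
    """Sketch of the O(T + N) bucketing idea (not the TF implementation)."""
    T = num_thresholds
    # O(N): histogram predictions into T buckets, split by label.
    pos_hist, neg_hist = [0] * T, [0] * T
    for y, p in zip(labels, predictions):
        b = min(int(p * (T - 1)), T - 1)  # index of the largest threshold <= p
        (pos_hist if y else neg_hist)[b] += 1
    # O(T): reverse cumulative sums count predictions >= each threshold.
    tp, fp = [0] * T, [0] * T
    for i in range(T - 1, -1, -1):
        tp[i] = (tp[i + 1] if i + 1 < T else 0) + pos_hist[i]
        fp[i] = (fp[i + 1] if i + 1 < T else 0) + neg_hist[i]
    total_pos = tp[0]  # at threshold 0.0, every positive is a true positive
    precision = [t / (t + f) if t + f else 1.0 for t, f in zip(tp, fp)]
    recall = [t / total_pos if total_pos else 0.0 for t in tp]
    thresholds = [i / (T - 1) for i in range(T)]
    return tp, fp, precision, recall, thresholds
```

A naive implementation would instead compare all N predictions against all T thresholds, giving the O(T * N) cost mentioned above.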
Each number in `predictions`, a float in `[0, 1]`, is compared with its
corresponding label in `labels`, and counts as a single tp/fp/tn/fn value at
each threshold. This is then multiplied by `weights`, which can be used to
reweight certain values, or, more commonly, to mask values out.
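To make the masking role of weights concrete, here is a hedged illustration at a single threshold (the function name is hypothetical, not part of this API):

```python
def weighted_counts_at_threshold(labels, predictions, weights, threshold):
    """Weighted tp/fp at one threshold; a weight of 0 masks an example out."""
    tp = sum(w for y, p, w in zip(labels, predictions, weights)
             if y and p >= threshold)
    fp = sum(w for y, p, w in zip(labels, predictions, weights)
             if not y and p >= threshold)
    return tp, fp
```

With weights `[1.0, 0.0, 1.0]`, the second example contributes nothing to any count, exactly as if it had been excluded from the batch.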
| Args | |
|------|---|
| `labels` | A bool `Tensor` whose shape matches `predictions`. |
| `predictions` | A floating point `Tensor` of arbitrary shape whose values are in the range `[0, 1]`. |
| `weights` | Optional; if provided, a `Tensor` that has the same dtype as, and is broadcastable to, `predictions`. This tensor is multiplied by counts. |
| `num_thresholds` | Optional; number of thresholds, evenly distributed in `[0, 1]`. Should be `>= 2`. Defaults to 201. Note that the number of bins is 1 less than `num_thresholds`. Using an even `num_thresholds` value instead of an odd one may yield unfriendly edges for bins. |
| `use_locking` | Optional; if True, the op will be protected by a lock. Otherwise, the behavior is undefined, but may exhibit less contention. Defaults to True. |
| `name` | Optional; `variable_scope` name. If not provided, the string 'precision_recall_at_equal_threshold' is used. |
| Returns | |
|---------|---|
| `result` | A named tuple (see `PrecisionRecallData` within the implementation of this function) with properties that are variables of shape `[num_thresholds]`. The names of the properties are tp, fp, tn, fn, precision, recall, thresholds. Types are the same as that of `predictions`. |
| `update_op` | An op that accumulates values. |
| Raises | |
|--------|---|
| `ValueError` | If `predictions` and `labels` have mismatched shapes, or if `weights` is not `None` and its shape doesn't match `predictions`, or if `includes` contains invalid keys. |
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2020-10-01 UTC.