Computes the Intersection-Over-Union metric for specific target classes.
Inherits From: Metric
tf.keras.metrics.IoU(
    num_classes,
    target_class_ids,
    name=None,
    dtype=None,
    ignore_class=None,
    sparse_y_true=True,
    sparse_y_pred=True,
    axis=-1
)
Formula:
iou = true_positives / (true_positives + false_positives + false_negatives)
Intersection-Over-Union is a common evaluation metric for semantic image segmentation.
To compute IoUs, the predictions are accumulated in a confusion matrix, weighted by sample_weight, and the metric is then calculated from it. If sample_weight is None, weights default to 1. Use sample_weight of 0 to mask values.
Note, this class first computes IoUs for all individual classes, then returns the mean of IoUs for the classes that are specified by target_class_ids. If target_class_ids has only one id value, the IoU of that specific class is returned.
Examples:
# cm = [[1, 1],
# [1, 1]]
# sum_row = [2, 2], sum_col = [2, 2], true_positives = [1, 1]
# iou = true_positives / (sum_row + sum_col - true_positives)
# iou = [0.33, 0.33]
m = keras.metrics.IoU(num_classes=2, target_class_ids=[0])
m.update_state([0, 0, 1, 1], [0, 1, 0, 1])
m.result()
0.33333334
m.reset_state()
m.update_state([0, 0, 1, 1], [0, 1, 0, 1],
sample_weight=[0.3, 0.3, 0.3, 0.1])
# cm = [[0.3, 0.3],
# [0.3, 0.1]]
# sum_row = [0.6, 0.4], sum_col = [0.6, 0.4],
# true_positives = [0.3, 0.1]
# iou = [0.33, 0.14]
m.result()
0.33333334
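As a further illustrative sketch (not part of the canonical reference), predictions given as per-class probabilities or one-hot vectors can be used with sparse_y_pred=False; the metric then takes the argmax over axis before accumulating the confusion matrix. The inputs below are made up for illustration:
# Sketch: probability predictions with sparse_y_pred=False.
m = keras.metrics.IoU(num_classes=2, target_class_ids=[0, 1],
                      sparse_y_pred=False, axis=-1)
m.update_state([0, 0, 1, 1],
               [[0.9, 0.1],   # argmax -> 0
                [0.4, 0.6],   # argmax -> 1
                [0.7, 0.3],   # argmax -> 0
                [0.2, 0.8]])  # argmax -> 1
# Equivalent sparse predictions: [0, 1, 0, 1], so the confusion matrix
# matches the first example and iou = [0.33, 0.33]; the result is their mean.
m.result()
0.33333334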
Usage with compile() API:
model.compile(
    optimizer='sgd',
    loss='mse',
    metrics=[keras.metrics.IoU(num_classes=2, target_class_ids=[0])])
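A fuller, hypothetical end-to-end sketch follows; the model architecture, loss, and random data are assumptions made purely for illustration, not part of the reference:
import numpy as np
import keras

num_classes = 2
# Tiny per-pixel classifier with a softmax output, so sparse_y_pred=False.
model = keras.Sequential([
    keras.Input(shape=(8, 8, 3)),
    keras.layers.Conv2D(num_classes, 1, activation='softmax'),
])
model.compile(
    optimizer='sgd',
    loss='sparse_categorical_crossentropy',
    metrics=[keras.metrics.IoU(num_classes=num_classes,
                               target_class_ids=[0],
                               sparse_y_pred=False)])
x = np.random.rand(4, 8, 8, 3).astype('float32')
y = np.random.randint(0, num_classes, size=(4, 8, 8))  # sparse labels
model.fit(x, y, epochs=1, verbose=0)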
Attributes | |
---|---|
dtype | |
variables | |
Methods
add_variable
add_variable(
shape, initializer, dtype=None, aggregation='sum', name=None
)
add_weight
add_weight(
shape=(), initializer=None, dtype=None, name=None
)
from_config
@classmethod
from_config(config)
get_config
get_config()
Return the serializable config of the metric.
reset_state
reset_state()
Reset all of the metric state variables.
This function is called between epochs/steps, when a metric is evaluated during training.
result
result()
Compute the intersection-over-union via the confusion matrix.
stateless_reset_state
stateless_reset_state()
stateless_result
stateless_result(
metric_variables
)
stateless_update_state
stateless_update_state(
metric_variables, *args, **kwargs
)
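The stateless variants operate on explicitly passed variable values instead of mutating the metric's own state. A minimal sketch, assuming these methods return and consume lists of variable values as in Keras 3:
m = keras.metrics.IoU(num_classes=2, target_class_ids=[0])
metric_vars = m.stateless_reset_state()           # fresh variable values
metric_vars = m.stateless_update_state(
    metric_vars, [0, 0, 1, 1], [0, 1, 0, 1])      # returns updated values
m.stateless_result(metric_vars)                   # 0.33333334; m itself is unchanged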
update_state
update_state(
y_true, y_pred, sample_weight=None
)
Accumulates the confusion matrix statistics.
Args | |
---|---|
y_true | The ground truth values. |
y_pred | The predicted values. |
sample_weight | Optional weighting of each example. Can be a Tensor whose rank is either 0, or the same as y_true, and must be broadcastable to y_true. Defaults to 1. |
Returns | |
---|---|
Update op. |
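For context, a sketch of how ignore_class interacts with update_state; it assumes labels equal to ignore_class are simply excluded from the confusion matrix, and the numbers are worked out under that assumption:
m = keras.metrics.IoU(num_classes=2, target_class_ids=[0, 1],
                      ignore_class=-1)
m.update_state([0, 0, 1, -1], [0, 1, 0, 1])
# The last sample carries the ignore label, leaving
# y_true = [0, 0, 1], y_pred = [0, 1, 0]
# cm = [[1, 1],
#       [1, 0]]
# iou = [1 / (2 + 2 - 1), 0 / (1 + 1 - 0)] = [0.33, 0.0]
m.result()  # ~0.16666667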
__call__
__call__(
*args, **kwargs
)
Call self as a function.
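Calling the metric object directly is shorthand for update_state() followed by result(); a quick illustrative sketch:
m = keras.metrics.IoU(num_classes=2, target_class_ids=[0])
m([0, 0, 1, 1], [0, 1, 0, 1])   # updates state, then returns result()
0.33333334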