tf.nn.compute_average_loss
Scales per-example losses with sample_weights and computes their average.
tf.nn.compute_average_loss(
    per_example_loss, sample_weight=None, global_batch_size=None
)
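Conceptually, the result is the sample_weight-scaled sum of the per-example losses divided by the global batch size. A minimal eager sketch (the tensor values here are illustrative only):

import tensorflow as tf

per_example_loss = tf.constant([2.0, 4.0, 6.0])
sample_weight = tf.constant([1.0, 0.5, 0.0])

# Weighted sum divided by the global batch size:
# (2.0 * 1.0 + 4.0 * 0.5 + 6.0 * 0.0) / 3 = 4.0 / 3
loss = tf.nn.compute_average_loss(
    per_example_loss,
    sample_weight=sample_weight,
    global_batch_size=3)
print(loss.numpy())  # approximately 1.3333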
Usage with distribution strategy and custom training loop:

with strategy.scope():
  def compute_loss(labels, predictions, sample_weight=None):

    # If you are using a `Loss` class instead, set reduction to `NONE` so that
    # you can do the reduction afterwards and divide by the global batch size.
    per_example_loss = tf.keras.losses.sparse_categorical_crossentropy(
        labels, predictions)

    # Compute loss that is scaled by sample_weight and by the global batch size.
    return tf.nn.compute_average_loss(
        per_example_loss,
        sample_weight=sample_weight,
        global_batch_size=GLOBAL_BATCH_SIZE)
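A compute_loss helper like the one above is typically called from a replica-local training step, and the per-replica losses are then summed across replicas. A hedged sketch of that wiring, where model, optimizer, and the surrounding strategy setup are assumptions not defined on this page:

def train_step(inputs):
  features, labels = inputs
  with tf.GradientTape() as tape:
    predictions = model(features, training=True)  # `model` is assumed to exist
    loss = compute_loss(labels, predictions)
  gradients = tape.gradient(loss, model.trainable_variables)
  optimizer.apply_gradients(
      zip(gradients, model.trainable_variables))  # `optimizer` is assumed
  return loss

@tf.function
def distributed_train_step(inputs):
  # Each replica's loss is already divided by GLOBAL_BATCH_SIZE, so summing
  # across replicas yields the correct global average.
  per_replica_losses = strategy.run(train_step, args=(inputs,))
  return strategy.reduce(
      tf.distribute.ReduceOp.SUM, per_replica_losses, axis=None)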
Args

per_example_loss: Per-example loss.
sample_weight: Optional weighting for each example.
global_batch_size: Optional global batch size value. Defaults to (size of first dimension of losses) * (number of replicas).
Returns

Scalar loss value, obtained by summing the per_example_loss and dividing by global_batch_size. If global_batch_size is zero, the result is zero.
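When global_batch_size is omitted, the divisor therefore defaults to (size of first dimension of losses) * (number of replicas). A minimal sketch outside any distribution strategy, where the replica count is 1 (the values are illustrative only):

per_example_loss = tf.constant([1.0, 2.0, 3.0, 4.0])

# No global_batch_size given: the divisor defaults to 4 (batch size) * 1 (replica).
loss = tf.nn.compute_average_loss(per_example_loss)
print(loss.numpy())  # (1.0 + 2.0 + 3.0 + 4.0) / 4 = 2.5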