tf.keras.metrics.binary_crossentropy
Computes the binary crossentropy loss.
tf.keras.metrics.binary_crossentropy(
y_true, y_pred, from_logits=False, label_smoothing=0.0, axis=-1
)
Standalone usage:
y_true = [[0, 1], [0, 0]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]
loss = tf.keras.metrics.binary_crossentropy(y_true, y_pred)
assert loss.shape == (2,)
loss.numpy()
array([0.916 , 0.714], dtype=float32)
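
The values above can be reproduced by hand. A minimal NumPy sketch (omitting the clipping TensorFlow applies internally, since these inputs lie safely inside (0, 1)):

import numpy as np

y_true = np.array([[0., 1.], [0., 0.]])
y_pred = np.array([[0.6, 0.4], [0.4, 0.6]])

# Elementwise crossentropy, then the mean along the last axis (axis=-1).
bce = -(y_true * np.log(y_pred) + (1. - y_true) * np.log(1. - y_pred))
print(bce.mean(axis=-1))  # ≈ [0.916, 0.714], matching the output above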
Args

y_true: Ground truth values. shape = [batch_size, d0, .. dN].
y_pred: The predicted values. shape = [batch_size, d0, .. dN].
from_logits: Whether y_pred is expected to be a logits tensor. By default, y_pred is assumed to encode a probability distribution. (See the sketch after this list.)
label_smoothing: Float in [0, 1]. If > 0, smooth the labels by squeezing them toward 0.5; that is, use 1. - 0.5 * label_smoothing for the target class and 0.5 * label_smoothing for the non-target class. (A worked example follows the Returns section below.)
axis: The axis along which the mean is computed. Defaults to -1.
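
If y_pred holds raw scores rather than probabilities, pass from_logits=True. A minimal sketch, using hypothetical logit values chosen so that the sigmoid recovers the probabilities from the standalone example (sigmoid(0.405) ≈ 0.6):

import tensorflow as tf

y_true = [[0., 1.], [0., 0.]]
# Logits chosen by hand; sigmoid maps these to ≈ [[0.6, 0.4], [0.4, 0.6]].
logits = [[0.405, -0.405], [-0.405, 0.405]]

loss = tf.keras.metrics.binary_crossentropy(y_true, logits, from_logits=True)
print(loss.numpy())  # ≈ [0.916, 0.714], as in the probability example above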
Returns

Binary crossentropy loss value. shape = [batch_size, d0, .. dN-1].
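
Worked example for label_smoothing: with label_smoothing=0.2, a hard target of 1 becomes 1. - 0.5 * 0.2 = 0.9 and a hard target of 0 becomes 0.5 * 0.2 = 0.1 before the crossentropy is computed. A sketch of that equivalence (the pre-smoothed targets below are computed by hand from the formula above):

import tensorflow as tf

y_true = [[0., 1.], [0., 0.]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]

smoothed = tf.keras.metrics.binary_crossentropy(y_true, y_pred, label_smoothing=0.2)
# Passing the pre-smoothed targets directly gives the same result.
manual = tf.keras.metrics.binary_crossentropy([[0.1, 0.9], [0.1, 0.1]], y_pred)
print(smoothed.numpy())  # equal to manual.numpy()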