Computes Kullback-Leibler divergence loss between y_true and y_pred.
Inherits From: Loss
tf.keras.losses.KLDivergence(
    reduction='sum_over_batch_size', name='kl_divergence'
)
Formula:
loss = y_true * log(y_true / y_pred)
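As an illustration of the formula, a minimal NumPy sketch, assuming the per-sample loss sums the element-wise terms over the last axis and the default reduction averages over the batch; the clipping constant eps below is an assumed stand-in for the backend epsilon:

import numpy as np

y_true = np.array([[0.1, 0.9], [0.5, 0.5]])
y_pred = np.array([[0.2, 0.8], [0.4, 0.6]])

eps = 1e-7  # assumed stand-in for the backend epsilon
y_true_c = np.clip(y_true, eps, 1.0)
y_pred_c = np.clip(y_pred, eps, 1.0)

# loss = y_true * log(y_true / y_pred), summed over the class axis
per_sample = np.sum(y_true_c * np.log(y_true_c / y_pred_c), axis=-1)
print(per_sample.mean())  # batch-averaged value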
y_true and y_pred are expected to be probability distributions, with values between 0 and 1. They will get clipped to the [0, 1] range.
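A short standalone usage sketch; the printed value assumes the default 'sum_over_batch_size' reduction:

import tensorflow as tf

y_true = [[0.0, 1.0], [0.0, 0.0]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]

kl = tf.keras.losses.KLDivergence()
print(kl(y_true, y_pred).numpy())  # approximately 0.458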
Methods
call
call(
    y_true, y_pred
)
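A hedged sketch: call is assumed to return the per-sample loss values before any reduction or sample weighting, in contrast to __call__ below:

import tensorflow as tf

kl = tf.keras.losses.KLDivergence()
y_true = tf.constant([[0.0, 1.0], [0.0, 0.0]])
y_pred = tf.constant([[0.6, 0.4], [0.4, 0.6]])

# One loss value per sample (assumed behaviour of call)
print(kl.call(y_true, y_pred).numpy())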
from_config
@classmethod
from_config(config)
get_config
get_config()
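A small serialization round-trip sketch using get_config and from_config, following the standard Keras Loss pattern:

import tensorflow as tf

kl = tf.keras.losses.KLDivergence(name='kl_divergence')
config = kl.get_config()  # dict holding the constructor arguments
restored = tf.keras.losses.KLDivergence.from_config(config)
print(config)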
__call__
__call__(
    y_true, y_pred, sample_weight=None
)
Call self as a function.
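A minimal sketch of __call__ with per-sample weights; the weighted losses are combined according to the configured reduction:

import tensorflow as tf

kl = tf.keras.losses.KLDivergence()
y_true = tf.constant([[0.0, 1.0], [0.0, 0.0]])
y_pred = tf.constant([[0.6, 0.4], [0.4, 0.6]])

# sample_weight scales each sample's loss before reduction
print(kl(y_true, y_pred, sample_weight=[0.8, 0.2]).numpy())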