Weighted cross-entropy loss for a sequence of logits.
tfa.seq2seq.SequenceLoss(
    average_across_timesteps: bool = False,
    average_across_batch: bool = False,
    sum_over_timesteps: bool = True,
    sum_over_batch: bool = True,
    softmax_loss_function: Optional[Callable] = None,
    name: Optional[str] = None
)
| Args | |
|---|---|
| reduction | Type of tf.keras.losses.Reduction to apply to loss. Default value is AUTO. AUTO indicates that the reduction option will be determined by the usage context. For almost all cases this defaults to SUM_OVER_BATCH_SIZE. When used under a tf.distribute.Strategy, except via Model.compile() and Model.fit(), using AUTO or SUM_OVER_BATCH_SIZE will raise an error. Please see the custom training tutorial for more details. | 
| name | Optional name for the instance. | 
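For example, with the default constructor flags (sum_over_timesteps=True, sum_over_batch=True) the loss reduces over both time and batch, yielding a scalar. The following is a minimal sketch, assuming integer targets of shape [batch_size, max_time], logits of shape [batch_size, max_time, num_classes], and a per-step weight tensor of the same shape as the targets; the sizes are illustrative.

import tensorflow as tf
import tensorflow_addons as tfa

batch_size, max_time, num_classes = 2, 5, 10  # illustrative sizes

# Integer targets: [batch_size, max_time]
y_true = tf.random.uniform(
    [batch_size, max_time], maxval=num_classes, dtype=tf.int32)
# Unnormalized logits: [batch_size, max_time, num_classes]
y_pred = tf.random.normal([batch_size, max_time, num_classes])
# Per-step weights (e.g. a padding mask): [batch_size, max_time]
sample_weight = tf.ones([batch_size, max_time])

# Default flags reduce over both time and batch, producing a scalar loss.
loss_fn = tfa.seq2seq.SequenceLoss()
loss = loss_fn(y_true, y_pred, sample_weight)
print(loss.shape)  # ()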
Methods
from_config
@classmethod
from_config(
    config
)
Instantiates a Loss from its config (output of get_config()).
| Args | |
|---|---|
| config | Output of get_config(). | 
| Returns | |
|---|---|
| A Loss instance. | 
get_config
get_config()
Returns the config dictionary for a Loss instance.
__call__
__call__(
    y_true, y_pred, sample_weight=None
)
Overrides the parent __call__ to apply the reduction behavior configured by the averaging and summing flags passed to the constructor.
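As a sketch of how the constructor flags and sample_weight interact (continuing the tensors from the example above): zero entries in the weight tensor drop the corresponding timesteps from the loss, and averaging across timesteps only, with the batch reductions disabled, keeps one loss value per batch element. The sequence lengths below are illustrative assumptions.

# Mask out padded timesteps and keep a per-example loss.
mask = tf.sequence_mask([3, 5], maxlen=max_time, dtype=tf.float32)
per_example_loss_fn = tfa.seq2seq.SequenceLoss(
    average_across_timesteps=True,
    average_across_batch=False,
    sum_over_timesteps=False,
    sum_over_batch=False)
per_example_loss = per_example_loss_fn(y_true, y_pred, mask)
print(per_example_loss.shape)  # (2,), i.e. [batch_size]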