tf.keras.activations.softmax
tf.keras.activations.softmax(
    x, axis=-1
)

The softmax activation function transforms the outputs so that all values are in
the range (0, 1) and sum to 1. It is often used as the activation for the last
layer of a classification network because the result can be interpreted as
a probability distribution. The softmax of each vector x is computed along the
given axis as exp(x) / tf.reduce_sum(exp(x)).
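As a minimal sketch of this behavior, the snippet below applies softmax to a small batch of logits and confirms that each row sums to 1 (the input values here are illustrative only):

import tensorflow as tf

# A batch of two logit vectors.
logits = tf.constant([[1.0, 2.0, 3.0],
                      [0.5, 0.5, 0.5]])

# Softmax over the last axis (the default).
probs = tf.keras.activations.softmax(logits)
print(probs.numpy())

# Each row is a probability distribution: non-negative and sums to 1.
print(tf.reduce_sum(probs, axis=-1).numpy())  # [1. 1.]

# Typical use as the activation of a final classification layer.
layer = tf.keras.layers.Dense(10, activation='softmax')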
Arguments

x: Input tensor.
axis: Integer, axis along which the softmax normalization is applied.

Returns

Tensor, output of the softmax transformation (all values are non-negative and sum to 1).

Raises

ValueError: In case dim(x) == 1.
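A short sketch of the axis argument: for inputs with more than two dimensions, normalization can be applied along any axis, and the values sum to 1 along that axis (the tensor shape here is an arbitrary example):

import tensorflow as tf

# A 3-D tensor of arbitrary values.
x = tf.random.uniform((2, 3, 4))

# Normalize along the middle axis instead of the last one.
y = tf.keras.activations.softmax(x, axis=1)

# Sums over axis 1 are 1 for every (batch, feature) pair.
print(tf.reduce_sum(y, axis=1).numpy())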
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2020-10-01 UTC."],[],[]]