# SoftmaxCrossEntropyWithLogits

public class SoftmaxCrossEntropyWithLogits
### Inherited Methods

From class java.lang.Object

|------------------|---------------------------|
| boolean | equals(Object arg0) |
| final Class<?> | getClass() |
| int | hashCode() |
| final void | notify() |
| final void | notifyAll() |
| String | toString() |
| final void | wait(long arg0, int arg1) |
| final void | wait(long arg0) |
| final void | wait() |
## Public Constructors

#### public SoftmaxCrossEntropyWithLogits()
## Public Methods

#### public static \<T extends TNumber, U extends TNumber\> Operand\<T\> softmaxCrossEntropyWithLogits(Scope scope, Operand\<U\> labels, Operand\<T\> logits, int axis)
Computes softmax cross entropy between `logits` and `labels`.
Measures the probability error in discrete classification tasks in which the classes are
mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is
labeled with one and only one label: an image can be a dog or a truck, but not both.
**NOTE:** While the classes are mutually exclusive, their probabilities need not be. All that is required is that each row of `labels` is a valid probability distribution. If they are not, the computation of the gradient will be incorrect.
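Concretely, the value computed for each example row is the standard softmax cross entropy, reduced over the class dimension given by `axis` (this formula is the textbook definition, not taken from the generated docs; the fused kernel evaluates it in a numerically stable form):

$$\mathrm{loss}_i \;=\; -\sum_{c} \mathrm{labels}_{i,c}\,\log \frac{e^{\mathrm{logits}_{i,c}}}{\sum_{c'} e^{\mathrm{logits}_{i,c'}}}$$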
If using exclusive `labels` (wherein one and only one class is true at a time), see `org.tensorflow.op.NnOps#sparseSoftmaxCrossEntropyWithLogits`, as sketched below.
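A minimal sketch of that sparse variant, assuming an in-scope `Ops tf` instance and that your TensorFlow Java version exposes the `NnOps` helper with this argument order (check the generated docs for the exact signature and return type):

```
// Sketch only: integer class ids replace the per-row probability distributions.
Operand<TInt32> classIds = tf.constant(new int[] {0, 1});   // one class id per example
Operand<TFloat32> logits =
    tf.constant(new float[][] { {4.0F, 2.0F, 1.0F}, {0.0F, 5.0F, 1.0F} });
Operand<TFloat32> loss =
    tf.nn.sparseSoftmaxCrossEntropyWithLogits(classIds, logits);
```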
Usage:

```
Operand<TFloat32> logits =
    tf.constant(new float[][] { {4.0F, 2.0F, 1.0F}, {0.0F, 5.0F, 1.0F} });
Operand<TFloat32> labels =
    tf.constant(new float[][] { {1.0F, 0.0F, 0.0F}, {0.0F, 0.8F, 0.2F} });
Operand<TFloat32> output =
    tf.nn.softmaxCrossEntropyWithLogits(labels, logits, -1);
// output shape = [2]
// dataType     = FLOAT (1)
// values       = { 0.169846, 0.824745 }
```
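For intuition, the first output value can be reproduced by hand (a check added here, not part of the generated docs):

```
softmax([4, 2, 1]) = [e^4, e^2, e^1] / (e^4 + e^2 + e^1)
                   ≈ [0.8438, 0.1142, 0.0420]
loss[0] = -(1.0 * log(0.8438) + 0.0 + 0.0) ≈ 0.1698   // matches 0.169846 above
```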
Backpropagation will happen into both `logits` and `labels`. To disallow backpropagation into `labels`, pass the label tensors through `tf.stopGradient` before feeding them to this function, as in the sketch below.
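A minimal sketch of that pattern, reusing `labels` and `logits` from the usage example above (`tf.stopGradient` is the generated wrapper for the StopGradient op; verify the name against your version):

```
// Freeze the label distribution so no gradient flows into it.
Operand<TFloat32> frozenLabels = tf.stopGradient(labels);
Operand<TFloat32> loss =
    tf.nn.softmaxCrossEntropyWithLogits(frozenLabels, logits, -1);
```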
Parameters
scope |
current scope |
labels |
Each vector along the class dimension should hold a valid probability
distribution e.g. for the case in which labels are of shape [batch_size, num_classes]
, each row of labels[i] must be a valid probability distribution. |
logits |
Per-label activations, typically a linear output. These activation energies are
interpreted as unnormalized log probabilities. |
axis |
The class dimension. -1 is the last dimension. |
##### Returns

- the softmax cross entropy loss. Its type is the same as `logits` and its shape is the same as `labels` except that it does not have the last dimension of `labels`.
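For completeness, a self-contained sketch of calling this op eagerly. It assumes a TensorFlow Java artifact such as `tensorflow-core-platform` on the classpath; the classes and methods used outside this page (`EagerSession`, `Ops.create`, `asTensor`, `getFloat`) come from the broader TF Java API and should be checked against your version:

```java
import org.tensorflow.EagerSession;
import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.types.TFloat32;

public class SoftmaxXentExample {
  public static void main(String[] args) {
    try (EagerSession session = EagerSession.create()) {
      Ops tf = Ops.create(session);

      Operand<TFloat32> logits =
          tf.constant(new float[][] { {4.0F, 2.0F, 1.0F}, {0.0F, 5.0F, 1.0F} });
      Operand<TFloat32> labels =
          tf.constant(new float[][] { {1.0F, 0.0F, 0.0F}, {0.0F, 0.8F, 0.2F} });

      // One loss value per example row; -1 selects the last (class) dimension.
      Operand<TFloat32> loss =
          tf.nn.softmaxCrossEntropyWithLogits(labels, logits, -1);

      System.out.println(loss.asTensor().getFloat(0)); // ~0.169846
      System.out.println(loss.asTensor().getFloat(1)); // ~0.824745
    }
  }
}
```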