SparseSoftmaxCrossEntropyWithLogits
Inherited Methods

From class java.lang.Object

boolean | equals(Object arg0)
final Class<?> | getClass()
int | hashCode()
final void | notify()
final void | notifyAll()
String | toString()
final void | wait(long arg0, int arg1)
final void | wait(long arg0)
final void | wait()
Public Constructors
public SparseSoftmaxCrossEntropyWithLogits()
Public Methods
public static <T extends TNumber, U extends TNumber> Operand sparseSoftmaxCrossEntropyWithLogits(Scope scope, Operand<T> labels, Operand<U> logits)
Computes sparse softmax cross entropy between logits and labels.
Measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is labeled with one and only one label: an image can be a dog or a truck, but not both.

NOTE: For this operation, the probability of a given label is considered exclusive. That is, soft classes are not allowed, and the labels vector must provide a single specific index for the true class for each row of logits (each minibatch entry). For soft softmax classification with a probability distribution for each entry, use NnOps#softmaxCrossEntropyWithLogits instead.

WARNING: This op expects unscaled logits, since it performs a softmax on logits internally for efficiency. Do not call this op with the output of softmax, as it will produce incorrect results.

A common use case is to have logits of shape [batchSize, numClasses] and labels of shape [batchSize], but higher dimensions are supported, in which case the dim-th dimension is assumed to be of size numClasses. logits must have a dataType of TFloat16, TFloat32, or TFloat64, and labels must have a dtype of TInt32 or TInt64.
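To make the computation concrete, here is a plain-Java sketch of the per-row value this op produces: for each minibatch entry, the loss is the negative log of the softmax probability assigned to the true class. The class and method names below are hypothetical illustrations, not part of the TensorFlow API, and the real op runs on tensors inside a graph rather than on Java arrays:

```java
// Plain-Java sketch (hypothetical names): for each row i,
// loss[i] = -log(softmax(logits[i])[labels[i]]).
public class SparseXentSketch {

    static double[] sparseSoftmaxCrossEntropy(double[][] logits, int[] labels) {
        double[] loss = new double[labels.length];
        for (int i = 0; i < labels.length; i++) {
            // Subtract the row max before exponentiating, for numerical stability.
            double max = Double.NEGATIVE_INFINITY;
            for (double v : logits[i]) max = Math.max(max, v);
            double sumExp = 0.0;
            for (double v : logits[i]) sumExp += Math.exp(v - max);
            // -log softmax at the true class index, computed in log space.
            loss[i] = -(logits[i][labels[i]] - max - Math.log(sumExp));
        }
        return loss;
    }

    public static void main(String[] args) {
        // logits: [batchSize=2, numClasses=3]; labels: [batchSize=2].
        double[][] logits = {{2.0, 1.0, 0.1}, {0.0, 0.0, 0.0}};
        int[] labels = {0, 2};
        double[] loss = sparseSoftmaxCrossEntropy(logits, labels);
        // A uniform row of logits gives loss = log(numClasses) = log(3).
        System.out.printf("%.4f %.4f%n", loss[0], loss[1]);
    }
}
```

Note that passing already-softmaxed values through this sketch would apply softmax twice, which is exactly the incorrect-result scenario the WARNING above describes.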
Parameters

scope | current scope
labels | Tensor of shape [d_0, d_1, ..., d_{r-1}] (where r is the rank of labels and of the result) with dataType TInt32 or TInt64. Each entry in labels must be an index in [0, numClasses). Other values will raise an exception when this op is run on CPU, and return NaN for the corresponding loss and gradient rows on GPU.
logits | Per-label activations (typically a linear output) of shape [d_0, d_1, ..., d_{r-1}, numClasses] with dataType TFloat16, TFloat32, or TFloat64. These activation energies are interpreted as unnormalized log probabilities.
Returns

A Tensor of the same shape as labels and of the same type as logits, containing the softmax cross entropy loss.
Throws

IllegalArgumentException | If logits are scalars (they must have rank >= 1) or if the rank of labels is not equal to the rank of logits minus one.
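The rank constraint above can be sketched as an up-front check. This is a hypothetical helper for illustration only (checkRanks is not part of the TensorFlow API), operating on plain shape arrays under the assumption that shapes are fully known:

```java
// Hypothetical pre-check mirroring the documented IllegalArgumentException:
// logits must have rank >= 1, and labels rank must equal logits rank - 1.
public class RankCheck {

    static void checkRanks(long[] logitsShape, long[] labelsShape) {
        if (logitsShape.length < 1) {
            throw new IllegalArgumentException(
                "logits cannot be a scalar; rank must be >= 1");
        }
        if (labelsShape.length != logitsShape.length - 1) {
            throw new IllegalArgumentException(
                "labels rank (" + labelsShape.length
                    + ") must equal logits rank - 1 ("
                    + (logitsShape.length - 1) + ")");
        }
    }

    public static void main(String[] args) {
        // OK: logits [batchSize, numClasses] with labels [batchSize].
        checkRanks(new long[]{32, 10}, new long[]{32});
        try {
            // Rejected: labels must not carry the numClasses dimension.
            checkRanks(new long[]{32, 10}, new long[]{32, 10});
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```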
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2021-11-29 UTC.