tensorflow::ops::SparseSoftmaxCrossEntropyWithLogits

#include <nn_ops.h>
  Computes softmax cross entropy cost and gradients to backpropagate.
Summary
Unlike SoftmaxCrossEntropyWithLogits, this operation does not accept a matrix of label probabilities, but rather a single label per row of features. This label is considered to have probability 1.0 for the given row.
Inputs are the logits, not probabilities.
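For reference, the per-row computation is the standard softmax cross entropy; the notation below is illustrative and does not appear in the header. For a logits row x_i and integer label y_i:

```latex
% Illustrative notation, not taken from nn_ops.h.
% Softmax probabilities for row i:
p_{i,c} = \frac{e^{x_{i,c}}}{\sum_{c'} e^{x_{i,c'}}}
% Per-example loss (the `loss` output):
\mathrm{loss}_i = -\log p_{i,\,y_i}
% Gradient w.r.t. features (the `backprop` output):
\mathrm{backprop}_{i,c} = p_{i,c} - \mathbf{1}[c = y_i]
```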
Args:
- scope: A Scope object.
- features: batch_size x num_classes matrix.
- labels: batch_size vector with values in [0, num_classes). This is the label for the given minibatch entry.
Returns:
- Output loss: Per-example loss (batch_size vector).
- Output backprop: Backpropagated gradients (batch_size x num_classes matrix).
Constructors and Destructors

- SparseSoftmaxCrossEntropyWithLogits(const ::tensorflow::Scope & scope, ::tensorflow::Input features, ::tensorflow::Input labels)
Public attributes

- backprop
- loss
- operation
Public functions

SparseSoftmaxCrossEntropyWithLogits

SparseSoftmaxCrossEntropyWithLogits(const ::tensorflow::Scope & scope, ::tensorflow::Input features, ::tensorflow::Input labels)
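A minimal usage sketch, assuming the standard TensorFlow C++ client API (Scope, Const, ClientSession); the input values and graph setup are made up for illustration, only the SparseSoftmaxCrossEntropyWithLogits call itself comes from this header:

```cpp
#include "tensorflow/cc/client/client_session.h"
#include "tensorflow/cc/ops/standard_ops.h"
#include "tensorflow/core/framework/tensor.h"

int main() {
  using namespace tensorflow;
  using namespace tensorflow::ops;

  Scope root = Scope::NewRootScope();

  // batch_size = 2, num_classes = 3: raw logits, not probabilities.
  auto features = Const(root, {{1.0f, 2.0f, 0.5f},
                               {0.1f, 0.2f, 3.0f}});
  // One integer class label per row, each in [0, num_classes).
  auto labels = Const(root, {1, 2});

  auto xent = SparseSoftmaxCrossEntropyWithLogits(root, features, labels);

  ClientSession session(root);
  std::vector<Tensor> outputs;
  // xent.loss is a batch_size vector; xent.backprop is a
  // batch_size x num_classes matrix of gradients w.r.t. features.
  TF_CHECK_OK(session.Run({xent.loss, xent.backprop}, &outputs));

  LOG(INFO) << "loss: " << outputs[0].DebugString();
  LOG(INFO) << "backprop: " << outputs[1].DebugString();
  return 0;
}
```

The constructed object exposes the two results as the loss and backprop attributes listed above, and the underlying node via operation.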