tensorflow::ops::Softmax

#include <nn_ops.h>

  Computes softmax activations.
Summary
For each batch `i` and class `j` we have
$$\mathrm{softmax}[i, j] = \frac{\exp(\mathrm{logits}[i, j])}{\sum_j \exp(\mathrm{logits}[i, j])}$$
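As a quick numeric check (not part of the original page), a single batch row with logits [1, 2, 3] gives

$$\mathrm{softmax}([1, 2, 3]) = \left[\tfrac{e^1}{e^1+e^2+e^3},\ \tfrac{e^2}{e^1+e^2+e^3},\ \tfrac{e^3}{e^1+e^2+e^3}\right] \approx [0.090,\ 0.245,\ 0.665]$$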
Args:

- scope: A Scope object
- logits: 2-D with shape [batch_size, num_classes].
Returns:

- Output: Same shape as logits.
| Constructors and Destructors | |
|---|---|
| Softmax(const ::tensorflow::Scope & scope, ::tensorflow::Input logits) | |
| Public attributes | |
|---|---|
| operation | Operation |
| softmax | ::tensorflow::Output |
| Public functions | |
|---|---|
| node() const | ::tensorflow::Node * |
| operator::tensorflow::Input() const | |
| operator::tensorflow::Output() const | |
Public attributes

operation

Operation operation

softmax

::tensorflow::Output softmax

Public functions

node

::tensorflow::Node * node() const

operator::tensorflow::Input

operator::tensorflow::Input() const

operator::tensorflow::Output

operator::tensorflow::Output() const
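A short sketch of how these members are typically used, not from the original page: the ops::Log op and the logits values are illustrative only.

```cpp
#include "tensorflow/cc/framework/scope.h"
#include "tensorflow/cc/ops/const_op.h"
#include "tensorflow/cc/ops/math_ops.h"
#include "tensorflow/cc/ops/nn_ops.h"

void BuildLogSoftmax() {
  tensorflow::Scope root = tensorflow::Scope::NewRootScope();
  auto logits = tensorflow::ops::Const(root, {{0.5f, 1.5f}});

  tensorflow::ops::Softmax sm(root, logits);

  // operator ::tensorflow::Input lets the op object be fed straight into
  // a downstream op without naming its softmax output explicitly.
  auto log_probs = tensorflow::ops::Log(root, sm);

  // node() exposes the underlying graph node, e.g. to inspect its name.
  ::tensorflow::Node* softmax_node = sm.node();
  (void)softmax_node;
  (void)log_probs;
}
```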