Scaled Exponential Linear Unit (SELU).
The Scaled Exponential Linear Unit (SELU) activation function is defined as:
if x > 0: return scale * x
if x < 0: return scale * alpha * (exp(x) - 1)
where alpha and scale are pre-defined constants (alpha=1.67326324 and scale=1.05070098).
 
Essentially, the SELU activation function multiplies scale (> 1) with the output
 of the ELU function to ensure a slope larger than one for positive inputs.
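For reference, here is a minimal scalar sketch of the formula in plain Java (a standalone illustration with hypothetical class and method names, not part of this class's API):

```java
public final class SeluScalar {
  static final double ALPHA = 1.67326324;
  static final double SCALE = 1.05070098;

  // Element-wise SELU: scale * x for x > 0, scale * alpha * (exp(x) - 1) otherwise.
  static double selu(double x) {
    return x > 0 ? SCALE * x : SCALE * ALPHA * Math.expm1(x);
  }

  public static void main(String[] args) {
    System.out.println(selu(1.0));   // ~1.0507 (slope larger than one for positive inputs)
    System.out.println(selu(-1.0));  // ~-1.1113
  }
}
```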
 
The values of alpha and scale are chosen so that the mean and variance of the inputs
 are preserved between two consecutive layers, as long as the weights are initialized
 correctly (see the LeCun initializer with a normal distribution) and the number of
 input units is "large enough".
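The plain-Java sketch below illustrates that property under stated assumptions: it initializes one dense layer with LeCun-normal weights (stddev = sqrt(1/fanIn)), feeds standard-normal inputs through SELU, and prints the output mean and variance, which should stay close to 0 and 1 for a sufficiently wide layer. The class and variable names are illustrative only and independent of this API.

```java
import java.util.Random;

public class SeluSelfNormalizationDemo {
  static final double ALPHA = 1.67326324;
  static final double SCALE = 1.05070098;

  static double selu(double x) {
    return x > 0 ? SCALE * x : SCALE * ALPHA * Math.expm1(x);
  }

  public static void main(String[] args) {
    int fanIn = 1024, fanOut = 1024, samples = 50;
    Random rnd = new Random(42);

    // LeCun normal initialization: weights drawn from N(0, 1 / fanIn).
    double stddev = Math.sqrt(1.0 / fanIn);
    double[][] w = new double[fanOut][fanIn];
    for (double[] row : w)
      for (int j = 0; j < fanIn; j++) row[j] = rnd.nextGaussian() * stddev;

    double sum = 0, sumSq = 0;
    long n = 0;
    for (int s = 0; s < samples; s++) {
      // Standard-normal inputs (mean 0, variance 1).
      double[] x = new double[fanIn];
      for (int j = 0; j < fanIn; j++) x[j] = rnd.nextGaussian();

      // One dense layer followed by SELU.
      for (int i = 0; i < fanOut; i++) {
        double z = 0;
        for (int j = 0; j < fanIn; j++) z += w[i][j] * x[j];
        double a = selu(z);
        sum += a;
        sumSq += a * a;
        n++;
      }
    }
    double mean = sum / n;
    double var = sumSq / n - mean * mean;
    // Expect values near 0 and 1, i.e. the input statistics are preserved.
    System.out.printf("output mean=%.3f variance=%.3f%n", mean, var);
  }
}
```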
 
Notes: To be used together with the LeCun initializer with a normal distribution.
Public Constructors
| Constructor | Description |
|---|---|
| SELU(Ops tf) | Creates a Scaled Exponential Linear Unit (SELU) activation. |
Public Methods
| Return | Method |
|---|---|
| Operand&lt;T&gt; | call(Operand&lt;T&gt; input) |
Public Constructors
public SELU (Ops tf)
Creates a Scaled Exponential Linear Unit (SELU) activation.
Parameters
| Parameter | Description |
|---|---|
| tf | the TensorFlow Ops |
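A hedged usage sketch follows. It assumes the TensorFlow Java framework packages (org.tensorflow.framework.activations.SELU, org.tensorflow.op.Ops) and the call(Operand&lt;T&gt; input) method inherited from the Activation base class; the generics and eager-mode setup are assumptions rather than a verbatim excerpt of this API.

```java
import org.tensorflow.Operand;
import org.tensorflow.framework.activations.SELU;
import org.tensorflow.op.Ops;
import org.tensorflow.types.TFloat32;

public class SeluUsage {
  public static void main(String[] args) {
    // Eager-mode Ops instance.
    Ops tf = Ops.create();
    // Construct the activation with the TensorFlow Ops.
    SELU<TFloat32> selu = new SELU<>(tf);
    Operand<TFloat32> input = tf.constant(new float[] {-1f, 0f, 1f});
    // Applies SELU element-wise; signature assumed from the Activation base class.
    Operand<TFloat32> output = selu.call(input);
    // output ≈ [-1.111, 0.0, 1.051]
  }
}
```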