Exponential Linear Unit.
```python
tf.keras.activations.elu(
    x, alpha=1.0
)
```
The exponential linear unit (ELU) with $\alpha > 0$ is defined as:

$$
\mathrm{elu}(x) =
\begin{cases}
x & \text{if } x > 0 \\
\alpha \, (e^{x} - 1) & \text{if } x < 0
\end{cases}
$$
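For example, the activation can be applied directly to a tensor; a minimal sketch (the input values are arbitrary illustrations):

```python
import tensorflow as tf

# Apply ELU element-wise; alpha defaults to 1.0.
x = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0])
y = tf.keras.activations.elu(x)
print(y.numpy())  # negative inputs map to alpha * (exp(x) - 1)
```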
ELUs have negative values, which push the mean of the activations closer to zero. Mean activations closer to zero enable faster learning because they bring the gradient closer to the natural gradient. ELUs saturate to a negative value as the argument becomes more negative; this saturation yields a small derivative, which reduces the variation and the information propagated to the next layer.
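The saturation behavior is easy to verify numerically. Below is a minimal sketch that reproduces the piecewise definition with `tf.where` and compares it with the built-in activation; the helper name `ref_elu` is hypothetical:

```python
import tensorflow as tf

def ref_elu(x, alpha=1.0):
    # Hypothetical reference implementation of the piecewise definition above.
    return tf.where(x > 0, x, alpha * (tf.exp(x) - 1.0))

x = tf.constant([-10.0, -5.0, -1.0, 0.5, 2.0])
print(ref_elu(x).numpy())                    # large negative inputs saturate near -alpha
print(tf.keras.activations.elu(x).numpy())   # matches the built-in activation
```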
Args | |
---|---|
`x` | Input tensor. |
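In practice the activation is usually attached to a layer, e.g. by passing the string `'elu'`; a minimal sketch (the layer sizes and input shape are arbitrary):

```python
import tensorflow as tf

# Hypothetical model: sizes and layer choices are illustrative only.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='elu', input_shape=(32,)),
    tf.keras.layers.Dense(10),
])
model.summary()
```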