Computes elementwise softplus: softplus(x) = log(exp(x) + 1).
tf.compat.v1.math.softplus(
    features, name=None
)
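As a quick check of the formula above, a direct evaluation of log(exp(x) + 1) agrees with tf.math.softplus for moderate inputs; a minimal sketch (tf.math.softplus uses a numerically stable implementation, so prefer it over the literal formula for large magnitudes):

import tensorflow as tf

x = tf.constant([0.0, 1.0])
# Literal formula; fine for moderate x, but exp can overflow for large x.
tf.math.log(tf.math.exp(x) + 1.0).numpy()
# array([0.6931472, 1.3132616], dtype=float32)
# The built-in op gives the same values here.
tf.math.softplus(x).numpy()
# array([0.6931472, 1.3132616], dtype=float32)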
softplus is a smooth approximation of relu. Like relu, softplus always takes on positive values.
Example:
import tensorflow as tf
tf.math.softplus(tf.range(0, 2, dtype=tf.float32)).numpy()
array([0.6931472, 1.3132616], dtype=float32)
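To illustrate the relationship with relu described above, the sketch below evaluates both on the same inputs (the sample values are illustrative): softplus stays strictly positive and smooth, while relu is exactly zero for negative inputs and has a corner at zero. For large positive inputs the two nearly coincide, which is what makes softplus a smooth stand-in for relu.

import tensorflow as tf

x = tf.constant([-2.0, 0.0, 2.0, 10.0])
tf.math.softplus(x).numpy()   # ~[0.127, 0.693, 2.127, 10.0000454]; always > 0
tf.nn.relu(x).numpy()         # [0., 0., 2., 10.]; exactly 0 for negative inputs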
Args:
  features: A Tensor.
  name: Optional: name to associate with this operation.

Returns:
  A Tensor.
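A minimal usage sketch showing the optional name argument (the op name string below is illustrative, not from the original docs); the returned Tensor has the same shape and dtype as features:

import tensorflow as tf

x = tf.constant([[-1.0, 0.5], [1.0, 2.0]])
# "softplus_example" is an illustrative label for the operation only.
y = tf.math.softplus(x, name="softplus_example")
print(y.shape, y.dtype)  # (2, 2) <dtype: 'float32'> -- matches the input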