Module: tf.compat.v1.keras.activations
Built-in activation functions.
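These activations can be attached to a layer either as a callable or by string identifier, and they can also be applied directly to tensors. A minimal sketch (layer sizes and input values are illustrative; assumes TensorFlow 2.x with eager execution, where the tf.compat.v1 names alias the same functions as tf.keras.activations):

```python
import tensorflow as tf

# An activation can be passed to a layer as a built-in callable
# or by its equivalent string identifier.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation=tf.keras.activations.relu),
    tf.keras.layers.Dense(64, activation='relu'),   # same activation, string form
    tf.keras.layers.Dense(10, activation='softmax'),
])

# Activations are also plain element-wise functions that can be called directly.
x = tf.constant([-2.0, 0.0, 2.0])
print(tf.keras.activations.relu(x).numpy())  # [0. 0. 2.]
```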
Functions
deserialize(...): Returns an activation function given its string identifier.
elu(...): Exponential Linear Unit.
exponential(...): Exponential activation function.
get(...): Returns an activation function given an identifier (a string name, a callable, or None).
hard_sigmoid(...): Hard sigmoid activation function.
linear(...): Linear activation function (pass-through).
relu(...): Applies the rectified linear unit activation function.
selu(...): Scaled Exponential Linear Unit (SELU).
serialize(...): Returns the string identifier of an activation function.
sigmoid(...): Sigmoid activation function, sigmoid(x) = 1 / (1 + exp(-x)).
softmax(...): Softmax converts a real vector to a vector of categorical probabilities.
softplus(...): Softplus activation function, softplus(x) = log(exp(x) + 1).
softsign(...): Softsign activation function, softsign(x) = x / (abs(x) + 1).
swish(...): Swish activation function, swish(x) = x * sigmoid(x).
tanh(...): Hyperbolic tangent activation function.
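The element-wise definitions quoted above can be checked numerically, and serialize, deserialize, and get convert between string identifiers and the callables themselves. A minimal sketch (sample inputs are illustrative; assumes TensorFlow 2.x eager execution, with the tf.compat.v1 names aliasing the same functions as tf.keras.activations):

```python
import numpy as np
import tensorflow as tf

x = tf.constant([-1.0, 0.0, 1.0])
xn = x.numpy()

# Each activation should match its closed-form expression from the list above.
np.testing.assert_allclose(
    tf.keras.activations.sigmoid(x).numpy(), 1 / (1 + np.exp(-xn)), rtol=1e-5)
np.testing.assert_allclose(
    tf.keras.activations.softplus(x).numpy(), np.log(np.exp(xn) + 1), rtol=1e-5)
np.testing.assert_allclose(
    tf.keras.activations.softsign(x).numpy(), xn / (np.abs(xn) + 1), rtol=1e-5)
np.testing.assert_allclose(
    tf.keras.activations.swish(x).numpy(), xn * (1 / (1 + np.exp(-xn))), rtol=1e-5)

# softmax expects an input of rank >= 2 and normalizes each row to probabilities.
probs = tf.keras.activations.softmax(tf.constant([[1.0, 2.0, 3.0]]))
print(probs.numpy().sum())  # ~1.0

# get / serialize / deserialize round-trip between names and callables.
fn = tf.keras.activations.get('tanh')       # string identifier -> callable
print(tf.keras.activations.serialize(fn))   # callable -> 'tanh'
print(tf.keras.activations.deserialize('tanh')(x).numpy())
```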