SELU
Scaled Exponential Linear Unit (SELU).
The Scaled Exponential Linear Unit (SELU) activation function is defined as:
if x > 0: return scale * x
if x < 0: return scale * alpha * (exp(x) - 1)
where alpha and scale are pre-defined constants (alpha=1.67326324 and scale=1.05070098).
The SELU activation function multiplies scale (> 1) with the output of the ELU function to ensure a slope larger than one for positive inputs.
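For illustration, here is a minimal, self-contained Java sketch of the formula above. The class and method names are hypothetical, and the use of Math.expm1 is my own choice; the constants are the ones defined on this page.

```java
public final class SeluFormula {
  // Pre-defined constants from the definition above.
  private static final double ALPHA = 1.67326324;
  private static final double SCALE = 1.05070098;

  // Scalar SELU: scale * x for x > 0, scale * alpha * (exp(x) - 1) otherwise.
  // Math.expm1(x) computes exp(x) - 1 with better precision near zero.
  static double selu(double x) {
    return x > 0 ? SCALE * x : SCALE * ALPHA * Math.expm1(x);
  }

  public static void main(String[] args) {
    System.out.println(selu(1.0));   // 1.05070098 (scale * 1)
    System.out.println(selu(-1.0));  // approx. -1.1113 (scale * alpha * (e^-1 - 1))
  }
}
```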
The values of alpha and scale are chosen so that the mean and variance of the inputs are preserved between two consecutive layers, as long as the weights are initialized correctly (see the LeCun initializer with Normal Distribution) and the number of input units is "large enough".
Note: To be used together with the LeCun initializer with Normal Distribution.
See Also
- Klambauer et al., 2017 (https://arxiv.org/abs/1706.02515)
Public Constructors
SELU(Ops tf)
Creates a Scaled Exponential Linear Unit (SELU) activation.
Public Methods
Operand<T> | call(Operand<T> input)
Gets the calculation operation for the activation.
Inherited Methods
From class org.tensorflow.framework.activations.Activation
abstract Operand<T> | call(Operand<T> input)
Gets the calculation operation for the activation.
From class java.lang.Object
boolean | equals(Object arg0)
final Class<?> | getClass()
int | hashCode()
final void | notify()
final void | notifyAll()
String | toString()
final void | wait(long arg0, int arg1)
final void | wait(long arg0)
final void | wait()
Public Constructors
public SELU (Ops tf)
Creates a Scaled Exponential Linear Unit (SELU) activation.
Parameters
tf | the TensorFlow Ops
Public Methods
public Operand<T> call (Operand<T> input)
Gets the calculation operation for the activation.
Parameters
input | the input tensor
Returns
- The operand for the activation
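A brief usage sketch of the class documented here. The eager-session setup, the TFloat32 type parameter on SELU, and the example input values are assumptions for illustration; this page does not show the generic bounds of the class.

```java
import org.tensorflow.EagerSession;
import org.tensorflow.Operand;
import org.tensorflow.framework.activations.SELU;
import org.tensorflow.op.Ops;
import org.tensorflow.types.TFloat32;

public class SeluUsage {
  public static void main(String[] args) {
    try (EagerSession session = EagerSession.create()) {
      Ops tf = Ops.create(session);

      // Create the activation with the TensorFlow Ops (see the constructor above).
      // The TFloat32 type parameter is an assumption for this sketch.
      SELU<TFloat32> selu = new SELU<>(tf);

      // Apply the activation to an input tensor; values chosen for illustration.
      Operand<TFloat32> input = tf.constant(new float[] {-1.0f, 0.0f, 1.0f});
      Operand<TFloat32> output = selu.call(input);
    }
  }
}
```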