ELU
Exponential linear unit.

The exponential linear unit (ELU) with `alpha > 0` is: `x` if `x > 0` and `alpha * (exp(x) - 1)` if `x < 0`.
The ELU hyperparameter `alpha` controls the value to which an ELU saturates for negative net inputs. ELUs diminish the vanishing gradient effect.

ELUs have negative values, which pushes the mean of the activations closer to zero. Mean activations that are closer to zero enable faster learning, as they bring the gradient closer to the natural gradient. ELUs saturate to a negative value as the argument gets smaller. Saturation means a small derivative, which decreases the variation and the information propagated to the next layer.
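The piecewise definition above can be sketched in plain Java, independent of the TensorFlow API; `EluDemo` and `elu` are illustrative names, not members of the library:

```java
// Plain-Java sketch of the ELU formula described above (not the TensorFlow API).
public class EluDemo {

    // ELU: x if x > 0, and alpha * (exp(x) - 1) if x < 0.
    public static double elu(double x, double alpha) {
        return x > 0 ? x : alpha * (Math.exp(x) - 1);
    }

    public static void main(String[] args) {
        System.out.println(elu(2.0, 1.0));   // positive inputs pass through unchanged
        System.out.println(elu(-8.0, 2.0));  // deeply negative inputs saturate near -alpha
        System.out.println(elu(-0.5, 1.0));  // small negative inputs stay negative but bounded
    }
}
```

Note how the output for `elu(-8.0, 2.0)` is close to `-2.0`: this is the saturation to `-alpha` for negative net inputs that the paragraph above describes.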
Example Usage:

```
Operand<TFloat32> input = ...;
ELU<TFloat32> elu = new ELU<>(tf, 2.0f);
Operand<TFloat32> result = elu.call(input);
```

See Also:

- Clevert et al., 2016, Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs): https://arxiv.org/abs/1511.07289
Public Constructors

|                           |                                               |
|---------------------------|-----------------------------------------------|
| ELU(Ops tf)               | Creates a new ELU with alpha=ALPHA_DEFAULT.   |
| ELU(Ops tf, double alpha) | Creates a new ELU.                            |
Inherited Methods

From class org.tensorflow.framework.activations.Activation

|                     |                                                                            |
|---------------------|----------------------------------------------------------------------------|
| abstract Operand<T> | call(Operand<T> input) Gets the calculation operation for the activation.  |

From class java.lang.Object

|                |---------------------------|
| boolean        | equals(Object arg0)       |
| final Class<?> | getClass()                |
| int            | hashCode()                |
| final void     | notify()                  |
| final void     | notifyAll()               |
| String         | toString()                |
| final void     | wait(long arg0, int arg1) |
| final void     | wait(long arg0)           |
| final void     | wait()                    |
Public Constructors

#### public ELU (Ops tf)

Creates a new ELU with alpha=ALPHA_DEFAULT.

Parameters

| tf | the TensorFlow Ops |
|----|--------------------|

#### public ELU (Ops tf, double alpha)

Creates a new ELU.

Parameters

| tf    | the TensorFlow Ops                                                                                            |
|-------|---------------------------------------------------------------------------------------------------------------|
| alpha | A scalar, slope of negative section. It controls the value to which an ELU saturates for negative net inputs. |
Public Methods

#### public Operand<T> call (Operand<T> input)

Gets the calculation operation for the activation.

Parameters

| input | the input tensor |
|-------|------------------|

Returns

- The operand for the activation
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2021-11-29 UTC.