ReLU
public class ReLU

Rectified Linear Unit (ReLU) activation.
With default values, this returns the standard ReLU activation: max(x, 0), the element-wise maximum of 0 and the input tensor.
Modifying the default parameters allows you to use a non-zero threshold, change the maximum value of the activation, and use a non-zero multiple of the input for values below the threshold.
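Put another way, with slope alpha, saturation value maxValue, and threshold t, the activation computes the following piecewise rule (a sketch implied by the parameter descriptions; when maxValue is NaN, no upper cap is applied):

$$
f(x) = \begin{cases}
\text{maxValue} & x \ge \text{maxValue} \\
x & t \le x < \text{maxValue} \\
\alpha\,(x - t) & x < t
\end{cases}
$$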
For example:
Operand<TFloat32> input = tf.constant(
    new float[] {-10f, -5f, 0.0f, 5f, 10f});

// With default parameters
ReLU<TFloat32> relu = new ReLU<>(tf);
Operand<TFloat32> result = relu.call(input);
// result is [0.f, 0.f, 0.f, 5.f, 10.f]

// With alpha = 0.5
relu = new ReLU<>(tf, 0.5f, ReLU.MAX_VALUE_DEFAULT, ReLU.THRESHOLD_DEFAULT);
result = relu.call(input);
// result is [-5.f, -2.5f, 0.f, 5.f, 10.f]

// With maxValue = 5
relu = new ReLU<>(tf, ReLU.ALPHA_DEFAULT, 5f, ReLU.THRESHOLD_DEFAULT);
result = relu.call(input);
// result is [0.f, 0.f, 0.f, 5.f, 5.f]

// With threshold = 5
relu = new ReLU<>(tf, ReLU.ALPHA_DEFAULT, ReLU.MAX_VALUE_DEFAULT, 5f);
result = relu.call(input);
// result is [-0.f, -0.f, 0.f, 0.f, 10.f]
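The behavior above can be reproduced with plain Java, without a TensorFlow dependency. The class and method names below (`ReluSketch`, `relu`) are illustrative helpers, not part of the TensorFlow Java API; this is a minimal sketch of the parameterized rule under the assumption that maxValue = NaN means "no upper cap":

```java
public class ReluSketch {

    // Parameterized ReLU: saturates at maxValue, is the identity between
    // threshold and maxValue, and scales by alpha below the threshold.
    static float relu(float x, float alpha, float maxValue, float threshold) {
        if (!Float.isNaN(maxValue) && x >= maxValue) {
            return maxValue;                 // saturate at maxValue
        }
        if (x >= threshold) {
            return x;                        // identity above the threshold
        }
        return alpha * (x - threshold);      // scaled below the threshold
    }

    public static void main(String[] args) {
        float[] input = {-10f, -5f, 0.0f, 5f, 10f};
        // alpha = 0.5, no cap, threshold = 0;
        // matches the alpha = 0.5 example: [-5.f, -2.5f, 0.f, 5.f, 10.f]
        for (float x : input) {
            System.out.print(relu(x, 0.5f, Float.NaN, 0f) + " ");
        }
        System.out.println();
    }
}
```

Note that with a non-zero threshold and alpha = 0, negative inputs map to -0.f rather than 0.f, which explains the signed zeros in the last example above.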
Public Constructors

| Constructor | Description |
|---|---|
| ReLU(Ops tf) | Creates a new ReLU with alpha=ALPHA_DEFAULT, maxValue=MAX_VALUE_DEFAULT, threshold=THRESHOLD_DEFAULT |
| ReLU(Ops tf, float alpha, float maxValue, float threshold) | Creates a new ReLU |
Inherited Methods

From class org.tensorflow.framework.activations.Activation

| Return type | Method |
|---|---|
| abstract Operand\<T\> | call(Operand\<T\> input) Gets the calculation operation for the activation. |

From class java.lang.Object

| Return type | Method |
|---|---|
| boolean | equals(Object arg0) |
| final Class\<?\> | getClass() |
| int | hashCode() |
| final void | notify() |
| final void | notifyAll() |
| String | toString() |
| final void | wait(long arg0, int arg1) |
| final void | wait(long arg0) |
| final void | wait() |
Constants

public static final float ALPHA_DEFAULT

Constant Value: 0.0

public static final float MAX_VALUE_DEFAULT

Constant Value: NaN

public static final float THRESHOLD_DEFAULT

Constant Value: 0.0
Public Constructors

public ReLU (Ops tf)

Creates a new ReLU with alpha=ALPHA_DEFAULT, maxValue=MAX_VALUE_DEFAULT, threshold=THRESHOLD_DEFAULT.

Parameters

| Parameter | Description |
|---|---|
| tf | the TensorFlow Ops |

public ReLU (Ops tf, float alpha, float maxValue, float threshold)

Creates a new ReLU.

Parameters

| Parameter | Description |
|---|---|
| tf | the TensorFlow Ops |
| alpha | governs the slope for values lower than the threshold |
| maxValue | sets the saturation threshold (the largest value the function will return) |
| threshold | the threshold value of the activation function below which values will be damped or set to zero |
Public Methods

public Operand\<T\> call (Operand\<T\> input)

Gets the calculation operation for the activation.

Parameters

| Parameter | Description |
|---|---|
| input | the input tensor |

Returns

- The operand for the activation
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2021-11-29 UTC.