Swish
Swish activation function: swish(x) = x * sigmoid(x).

The Swish activation returns x * sigmoid(x). It is a smooth, non-monotonic function that consistently matches or outperforms ReLU on deep networks. It is unbounded above and bounded below.
Example Usage:

    Operand<TFloat32> input = tf.constant(new float[] {-20f, -1.0f, 0.0f, 1.0f, 20f});
    Swish<TFloat32> swish = new Swish<>(tf);
    Operand<TFloat32> result = swish.call(input);
    // result = [-4.1223075e-08f, -2.6894143e-01f, 0.0000000e+00f,
    //           7.3105860e-01f, 2.0000000e+01f ]

See Also:
    Ramachandran et al., 2017: https://arxiv.org/abs/1710.05941
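As a quick sanity check of the example above, the same values can be reproduced from raw math ops. The sketch below is illustrative only, not part of this API: it assumes an eager execution environment, the standard org.tensorflow.op math methods mul and sigmoid, and a made-up class name (SwishCheck).

    import org.tensorflow.EagerSession;
    import org.tensorflow.Operand;
    import org.tensorflow.framework.activations.Swish;
    import org.tensorflow.op.Ops;
    import org.tensorflow.types.TFloat32;

    public class SwishCheck {
      public static void main(String[] args) {
        // Eager session so operands can be read back as tensors directly.
        try (EagerSession session = EagerSession.create()) {
          Ops tf = Ops.create(session);
          Operand<TFloat32> input = tf.constant(new float[] {-20f, -1.0f, 0.0f, 1.0f, 20f});

          // Swish activation from the framework class documented on this page.
          Swish<TFloat32> swish = new Swish<>(tf);
          Operand<TFloat32> result = swish.call(input);

          // Element-wise equivalent built from raw ops: x * sigmoid(x).
          Operand<TFloat32> manual = tf.math.mul(input, tf.math.sigmoid(input));

          // Both should print roughly -0.26894143 for x = -1.0 (index 1).
          System.out.println(result.asTensor().getFloat(1));
          System.out.println(manual.asTensor().getFloat(1));
        }
      }
    }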
Public Constructors

    Swish(Ops tf)
        Creates a Swish activation, swish(x) = x * sigmoid(x).

Public Methods

    Operand<T> call(Operand<T> input)
        Gets the calculation operation for the activation.
Inherited Methods

From class org.tensorflow.framework.activations.Activation

    abstract Operand<T> call(Operand<T> input)
        Gets the calculation operation for the activation.

From class java.lang.Object

    boolean equals(Object arg0)
    final Class<?> getClass()
    int hashCode()
    final void notify()
    final void notifyAll()
    String toString()
    final void wait(long arg0, int arg1)
    final void wait(long arg0)
    final void wait()
Public Constructors

public Swish(Ops tf)

Creates a Swish activation, swish(x) = x * sigmoid(x).

The Swish activation returns x * sigmoid(x). It is a smooth, non-monotonic function that consistently matches or outperforms ReLU on deep networks. It is unbounded above and bounded below.

Parameters:
    tf - the TensorFlow Ops
Public Methods

public Operand<T> call(Operand<T> input)

Gets the calculation operation for the activation.

Parameters:
    input - the input tensor

Returns:
    the operand for the activation
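For context, here is a hedged sketch of how call is typically chained after another computation, in this case the output of a matrix multiply. The shapes, values, and the surrounding class (SwishAfterMatMul) are illustrative assumptions, not part of this API.

    import org.tensorflow.EagerSession;
    import org.tensorflow.Operand;
    import org.tensorflow.framework.activations.Swish;
    import org.tensorflow.op.Ops;
    import org.tensorflow.types.TFloat32;

    public class SwishAfterMatMul {
      public static void main(String[] args) {
        try (EagerSession session = EagerSession.create()) {
          Ops tf = Ops.create(session);

          // Illustrative 1x2 feature row and 2x2 weight matrix.
          Operand<TFloat32> features = tf.constant(new float[][] {{1.0f, 3.0f}});
          Operand<TFloat32> weights = tf.constant(new float[][] {{0.5f, -1.0f}, {2.0f, 0.25f}});

          // Pre-activation values: features x weights.
          Operand<TFloat32> preActivation = tf.linalg.matMul(features, weights);

          // Apply the Swish activation element-wise to the pre-activation values.
          Swish<TFloat32> swish = new Swish<>(tf);
          Operand<TFloat32> activated = swish.call(preActivation);

          System.out.println(activated.shape());  // expected shape: [1, 2]
        }
      }
    }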
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2021-11-29 UTC."],[],[],null,["# Swish\n\npublic class **Swish** \nSwish activation function. `swish(x) = x * sigmoid(x)`.\n\nSwish activation function which returns `x*sigmoid(x)`. It is a smooth,\nnon-monotonic function that consistently matches or outperforms `ReLU` on deep\nnetworks, it is unbounded above and bounded below.\n\nExample Usage:\n\n```\n Operand\u003cTFloat32\u003e input = tf.constant(new float[]\n {-20, -1.0, 0.0, 1.0, 20});\n Swish\u003cTFloat32\u003e swish = new Swish\u003c\u003e(tf);\n Operand\u003cTFloat32\u003e result = swish.call(input);\n // result = [-4.1223075e-08f, -2.6894143e-01f, 0.0000000e+00f,\n // 7.3105860e-01f, 2.0000000e+01f ]\n\n \n```\n\n\u003cbr /\u003e\n\n##### See Also\n\n- [Ramachandran et al., 2017](https://arxiv.org/abs/1710.05941)\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n### Public Constructors\n\n|---|----------------------------------------------------------------------------------------------------------------------------------------------------|\n| | [Swish](/jvm/api_docs/java/org/tensorflow/framework/activations/Swish#Swish(Ops))(Ops tf) Creates a Swish activation, `swish(x) = x * sigmoid(x)`. |\n\n### Public Methods\n\n|-----------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n| [Operand](/jvm/api_docs/java/org/tensorflow/Operand)\\\u003cT\\\u003e | [call](/jvm/api_docs/java/org/tensorflow/framework/activations/Swish#call(org.tensorflow.Operand\u003cT\u003e))([Operand](/jvm/api_docs/java/org/tensorflow/Operand)\\\u003cT\\\u003e input) Gets the calculation operation for the activation. |\n\n### Inherited Methods\n\nFrom class [org.tensorflow.framework.activations.Activation](/jvm/api_docs/java/org/tensorflow/framework/activations/Activation) \n\n|--------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n| abstract [Operand](/jvm/api_docs/java/org/tensorflow/Operand)\\\u003cT\\\u003e | [call](/jvm/api_docs/java/org/tensorflow/framework/activations/Activation#call(org.tensorflow.Operand\u003cT\u003e))([Operand](/jvm/api_docs/java/org/tensorflow/Operand)\\\u003cT\\\u003e input) Gets the calculation operation for the activation. 
|\n\nFrom class java.lang.Object \n\n|------------------|---------------------------|\n| boolean | equals(Object arg0) |\n| final Class\\\u003c?\\\u003e | getClass() |\n| int | hashCode() |\n| final void | notify() |\n| final void | notifyAll() |\n| String | toString() |\n| final void | wait(long arg0, int arg1) |\n| final void | wait(long arg0) |\n| final void | wait() |\n\nPublic Constructors\n-------------------\n\n#### public\n**Swish**\n(Ops tf)\n\nCreates a Swish activation, `swish(x) = x * sigmoid(x)`.\n\nSwish activation function which returns `x*sigmoid(x)`. It is a smooth,\nnon-monotonic function that consistently matches or outperforms ReLU on deep networks, it is\nunbounded above and bounded below.\n\n\u003cbr /\u003e\n\n##### Parameters\n\n| tf | the TensorFlow Ops |\n|----|--------------------|\n\nPublic Methods\n--------------\n\n#### public [Operand](/jvm/api_docs/java/org/tensorflow/Operand)\\\u003cT\\\u003e\n**call**\n([Operand](/jvm/api_docs/java/org/tensorflow/Operand)\\\u003cT\\\u003e input)\n\nGets the calculation operation for the activation. \n\n##### Parameters\n\n| input | the input tensor |\n|-------|------------------|\n\n##### Returns\n\n- The operand for the activation"]]