ResourceApplyAdadelta
Update '*var' according to the adadelta scheme.
accum = rho() * accum + (1 - rho()) * grad.square();
update = (update_accum + epsilon()).sqrt() * (accum + epsilon()).rsqrt() * grad;
update_accum = rho() * update_accum + (1 - rho()) * update.square();
var -= update;
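The update scheme above can be sketched in plain Java for a single scalar variable. This is a minimal illustration of the math, not the TensorFlow kernel; applying `lr` as a multiplier on the update is an assumption based on its description below as a scaling factor (the pseudocode itself reads `var -= update`).

```java
// Plain-Java sketch of one Adadelta step for a single scalar variable.
// Field names mirror the op's inputs; illustration only, not the kernel.
public class AdadeltaSketch {
    double var;         // the variable being optimized ('*var')
    double accum;       // running average of squared gradients
    double accumUpdate; // running average of squared updates
    final double lr, rho, epsilon;

    public AdadeltaSketch(double initialVar, double lr, double rho, double epsilon) {
        this.var = initialVar;
        this.lr = lr;
        this.rho = rho;
        this.epsilon = epsilon;
    }

    /** Applies one Adadelta step for the given gradient; returns the new var. */
    public double step(double grad) {
        accum = rho * accum + (1 - rho) * grad * grad;
        double update = Math.sqrt(accumUpdate + epsilon)
                      / Math.sqrt(accum + epsilon) * grad;
        accumUpdate = rho * accumUpdate + (1 - rho) * update * update;
        // The scheme reads "var -= update"; lr is documented as a scaling
        // factor, so it is applied here (lr = 1.0 reproduces the text).
        var -= lr * update;
        return var;
    }
}
```

With a fresh state (both accumulators at zero), the first step moves the variable opposite the gradient by roughly `sqrt(epsilon) / sqrt((1 - rho) * grad^2)`.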
Constants

| String | OP_NAME | The name of this op, as known by the TensorFlow core engine |
Inherited Methods

From class java.lang.Object

| boolean        | equals(Object arg0)       |
| final Class<?> | getClass()                |
| int            | hashCode()                |
| final void     | notify()                  |
| final void     | notifyAll()               |
| String         | toString()                |
| final void     | wait(long arg0, int arg1) |
| final void     | wait(long arg0)           |
| final void     | wait()                    |
Constants

public static final String OP_NAME

The name of this op, as known by the TensorFlow core engine.

Constant Value: "ResourceApplyAdadelta"
Public Methods

public static <T extends TType> ResourceApplyAdadelta create (Scope scope, Operand<?> var, Operand<?> accum, Operand<?> accumUpdate, Operand<T> lr, Operand<T> rho, Operand<T> epsilon, Operand<T> grad, Options... options)

Factory method to create a class wrapping a new ResourceApplyAdadelta operation.

Parameters

| scope       | current scope                      |
| var         | Should be from a Variable().       |
| accum       | Should be from a Variable().       |
| accumUpdate | Should be from a Variable().       |
| lr          | Scaling factor. Must be a scalar.  |
| rho         | Decay factor. Must be a scalar.    |
| epsilon     | Constant factor. Must be a scalar. |
| grad        | The gradient.                      |
| options     | carries optional attribute values  |

Returns

- a new instance of ResourceApplyAdadelta
public static ResourceApplyAdadelta.Options useLocking (Boolean useLocking)

Parameters

| useLocking | If True, updating of the var, accum and update_accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention. |
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2021-11-29 UTC.