Warning: This API is deprecated and will be removed in a future version of TensorFlow after the replacement is stable.
ResourceApplyAdagradV2
public final class ResourceApplyAdagradV2
Update '*var' according to the adagrad scheme.
accum += grad * grad
var -= lr * grad * (1 / (sqrt(accum) + epsilon))
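For intuition, here is a minimal scalar sketch of the same update in plain Java. It is illustrative only and not part of this API: the values are made up, and the op itself applies this arithmetic element-wise to the var, accum, and grad tensors.

// Scalar illustration of the Adagrad update performed by this op (hypothetical values).
public final class AdagradScalarExample {
  public static void main(String[] args) {
    double var = 1.0, accum = 0.1;                 // parameter value and its accumulator
    double grad = 0.5, lr = 0.01, epsilon = 1e-7;  // gradient, scaling factor, constant factor
    accum += grad * grad;                          // accum = 0.1 + 0.25 = 0.35
    var -= lr * grad * (1.0 / (Math.sqrt(accum) + epsilon));
    System.out.println(var);                       // prints roughly 0.99155
  }
}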
Inherited Methods
From class org.tensorflow.op.PrimitiveOp
final boolean | equals(Object obj)
final int | hashCode()
Operation | op() Returns the underlying Operation
final String | toString()
From class java.lang.Object
boolean | equals(Object arg0)
final Class<?> | getClass()
int | hashCode()
final void | notify()
final void | notifyAll()
String | toString()
final void | wait(long arg0, int arg1)
final void | wait(long arg0)
final void | wait()
Public Methods
public static <T> ResourceApplyAdagradV2 create (Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> epsilon, Operand<T> grad, Options... options)
Factory method to create a class wrapping a new ResourceApplyAdagradV2 operation.
Parameters
scope | current scope
var | Should be from a Variable().
accum | Should be from a Variable().
lr | Scaling factor. Must be a scalar.
epsilon | Constant factor. Must be a scalar.
grad | The gradient.
options | carries optional attribute values
Returns
- a new instance of ResourceApplyAdagradV2
public static ResourceApplyAdagradV2.Options updateSlots (Boolean updateSlots)
public static ResourceApplyAdagradV2.Options useLocking (Boolean useLocking)
Parameters
useLocking | If `True`, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.
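As a hedged usage sketch (not from the official docs): the helper below only exercises the create factory and the useLocking option documented above. How the Scope and the resource-variable operands are obtained depends on the surrounding graph-construction code, so the method and its parameters here are hypothetical placeholders.

import org.tensorflow.Operand;
import org.tensorflow.op.Scope;
import org.tensorflow.op.core.ResourceApplyAdagradV2;

// Sketch: wires one Adagrad update step using the raw op's factory method.
final class AdagradV2Step {
  // scope, var, accum, lr, epsilon and grad are assumed to be built elsewhere;
  // var and accum should be resource handles coming from a Variable().
  static <T> ResourceApplyAdagradV2 apply(
      Scope scope,
      Operand<?> var,
      Operand<?> accum,
      Operand<T> lr,       // scalar scaling factor (learning rate)
      Operand<T> epsilon,  // scalar constant factor
      Operand<T> grad) {   // the gradient
    // useLocking(true) asks for the var/accum update to be protected by a lock.
    return ResourceApplyAdagradV2.create(
        scope, var, accum, lr, epsilon, grad,
        ResourceApplyAdagradV2.useLocking(true));
  }
}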
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2022-02-12 UTC."],[],[],null,["# ResourceApplyAdagradV2\n\npublic final class **ResourceApplyAdagradV2** \nUpdate '\\*var' according to the adagrad scheme.\n\n\naccum += grad \\* grad\nvar -= lr \\* grad \\* (1 / (sqrt(accum) + epsilon))\n\n\u003cbr /\u003e\n\n### Nested Classes\n\n|-------|---|---|----------------------------------------------------------------------------------------------------------------|\n| class | [ResourceApplyAdagradV2.Options](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2.Options) || Optional attributes for [ResourceApplyAdagradV2](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2) |\n\n### Public Methods\n\n|---------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n| static \\\u003cT\\\u003e [ResourceApplyAdagradV2](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2) | [create](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2#create(org.tensorflow.op.Scope,%20org.tensorflow.Operand\u003c?\u003e,%20org.tensorflow.Operand\u003c?\u003e,%20org.tensorflow.Operand\u003cT\u003e,%20org.tensorflow.Operand\u003cT\u003e,%20org.tensorflow.Operand\u003cT\u003e,%20org.tensorflow.op.core.ResourceApplyAdagradV2.Options...))([Scope](/api_docs/java/org/tensorflow/op/Scope) scope, [Operand](/api_docs/java/org/tensorflow/Operand)\\\u003c?\\\u003e var, [Operand](/api_docs/java/org/tensorflow/Operand)\\\u003c?\\\u003e accum, [Operand](/api_docs/java/org/tensorflow/Operand)\\\u003cT\\\u003e lr, [Operand](/api_docs/java/org/tensorflow/Operand)\\\u003cT\\\u003e epsilon, [Operand](/api_docs/java/org/tensorflow/Operand)\\\u003cT\\\u003e grad, [Options...](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2.Options) options) Factory method to create a class wrapping a new ResourceApplyAdagradV2 operation. 
|\n| static [ResourceApplyAdagradV2.Options](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2.Options) | [updateSlots](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2#updateSlots(java.lang.Boolean))(Boolean updateSlots) |\n| static [ResourceApplyAdagradV2.Options](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2.Options) | [useLocking](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2#useLocking(java.lang.Boolean))(Boolean useLocking) |\n\n### Inherited Methods\n\nFrom class [org.tensorflow.op.PrimitiveOp](/api_docs/java/org/tensorflow/op/PrimitiveOp) \n\n|------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------|\n| final boolean | [equals](/api_docs/java/org/tensorflow/op/PrimitiveOp#equals(java.lang.Object))(Object obj) |\n| final int | [hashCode](/api_docs/java/org/tensorflow/op/PrimitiveOp#hashCode())() |\n| [Operation](/api_docs/java/org/tensorflow/Operation) | [op](/api_docs/java/org/tensorflow/op/PrimitiveOp#op())() Returns the underlying [Operation](/api_docs/java/org/tensorflow/Operation) |\n| final String | [toString](/api_docs/java/org/tensorflow/op/PrimitiveOp#toString())() |\n\nFrom class java.lang.Object \n\n|------------------|---------------------------|\n| boolean | equals(Object arg0) |\n| final Class\\\u003c?\\\u003e | getClass() |\n| int | hashCode() |\n| final void | notify() |\n| final void | notifyAll() |\n| String | toString() |\n| final void | wait(long arg0, int arg1) |\n| final void | wait(long arg0) |\n| final void | wait() |\n\nPublic Methods\n--------------\n\n#### public static [ResourceApplyAdagradV2](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2)\n**create**\n([Scope](/api_docs/java/org/tensorflow/op/Scope) scope, [Operand](/api_docs/java/org/tensorflow/Operand)\\\u003c?\\\u003e var, [Operand](/api_docs/java/org/tensorflow/Operand)\\\u003c?\\\u003e accum, [Operand](/api_docs/java/org/tensorflow/Operand)\\\u003cT\\\u003e lr, [Operand](/api_docs/java/org/tensorflow/Operand)\\\u003cT\\\u003e epsilon, [Operand](/api_docs/java/org/tensorflow/Operand)\\\u003cT\\\u003e grad, [Options...](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2.Options) options)\n\nFactory method to create a class wrapping a new ResourceApplyAdagradV2 operation. \n\n##### Parameters\n\n| scope | current scope |\n| var | Should be from a Variable(). |\n| accum | Should be from a Variable(). |\n| lr | Scaling factor. Must be a scalar. |\n| epsilon | Constant factor. Must be a scalar. |\n| grad | The gradient. |\n| options | carries optional attributes values |\n|---------|------------------------------------|\n\n##### Returns\n\n- a new instance of ResourceApplyAdagradV2 \n\n#### public static [ResourceApplyAdagradV2.Options](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2.Options)\n**updateSlots**\n(Boolean updateSlots)\n\n\u003cbr /\u003e\n\n#### public static [ResourceApplyAdagradV2.Options](/api_docs/java/org/tensorflow/op/core/ResourceApplyAdagradV2.Options)\n**useLocking**\n(Boolean useLocking)\n\n\u003cbr /\u003e\n\n##### Parameters\n\n| useLocking | If \\`True\\`, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention. 
|\n|------------|-------------------------------------------------------------------------------------------------------------------------------------------------------|"]]