Warning: This API is deprecated and will be removed in a future version of TensorFlow after the replacement is stable.
ResourceApplyAdamWithAmsgrad
Update '*var' according to the Adam algorithm.
$$\text{lr}_t := \mathrm{learning\_rate} * \sqrt{1 - \beta_2^t} / (1 - \beta_1^t)$$
$$m_t := \beta_1 * m_{t-1} + (1 - \beta_1) * g$$
$$v_t := \beta_2 * v_{t-1} + (1 - \beta_2) * g * g$$
$$\hat{v}_t := \max(\hat{v}_{t-1}, v_t)$$
$$\text{variable} := \text{variable} - \text{lr}_t * m_t / (\sqrt{\hat{v}_t} + \epsilon)$$
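For intuition, here is a minimal plain-Java sketch of the per-element update described by the formulas above. It is illustrative only: the method name and signature are hypothetical, the real op applies the same arithmetic element-wise to whole tensors, and `beta1Power` and `beta2Power` correspond to \(\beta_1^t\) and \(\beta_2^t\) at the current step.

```java
// Hypothetical scalar version of the AMSGrad update; the actual op updates
// the variable, m, v, and vhat tensors in place.
static double[] amsgradStep(double variable, double m, double v, double vhat,
                            double grad, double lr, double beta1, double beta2,
                            double epsilon, double beta1Power, double beta2Power) {
  double lrT = lr * Math.sqrt(1 - beta2Power) / (1 - beta1Power); // bias-corrected rate
  m = beta1 * m + (1 - beta1) * grad;          // first-moment (momentum) estimate
  v = beta2 * v + (1 - beta2) * grad * grad;   // second-moment estimate
  vhat = Math.max(vhat, v);                    // AMSGrad: running maximum of v
  variable = variable - lrT * m / (Math.sqrt(vhat) + epsilon);
  return new double[] {variable, m, v, vhat};  // updated state
}
```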
Public Methods

| Return type | Method and description |
|---|---|
| static <T> ResourceApplyAdamWithAmsgrad | create(Scope scope, Operand<?> var, Operand<?> m, Operand<?> v, Operand<?> vhat, Operand<T> beta1Power, Operand<T> beta2Power, Operand<T> lr, Operand<T> beta1, Operand<T> beta2, Operand<T> epsilon, Operand<T> grad, Options... options) Factory method to create a class wrapping a new ResourceApplyAdamWithAmsgrad operation. |
| static ResourceApplyAdamWithAmsgrad.Options | useLocking(Boolean useLocking) |
Inherited Methods

From class java.lang.Object

| Return type | Method |
|---|---|
| boolean | equals(Object arg0) |
| final Class<?> | getClass() |
| int | hashCode() |
| final void | notify() |
| final void | notifyAll() |
| String | toString() |
| final void | wait(long arg0, int arg1) |
| final void | wait(long arg0) |
| final void | wait() |
Public Methods

public static <T> ResourceApplyAdamWithAmsgrad create(Scope scope, Operand<?> var, Operand<?> m, Operand<?> v, Operand<?> vhat, Operand<T> beta1Power, Operand<T> beta2Power, Operand<T> lr, Operand<T> beta1, Operand<T> beta2, Operand<T> epsilon, Operand<T> grad, Options... options)

Factory method to create a class wrapping a new ResourceApplyAdamWithAmsgrad operation.

Parameters

| Parameter | Description |
|---|---|
| scope | current scope |
| var | Should be from a Variable(). |
| m | Should be from a Variable(). |
| v | Should be from a Variable(). |
| vhat | Should be from a Variable(). |
| beta1Power | Must be a scalar. |
| beta2Power | Must be a scalar. |
| lr | Scaling factor. Must be a scalar. |
| beta1 | Momentum factor. Must be a scalar. |
| beta2 | Momentum factor. Must be a scalar. |
| epsilon | Ridge term. Must be a scalar. |
| grad | The gradient. |
| options | carries optional attribute values |

Returns

- a new instance of ResourceApplyAdamWithAmsgrad
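As a rough illustration, the sketch below shows how the factory method and the useLocking option might be wired together. It assumes the scope and all operands (var, m, v, vhat, the scalar hyperparameters, and grad) were created elsewhere in the graph; only the create(...) signature and the useLocking helper are taken from this page.

```java
import org.tensorflow.Operand;
import org.tensorflow.op.Scope;
import org.tensorflow.op.core.ResourceApplyAdamWithAmsgrad;

// Sketch only: the caller supplies a scope and operands built elsewhere.
class AmsgradExample {
  static <T> ResourceApplyAdamWithAmsgrad applyStep(
      Scope scope,
      Operand<?> var, Operand<?> m, Operand<?> v, Operand<?> vhat,
      Operand<T> beta1Power, Operand<T> beta2Power,
      Operand<T> lr, Operand<T> beta1, Operand<T> beta2, Operand<T> epsilon,
      Operand<T> grad) {
    return ResourceApplyAdamWithAmsgrad.create(
        scope,
        var, m, v, vhat,            // resource variables holding optimizer state
        beta1Power, beta2Power,     // scalars: beta1^t and beta2^t at the current step
        lr, beta1, beta2, epsilon,  // scalar hyperparameters
        grad,                       // gradient, same shape as var
        ResourceApplyAdamWithAmsgrad.useLocking(true)); // optional locking attribute
  }
}
```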
public static ResourceApplyAdamWithAmsgrad.Options useLocking(Boolean useLocking)

Parameters

| Parameter | Description |
|---|---|
| useLocking | If `True`, updating of the var, m, and v tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention. |
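For example, a caller that wants locked updates can build the Options value with the static helper above and pass it as the trailing argument of create(...):

```java
// Build the optional attribute and hand it to create(...) as the varargs tail.
ResourceApplyAdamWithAmsgrad.Options locked = ResourceApplyAdamWithAmsgrad.useLocking(true);
```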