Load FTRL embedding parameters.
An op that loads optimization parameters into HBM for embedding. Must be preceded by a ConfigureTPUEmbeddingHost op that sets up the correct embedding table configuration. For example, this op is used to install parameters that are loaded from a checkpoint before a training loop is executed.
Nested Classes
class | LoadTPUEmbeddingFTRLParameters.Options | Optional attributes for LoadTPUEmbeddingFTRLParameters |
Constants
String | OP_NAME | The name of this op, as known by the TensorFlow core engine |
Public Methods
static LoadTPUEmbeddingFTRLParameters.Options | config(String config) |
static LoadTPUEmbeddingFTRLParameters | create(Scope scope, Operand<TFloat32> parameters, Operand<TFloat32> accumulators, Operand<TFloat32> linears, Long numShards, Long shardId, Options... options) |
static LoadTPUEmbeddingFTRLParameters.Options | tableId(Long tableId) |
static LoadTPUEmbeddingFTRLParameters.Options | tableName(String tableName) |
Inherited Methods
boolean | equals(Object arg0) |
final Class<?> | getClass() |
int | hashCode() |
final void | notify() |
final void | notifyAll() |
String | toString() |
final void | wait(long arg0, int arg1) |
final void | wait(long arg0) |
final void | wait() |
abstract ExecutionEnvironment | env() Return the execution environment this op was created in. |
abstract Operation | op() Return this unit of computation as a single Operation. |
Constants

public static final String OP_NAME

The name of this op, as known by the TensorFlow core engine.
Public Methods
public static LoadTPUEmbeddingFTRLParameters create (Scope scope, Operand<TFloat32> parameters, Operand<TFloat32> accumulators, Operand<TFloat32> linears, Long numShards, Long shardId, Options... options)
Factory method to create a class wrapping a new LoadTPUEmbeddingFTRLParameters operation.
Parameters
scope | current scope |
parameters | Value of parameters used in the FTRL optimization algorithm. |
accumulators | Value of accumulators used in the FTRL optimization algorithm. |
linears | Value of linears used in the FTRL optimization algorithm. |
options | carries optional attribute values |
Returns
- a new instance of LoadTPUEmbeddingFTRLParameters
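As a sketch of how the factory method and its optional attributes fit together, the following assumes the `tensorflow-core-api` Java bindings, that the embedding configuration has already been applied (e.g. by a preceding ConfigureTPUEmbeddingHost op), and that a table named `"my_ftrl_table"` exists; the tensor values and shapes are placeholders standing in for checkpoint-restored data:

```java
import org.tensorflow.Graph;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Constant;
import org.tensorflow.op.tpu.LoadTPUEmbeddingFTRLParameters;
import org.tensorflow.types.TFloat32;

public class LoadFtrlExample {
  public static void main(String[] args) {
    try (Graph g = new Graph()) {
      Ops tf = Ops.create(g);

      // Placeholder values standing in for tensors restored from a checkpoint.
      // Shapes must match the embedding table configured earlier.
      Constant<TFloat32> parameters =
          tf.constant(new float[][] {{0.1f, 0.2f}, {0.3f, 0.4f}});
      Constant<TFloat32> accumulators =
          tf.constant(new float[][] {{0.0f, 0.0f}, {0.0f, 0.0f}});
      Constant<TFloat32> linears =
          tf.constant(new float[][] {{0.0f, 0.0f}, {0.0f, 0.0f}});

      // numShards/shardId describe how the table is partitioned across hosts;
      // the tableName option selects which configured table to load into.
      // "my_ftrl_table" is a hypothetical name for illustration only.
      LoadTPUEmbeddingFTRLParameters.create(
          tf.scope(),
          parameters,
          accumulators,
          linears,
          1L,  // numShards
          0L,  // shardId
          LoadTPUEmbeddingFTRLParameters.tableName("my_ftrl_table"));
    }
  }
}
```

The op has no outputs; running it (on a TPU host, after configuration) installs the parameter, accumulator, and linear tensors into the embedding table in HBM, which is why it is typically executed once between checkpoint restore and the start of the training loop.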