Load FTRL embedding parameters with debug support.
An op that loads optimization parameters into HBM for embedding. Must be preceded by a ConfigureTPUEmbeddingHost op that sets up the correct embedding table configuration. For example, this op is used to install parameters that are loaded from a checkpoint before a training loop is executed.
Nested Classes
| Class | Description |
|---|---|
| LoadTPUEmbeddingFTRLParametersGradAccumDebug.Options | Optional attributes for LoadTPUEmbeddingFTRLParametersGradAccumDebug |
Constants
| Type | Name | Description |
|---|---|---|
| String | OP_NAME | The name of this op, as known by the TensorFlow core engine |
Public Methods
| Return type | Method |
|---|---|
| static LoadTPUEmbeddingFTRLParametersGradAccumDebug.Options | config(String config) |
| static LoadTPUEmbeddingFTRLParametersGradAccumDebug | create(Scope scope, Operand<TFloat32> parameters, Operand<TFloat32> accumulators, Operand<TFloat32> linears, Operand<TFloat32> gradientAccumulators, Long numShards, Long shardId, Options... options)<br>Factory method to create a class wrapping a new LoadTPUEmbeddingFTRLParametersGradAccumDebug operation. |
| static LoadTPUEmbeddingFTRLParametersGradAccumDebug.Options | tableId(Long tableId) |
| static LoadTPUEmbeddingFTRLParametersGradAccumDebug.Options | tableName(String tableName) |
Inherited Methods
Constants
public static final String OP_NAME
The name of this op, as known by the TensorFlow core engine
Public Methods
public static LoadTPUEmbeddingFTRLParametersGradAccumDebug create (Scope scope, Operand<TFloat32> parameters, Operand<TFloat32> accumulators, Operand<TFloat32> linears, Operand<TFloat32> gradientAccumulators, Long numShards, Long shardId, Options... options)
Factory method to create a class wrapping a new LoadTPUEmbeddingFTRLParametersGradAccumDebug operation.
Parameters
| Parameter | Description |
|---|---|
| scope | current scope |
| parameters | Value of parameters used in the FTRL optimization algorithm. |
| accumulators | Value of accumulators used in the FTRL optimization algorithm. |
| linears | Value of linears used in the FTRL optimization algorithm. |
| gradientAccumulators | Value of gradient_accumulators used in the FTRL optimization algorithm. |
| options | carries optional attribute values |
Returns
- a new instance of LoadTPUEmbeddingFTRLParametersGradAccumDebug
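As a usage sketch, the factory method might be called as below. This assumes a graph whose embedding configuration has already been set up by a ConfigureTPUEmbeddingHost op, and uses placeholders to stand in for checkpoint-restored FTRL slot variables; the variable names, the two-shard configuration, and the table name "my_ftrl_table" are illustrative assumptions, not part of the API.

```java
import org.tensorflow.Graph;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Placeholder;
import org.tensorflow.op.tpu.LoadTPUEmbeddingFTRLParametersGradAccumDebug;
import org.tensorflow.types.TFloat32;

public class LoadFtrlParametersSketch {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);

      // Placeholders standing in for FTRL slot values restored from a checkpoint.
      Placeholder<TFloat32> parameters = tf.placeholder(TFloat32.class);
      Placeholder<TFloat32> accumulators = tf.placeholder(TFloat32.class);
      Placeholder<TFloat32> linears = tf.placeholder(TFloat32.class);
      Placeholder<TFloat32> gradientAccumulators = tf.placeholder(TFloat32.class);

      // Install the FTRL parameters (plus gradient accumulators) into HBM
      // for shard 0 of an assumed two-shard embedding configuration.
      LoadTPUEmbeddingFTRLParametersGradAccumDebug.create(
          tf.scope(),
          parameters,
          accumulators,
          linears,
          gradientAccumulators,
          2L,  // numShards (assumed)
          0L,  // shardId (assumed)
          LoadTPUEmbeddingFTRLParametersGradAccumDebug.tableName("my_ftrl_table"));
    }
  }
}
```

Running this op has no output tensors; it is executed for its side effect of writing the parameter values into HBM before the training loop starts.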