Module: tfm.optimization.optimizer_factory
Optimizer factory class.
Classes

class OptimizerFactory: Optimizer factory class.
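The factory is driven by a tfm.optimization.OptimizationConfig. The sketch below follows the usage pattern from the class documentation; the 'sgd', 'stepwise', and 'linear' keys come from the registries listed under Other Members, and the hyperparameter values are placeholders.

```python
import tensorflow_models as tfm

# Config keys ('sgd', 'stepwise', 'linear') come from the optimizer,
# learning-rate, and warmup registries listed under Other Members below.
params = {
    'optimizer': {
        'type': 'sgd',
        'sgd': {'momentum': 0.9},
    },
    'learning_rate': {
        'type': 'stepwise',
        'stepwise': {
            'boundaries': [10000, 20000],
            'values': [0.1, 0.01, 0.001],
        },
    },
    'warmup': {
        'type': 'linear',
        'linear': {'warmup_steps': 500, 'warmup_learning_rate': 0.0},
    },
}

opt_config = tfm.optimization.OptimizationConfig(params)
factory = tfm.optimization.OptimizerFactory(opt_config)

# Build the schedule first, then the optimizer that consumes it.
lr = factory.build_learning_rate()
optimizer = factory.build_optimizer(lr)
```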
Functions

register_optimizer_cls(...): Registers a custom optimizer class.
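A minimal sketch of registering a custom optimizer under a new key. MySGD is a hypothetical subclass used only for illustration; note that a matching config dataclass is still needed before OptimizerFactory can build the new optimizer from an OptimizationConfig.

```python
import tensorflow as tf
import tensorflow_models as tfm

# Hypothetical optimizer subclass, for illustration only.
class MySGD(tf.keras.optimizers.legacy.SGD):
  """SGD variant with project-specific defaults."""

# Register under a new key so OptimizationConfig can refer to it by name;
# re-registering a key that already exists raises an error.
tfm.optimization.optimizer_factory.register_optimizer_cls('my_sgd', MySGD)
```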
Other Members

LEGACY_OPTIMIZERS_CLS
{
'adafactor': 'Unimplemented',
'adagrad': <class 'keras.src.optimizers.legacy.adagrad.Adagrad'>,
'adam': <class 'keras.src.optimizers.legacy.adam.Adam'>,
'adam_experimental': <class 'keras.src.optimizers.adam.Adam'>,
'adamw': <class 'official.modeling.optimization.legacy_adamw.AdamWeightDecay'>,
'adamw_experimental': <class 'keras.src.optimizers.adamw.AdamW'>,
'lamb': <class 'official.modeling.optimization.lamb.LAMB'>,
'lars': <class 'official.modeling.optimization.lars.LARS'>,
'rmsprop': <class 'keras.src.optimizers.legacy.rmsprop.RMSprop'>,
'sgd': <class 'keras.src.optimizers.legacy.gradient_descent.SGD'>,
'sgd_experimental': <class 'keras.src.optimizers.sgd.SGD'>,
'slide': 'Unimplemented'
}

LR_CLS
{
'cosine': <class 'official.modeling.optimization.lr_schedule.CosineDecayWithOffset'>,
'exponential': <class 'official.modeling.optimization.lr_schedule.ExponentialDecayWithOffset'>,
'polynomial': <class 'official.modeling.optimization.lr_schedule.PolynomialDecayWithOffset'>,
'power': <class 'official.modeling.optimization.lr_schedule.DirectPowerDecay'>,
'power_linear': <class 'official.modeling.optimization.lr_schedule.PowerAndLinearDecay'>,
'power_with_offset': <class 'official.modeling.optimization.lr_schedule.PowerDecayWithOffset'>,
'step_cosine_with_offset': <class 'official.modeling.optimization.lr_schedule.StepCosineDecayWithOffset'>,
'stepwise': <class 'official.modeling.optimization.lr_schedule.PiecewiseConstantDecayWithOffset'>
}

NEW_OPTIMIZERS_CLS
{
'adafactor': 'Unimplemented',
'adagrad': <class 'keras.src.optimizers.adagrad.Adagrad'>,
'adam': <class 'keras.src.optimizers.adam.Adam'>,
'adam_experimental': <class 'keras.src.optimizers.adam.Adam'>,
'adamw': <class 'official.modeling.optimization.legacy_adamw.AdamWeightDecay'>,
'adamw_experimental': <class 'keras.src.optimizers.adamw.AdamW'>,
'lamb': <class 'official.modeling.optimization.lamb.LAMB'>,
'lars': <class 'official.modeling.optimization.lars.LARS'>,
'rmsprop': <class 'keras.src.optimizers.rmsprop.RMSprop'>,
'sgd': <class 'keras.src.optimizers.sgd.SGD'>,
'sgd_experimental': <class 'keras.src.optimizers.sgd.SGD'>,
'slide': 'Unimplemented'
}
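LEGACY_OPTIMIZERS_CLS and NEW_OPTIMIZERS_CLS map the same keys to the Keras legacy and non-legacy optimizer implementations, respectively. A minimal sketch of choosing between them, assuming build_optimizer exposes a use_legacy_optimizer flag (verify against your installed version's signature):

```python
import tensorflow_models as tfm

opt_config = tfm.optimization.OptimizationConfig({
    'optimizer': {'type': 'adam'},
    'learning_rate': {
        'type': 'exponential',
        'exponential': {
            'initial_learning_rate': 1e-3,
            'decay_steps': 10000,
            'decay_rate': 0.96,
        },
    },
})
factory = tfm.optimization.OptimizerFactory(opt_config)
lr = factory.build_learning_rate()

# use_legacy_optimizer selects the registry: True resolves 'adam' via
# LEGACY_OPTIMIZERS_CLS, False via NEW_OPTIMIZERS_CLS.
legacy_adam = factory.build_optimizer(lr, use_legacy_optimizer=True)
new_adam = factory.build_optimizer(lr, use_legacy_optimizer=False)
```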
SHARED_OPTIMIZERS
{
'adafactor': 'Unimplemented',
'adam_experimental': <class 'keras.src.optimizers.adam.Adam'>,
'adamw': <class 'official.modeling.optimization.legacy_adamw.AdamWeightDecay'>,
'adamw_experimental': <class 'keras.src.optimizers.adamw.AdamW'>,
'lamb': <class 'official.modeling.optimization.lamb.LAMB'>,
'lars': <class 'official.modeling.optimization.lars.LARS'>,
'sgd_experimental': <class 'keras.src.optimizers.sgd.SGD'>,
'slide': 'Unimplemented'
}

WARMUP_CLS
{
'linear': <class 'official.modeling.optimization.lr_schedule.LinearWarmup'>,
'polynomial': <class 'official.modeling.optimization.lr_schedule.PolynomialWarmUp'>
}
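The LR_CLS and WARMUP_CLS keys above are the accepted values of the 'type' field in the learning_rate and warmup config sections. A sketch pairing a cosine schedule with polynomial warmup; the parameter names follow the Model Garden config dataclasses and should be treated as assumptions:

```python
import tensorflow_models as tfm

# 'cosine' comes from LR_CLS, 'polynomial' from WARMUP_CLS.
opt_config = tfm.optimization.OptimizationConfig({
    'optimizer': {'type': 'adamw'},
    'learning_rate': {
        'type': 'cosine',
        'cosine': {'initial_learning_rate': 3e-4, 'decay_steps': 100000},
    },
    'warmup': {
        'type': 'polynomial',
        'polynomial': {'warmup_steps': 1000, 'power': 1.0},
    },
})
factory = tfm.optimization.OptimizerFactory(opt_config)
optimizer = factory.build_optimizer(factory.build_learning_rate())
```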