Module: tfm.optimization.opt_cfg
Dataclasses for optimizer configs.
Classes
class AdafactorConfig: Configuration for Adafactor optimizer.
class AdagradConfig: Configuration for Adagrad optimizer.
class AdamConfig: Configuration for Adam optimizer.
class AdamExperimentalConfig: Configuration for experimental Adam optimizer.
class AdamWeightDecayConfig: Configuration for Adam optimizer with weight decay.
class AdamWeightDecayExperimentalConfig: Configuration for Adam optimizer with weight decay.
class BaseOptimizerConfig: Base optimizer config.
class EMAConfig: Exponential moving average optimizer config.
class LAMBConfig: Configuration for LAMB optimizer.
class LARSConfig: Layer-wise adaptive rate scaling config.
class RMSPropConfig: Configuration for RMSProp optimizer.
class SGDConfig: Configuration for SGD optimizer.
class SGDExperimentalConfig: Configuration for SGD optimizer.
class SLIDEConfig: Configuration for SLIDE optimizer.
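
These configs are plain dataclasses, so they can be constructed directly with keyword arguments or nested inside an `OptimizationConfig` and materialized through `OptimizerFactory`. The sketch below is a minimal example, assuming the standard `import tensorflow_models as tfm` alias; the hyperparameter values are illustrative, and `OptimizationConfig` and `OptimizerFactory` are defined outside this module.

```python
import tensorflow_models as tfm

# Construct a config dataclass from this module directly; the
# hyperparameter values below are illustrative, not recommendations.
adam_cfg = tfm.optimization.opt_cfg.AdamConfig(
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)

# More commonly, an optimizer config is nested inside an
# OptimizationConfig alongside a learning-rate config, then turned
# into a concrete optimizer via OptimizerFactory (both live outside
# this module).
opt_config = tfm.optimization.OptimizationConfig({
    'optimizer': {'type': 'adam', 'adam': {'epsilon': 1e-07}},
    'learning_rate': {'type': 'constant',
                      'constant': {'learning_rate': 1e-3}},
})
factory = tfm.optimization.OptimizerFactory(opt_config)
optimizer = factory.build_optimizer(factory.build_learning_rate())
```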