Module: tfm.optimization.lr_cfg


Dataclasses for learning rate schedule config.

Classes

class ConstantLrConfig: Configuration for constant learning rate.

class CosineLrConfig: Configuration for cosine learning rate decay.

class DirectPowerLrConfig: Configuration for DirectPower learning rate decay.

class ExponentialLrConfig: Configuration for exponential learning rate decay.

class LinearWarmupConfig: Configuration for the linear warmup schedule.

class PolynomialLrConfig: Configuration for polynomial learning rate decay.

class PolynomialWarmupConfig: Configuration for the polynomial warmup schedule.

class PowerAndLinearDecayLrConfig: Configuration for power decay followed by linear learning rate decay.

class PowerDecayWithOffsetLrConfig: Configuration for power learning rate decay with step offset.

class StepCosineLrConfig: Configuration for stepwise cosine learning rate decay.

class StepwiseLrConfig: Configuration for stepwise learning rate decay.
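
These classes are plain dataclass containers; in practice they are combined into an OptimizationConfig, from which OptimizerFactory builds the concrete schedule and optimizer. Below is a minimal sketch. The field names (initial_learning_rate, decay_steps, warmup_steps, and so on) follow the Model Garden source at the time of writing and should be treated as assumptions that may differ across releases.

```python
import tensorflow_models as tfm

# Cosine decay from an initial rate of 0.1 over 10,000 steps.
cosine = tfm.optimization.CosineLrConfig(
    initial_learning_rate=0.1,
    decay_steps=10_000,
)

# Linear warmup over the first 500 steps, before the decay schedule takes over.
warmup = tfm.optimization.LinearWarmupConfig(
    warmup_learning_rate=0.0,
    warmup_steps=500,
)

# Combine the pieces into an OptimizationConfig and let OptimizerFactory
# build the actual learning rate schedule and optimizer objects.
opt_config = tfm.optimization.OptimizationConfig({
    'optimizer': {'type': 'sgd'},
    'learning_rate': {'type': 'cosine', 'cosine': cosine.as_dict()},
    'warmup': {'type': 'linear', 'linear': warmup.as_dict()},
})
factory = tfm.optimization.OptimizerFactory(opt_config)
lr_schedule = factory.build_learning_rate()
optimizer = factory.build_optimizer(lr_schedule)
```

Each `*LrConfig` class corresponds to one `type` key in the `learning_rate` oneof, and each `*WarmupConfig` to one `type` key in the `warmup` oneof; constructing the dataclasses directly, as above, is interchangeable with passing the equivalent nested dict.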
