tfm.optimization.LinearWarmup


Linear warmup schedule. During the first warmup_steps steps, the learning rate increases linearly from warmup_learning_rate to the rate that after_warmup_lr_sched produces at warmup_steps; from then on, after_warmup_lr_sched is used directly.

Main aliases: tfm.optimization.lr_schedule.LinearWarmup

tfm.optimization.LinearWarmup(
 after_warmup_lr_sched: Union[tf.keras.optimizers.schedules.LearningRateSchedule, float],
 warmup_steps: int,
 warmup_learning_rate: float,
 name: Optional[str] = None
)

Args

after_warmup_lr_sched A tf.keras.optimizers.schedules.LearningRateSchedule instance, or a constant learning rate (float), to use after the warmup period.
warmup_steps Number of warmup steps.
warmup_learning_rate Initial learning rate at the start of the warmup period.
name Optional name of the warmup schedule.
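
A minimal usage sketch. The CosineDecay schedule, the hyperparameter values, and the optimizer choice below are illustrative assumptions, not part of this reference:

import tensorflow as tf
import tensorflow_models as tfm

# Schedule to follow once warmup finishes (illustrative choice).
post_warmup = tf.keras.optimizers.schedules.CosineDecay(
    initial_learning_rate=0.1, decay_steps=10_000)

# Ramp linearly from 1e-5 toward the post-warmup schedule's rate over
# 1,000 steps, then hand control over to the post-warmup schedule.
learning_rate = tfm.optimization.LinearWarmup(
    after_warmup_lr_sched=post_warmup,
    warmup_steps=1_000,
    warmup_learning_rate=1e-5)

optimizer = tf.keras.optimizers.SGD(learning_rate=learning_rate)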

Methods

from_config

@classmethod
from_config(
 config
)

Instantiates a LearningRateSchedule from its config.

Args
config Output of get_config().

Returns
A LearningRateSchedule instance.

get_config


get_config() -> Mapping[str, Any]
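
A sketch of the get_config / from_config round trip. It uses a constant post-warmup rate so the returned config holds only plain values; the example and its values are assumptions, not from this page:

import tensorflow_models as tfm

schedule = tfm.optimization.LinearWarmup(
    after_warmup_lr_sched=0.05,
    warmup_steps=500,
    warmup_learning_rate=1e-6)

# get_config returns the constructor arguments as a mapping ...
config = schedule.get_config()

# ... and from_config rebuilds an equivalent schedule from them.
restored = tfm.optimization.LinearWarmup.from_config(config)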

__call__


__call__(
 step: int
)

Returns the learning rate for the given training step.
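
A short sketch of querying the schedule directly. The values are illustrative; a float after_warmup_lr_sched is treated as a constant post-warmup rate:

import tensorflow_models as tfm

schedule = tfm.optimization.LinearWarmup(
    after_warmup_lr_sched=0.1,   # constant rate after warmup
    warmup_steps=100,
    warmup_learning_rate=0.0)

# During warmup the returned rate ramps linearly toward 0.1;
# from step 100 onward it stays at the post-warmup value.
for step in (0, 50, 100, 200):
    print(step, float(schedule(step)))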
