tfm.optimization.StepCosineDecayWithOffset


Stepwise cosine learning rate decay with offset.

Main aliases

tfm.optimization.lr_schedule.StepCosineDecayWithOffset

tfm.optimization.StepCosineDecayWithOffset(
 boundaries,
 values,
 offset: int = 0,
 name: str = 'StepCosineDecayWithOffset'
)

The learning rate follows one cosine decay per interval, each starting and ending at that interval's boundaries.

Example:

 boundaries = [100000, 110000]
 values = [1.0, 0.5]
 lr_decayed_fn = lr_schedule.StepCosineDecayWithOffset(
     boundaries, values)

From step 0 to step 100000, the learning rate cosine-decays from 1.0 to 0.5; from step 100000 to step 110000, it cosine-decays from 0.5 to 0.0.
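The piecewise behavior above can be sketched in pure Python. This is an illustrative reimplementation of the described schedule, not the tfm source: within interval i (ending at boundaries[i]), the rate cosine-decays from values[i] to the next value, and to 0.0 in the final interval.

```python
import math

def step_cosine_decay(step, boundaries, values, offset=0):
    """Illustrative sketch of stepwise cosine decay with offset.

    Within interval i (ending at boundaries[i]), the rate
    cosine-decays from values[i] to values[i + 1], or to 0.0
    for the last interval. Not the tfm implementation.
    """
    step = max(step - offset, 0)  # shift the step count by the offset
    start = 0
    for i, end in enumerate(boundaries):
        target = values[i + 1] if i + 1 < len(values) else 0.0
        if step < end:
            frac = (step - start) / (end - start)
            # Standard cosine interpolation from values[i] down to target.
            cosine = 0.5 * (1.0 + math.cos(math.pi * frac))
            return target + (values[i] - target) * cosine
        start = end
    return 0.0  # past the final boundary

boundaries = [100000, 110000]
values = [1.0, 0.5]
print(step_cosine_decay(0, boundaries, values))       # 1.0
print(step_cosine_decay(100000, boundaries, values))  # 0.5
print(step_cosine_decay(110000, boundaries, values))  # 0.0
```

At the midpoint of the first interval (step 50000) the rate is 0.75, halfway through the cosine curve between 1.0 and 0.5.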

Args

boundaries A list of Tensors or ints with strictly increasing entries, and with all elements having the same type as the optimizer step.
values A list of Tensors or floats that specifies the values for the intervals defined by boundaries. It should have one more element than boundaries, and all elements should have the same type.
offset The offset subtracted from the global step before computing the decay.
name Optional name of the learning rate schedule.

Methods

from_config

@classmethod
from_config(
 config
)

Instantiates a LearningRateSchedule from its config.

Args
config Output of get_config().

Returns
A LearningRateSchedule instance.

get_config


get_config()

Returns the config of the learning rate schedule.
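get_config and from_config follow the standard Keras LearningRateSchedule serialization protocol: get_config returns a dict of constructor arguments, and from_config rebuilds the schedule from that dict. A minimal pure-Python stand-in (a sketch, not the tfm class) illustrating the round-trip:

```python
class StepCosineDecaySketch:
    """Minimal stand-in showing the get_config/from_config round-trip.

    Field names mirror the constructor arguments documented above;
    this is not the tfm implementation.
    """

    def __init__(self, boundaries, values, offset=0,
                 name="StepCosineDecayWithOffset"):
        self.boundaries = boundaries
        self.values = values
        self.offset = offset
        self.name = name

    def get_config(self):
        # Serialize the constructor arguments.
        return {"boundaries": self.boundaries, "values": self.values,
                "offset": self.offset, "name": self.name}

    @classmethod
    def from_config(cls, config):
        # Mirrors LearningRateSchedule.from_config: cls(**config).
        return cls(**config)

schedule = StepCosineDecaySketch([100000, 110000], [1.0, 0.5], offset=10)
clone = StepCosineDecaySketch.from_config(schedule.get_config())
print(clone.boundaries, clone.offset)  # [100000, 110000] 10
```

This round-trip is what allows a schedule to survive model checkpointing and re-instantiation.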

__call__


__call__(
 global_step
)

Computes the learning rate for the given global_step.

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.

Last updated 2024年02月02日 UTC.