tfm.optimization.StepCosineDecayWithOffset
Stepwise cosine learning rate decay with offset.
tfm.optimization.StepCosineDecayWithOffset(
boundaries,
values,
offset: int = 0,
name: str = 'StepCosineDecayWithOffset'
)
The learning rate is equivalent to one or more cosine decays, each starting and ending at an interval boundary.
Example:
boundaries = [100000, 110000]
values = [1.0, 0.5]
lr_decayed_fn = (
    lr_schedule.StepCosineDecayWithOffset(
        boundaries,
        values))

From step 0 to step 100000, the learning rate cosine-decays from 1.0 to 0.5; from step 100000 to step 110000, it cosine-decays from 0.5 to 0.0.
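As a fuller sketch, the schedule can be constructed through the `tfm.optimization` namespace and passed directly to a Keras optimizer as its learning rate (a minimal example, assuming the `tensorflow-models-official` package is installed; the boundary and value numbers are illustrative only):

```python
import tensorflow as tf
import tensorflow_models as tfm

# Illustrative boundaries and values, mirroring the example above.
boundaries = [100000, 110000]
values = [1.0, 0.5]

lr_decayed_fn = tfm.optimization.StepCosineDecayWithOffset(
    boundaries=boundaries,
    values=values)

# A LearningRateSchedule can be passed directly as an optimizer's
# learning rate; the optimizer evaluates it at every step.
optimizer = tf.keras.optimizers.SGD(learning_rate=lr_decayed_fn)
```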
| Args | |
|---|---|
| `boundaries` | A list of `Tensor`s or ints with strictly increasing entries, and with all elements having the same type as the optimizer step. |
| `values` | A list of `Tensor`s or floats that specifies the values for the intervals defined by `boundaries`. It should have one more element than `boundaries`, and all elements should have the same type. |
| `offset` | The offset when computing the decay. |
| `name` | Optional name of the learning rate schedule. |
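As a hedged illustration of `offset`, a common use is to start the decay after an initial warmup phase; this sketch assumes (based on the argument description, not confirmed by the source) that the decay intervals are computed relative to `step - offset` rather than the raw step:

```python
import tensorflow_models as tfm

warmup_steps = 5000  # hypothetical warmup length, for illustration only

# Assumption: with offset=warmup_steps, the cosine intervals are computed
# against (step - warmup_steps) instead of the raw optimizer step.
decayed_fn = tfm.optimization.StepCosineDecayWithOffset(
    boundaries=[100000, 110000],
    values=[1.0, 0.5],
    offset=warmup_steps)
```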
Methods
from_config
@classmethod
from_config(
    config
)
Instantiates a LearningRateSchedule from its config.
| Args | |
|---|---|
| `config` | Output of `get_config()`. |

| Returns | |
|---|---|
| A `LearningRateSchedule` instance. |
get_config
get_config()
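Because this class is a `tf.keras.optimizers.schedules.LearningRateSchedule`, `get_config` and `from_config` follow the usual Keras serialization round trip; a minimal sketch (numbers are illustrative):

```python
import tensorflow_models as tfm

schedule = tfm.optimization.StepCosineDecayWithOffset(
    boundaries=[100000, 110000],
    values=[1.0, 0.5])

# get_config returns a plain Python dict describing the schedule,
# which from_config uses to rebuild an equivalent instance.
config = schedule.get_config()
restored = tfm.optimization.StepCosineDecayWithOffset.from_config(config)
```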
__call__
__call__(
global_step
)
Call self as a function.
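The schedule object is callable: passing a global step returns the learning rate at that step. A minimal sketch for inspecting the decay curve (step values are illustrative):

```python
import tensorflow as tf
import tensorflow_models as tfm

schedule = tfm.optimization.StepCosineDecayWithOffset(
    boundaries=[100000, 110000],
    values=[1.0, 0.5])

# Evaluate the schedule at a few global steps to inspect the decay.
for step in [0, 50000, 100000, 105000, 110000]:
    print(step, float(schedule(tf.constant(step))))
```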