tfm.optimization.ExponentialDecayWithOffset

A LearningRateSchedule that uses an exponential decay schedule.

Inherits From: base_lr_class

Main aliases

tfm.optimization.lr_schedule.ExponentialDecayWithOffset

tfm.optimization.ExponentialDecayWithOffset(
 offset=0, **kwargs
)

When training a model, it is often useful to lower the learning rate as the training progresses. This schedule applies an exponential decay function to an optimizer step, given a provided initial learning rate.

The schedule is a 1-arg callable that produces a decayed learning rate when passed the current optimizer step. This can be useful for changing the learning rate value across different invocations of optimizer functions. It is computed as:

def decayed_learning_rate(step):
  return initial_learning_rate * decay_rate ** (step / decay_steps)

If the argument staircase is True, then step / decay_steps is an integer division and the decayed learning rate follows a staircase function.
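
The offset argument shifts when the decay begins. A minimal sketch of the combined computation, assuming the offset is simply subtracted from the optimizer step before the exponential decay is applied (parameter values are illustrative only):

def decayed_learning_rate_with_offset(step,
                                      initial_learning_rate=0.1,
                                      decay_steps=100000,
                                      decay_rate=0.96,
                                      offset=0,
                                      staircase=False):
  # Assumption: the offset is subtracted from the step before the decay.
  effective_step = step - offset
  exponent = effective_step / decay_steps
  if staircase:
    # Integer division gives the piecewise-constant staircase behaviour.
    exponent = effective_step // decay_steps
  return initial_learning_rate * decay_rate ** exponent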

You can pass this schedule directly into a tf.keras.optimizers.Optimizer as the learning rate. Example: When fitting a Keras model, decay every 100000 steps with a base of 0.96:

import tensorflow as tf

initial_learning_rate = 0.1
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate,
    decay_steps=100000,
    decay_rate=0.96,
    staircase=True)
# model, data and labels are assumed to be defined elsewhere.
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=lr_schedule),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(data, labels, epochs=5)
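
The example above uses the plain Keras ExponentialDecay. A minimal sketch of the same setup with this class, assuming the tf-models-official package is imported as tensorflow_models and that the remaining keyword arguments are forwarded to the underlying ExponentialDecay schedule:

import tensorflow as tf
import tensorflow_models as tfm

lr_schedule = tfm.optimization.ExponentialDecayWithOffset(
    offset=10000,               # step at which the decay starts counting
    initial_learning_rate=0.1,  # forwarded to the base ExponentialDecay
    decay_steps=100000,
    decay_rate=0.96,
    staircase=True)
optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule)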

The learning rate schedule is also serializable and deserializable using tf.keras.optimizers.schedules.serialize and tf.keras.optimizers.schedules.deserialize.
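
For example, a round trip with the plain Keras schedule from the example above (restoring this subclass from a serialized config would typically also require registering it as a custom object):

import tensorflow as tf

schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1, decay_steps=100000, decay_rate=0.96)
config = tf.keras.optimizers.schedules.serialize(schedule)
restored = tf.keras.optimizers.schedules.deserialize(config)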

Returns

A 1-arg callable learning rate schedule that takes the current optimizer step and outputs the decayed learning rate, a scalar Tensor of the same type as initial_learning_rate.

Child Classes

class base_lr_class

Methods

from_config

@classmethod
from_config(
 config
)

Instantiates a LearningRateSchedule from its config.

Args
config: Output of get_config().

Returns
A LearningRateSchedule instance.

get_config

get_config()
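
As an illustration of how get_config and from_config relate, a minimal sketch using the plain Keras ExponentialDecay base schedule:

import tensorflow as tf

schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1, decay_steps=100000, decay_rate=0.96)
config = schedule.get_config()  # plain Python dict of constructor arguments
restored = tf.keras.optimizers.schedules.ExponentialDecay.from_config(config)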

__call__

__call__(
 step
)
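
The schedule instance is the 1-arg callable described above: calling it with the current optimizer step returns the decayed learning rate. A minimal sketch, assuming the offset is subtracted from the step before the decay is applied (values are illustrative):

import tensorflow_models as tfm

lr_schedule = tfm.optimization.ExponentialDecayWithOffset(
    offset=10000,
    initial_learning_rate=0.1,
    decay_steps=100000,
    decay_rate=0.96)

print(lr_schedule(10000))   # at the offset: decay has not started, ~0.1
print(lr_schedule(110000))  # 100000 steps past the offset: ~0.1 * 0.96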
