Warning: This project is deprecated. TensorFlow Addons has stopped development; the project will only provide minimal maintenance releases until May 2024. See the full announcement here or on GitHub.

Module: tfa.optimizers


Additional optimizers that conform to Keras API.

Classes

class AdaBelief: Variant of the Adam optimizer.

class AdamW: Optimizer that implements the Adam algorithm with weight decay.
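
For example, a minimal sketch (the tiny Dense model is illustrative; the weight_decay argument comes first in the AdamW constructor):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Decoupled weight decay: the decay is applied directly to the
# weights each step instead of being folded into the loss gradient.
optimizer = tfa.optimizers.AdamW(weight_decay=1e-4, learning_rate=1e-3)

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer=optimizer, loss="mse")
```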

class AveragedOptimizerWrapper: Base class for optimizers that maintain an average of the model variables (the basis of MovingAverage and SWA).

class COCOB: Optimizer that implements the COCOB Backprop algorithm.
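
COCOB is notable for needing no learning rate; a minimal sketch using the constructor defaults:

```python
import tensorflow_addons as tfa

# COCOB derives its own per-coordinate step sizes from a
# coin-betting scheme, so no learning rate is supplied.
optimizer = tfa.optimizers.COCOB()
```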

class ConditionalGradient: Optimizer that implements the Conditional Gradient algorithm.

class CyclicalLearningRate: A LearningRateSchedule that uses a cyclical schedule defined by a user-supplied scaling function.
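
A minimal sketch of the base class; the scale_fn lambda here halves the amplitude every cycle, reproducing the triangular2 policy:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# The learning rate oscillates between the two bounds over
# 2 * step_size optimizer steps; scale_fn rescales the amplitude.
clr = tfa.optimizers.CyclicalLearningRate(
    initial_learning_rate=1e-4,
    maximal_learning_rate=1e-2,
    step_size=2000,
    scale_fn=lambda x: 1 / (2.0 ** (x - 1)),
    scale_mode="cycle",
)
optimizer = tf.keras.optimizers.SGD(learning_rate=clr)
```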

class DecoupledWeightDecayExtension: This class allows extending optimizers with decoupled weight decay.

class ExponentialCyclicalLearningRate: A LearningRateSchedule that uses a cyclical schedule whose amplitude decays exponentially.

class LAMB: Optimizer that implements the Layer-wise Adaptive Moments (LAMB) algorithm.
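
A sketch of typical large-batch usage; the excluded variable-name patterns are illustrative conventions, not requirements:

```python
import tensorflow_addons as tfa

# LAMB rescales each layer's update by a per-layer trust ratio,
# which keeps very large batch sizes stable. Bias and normalization
# variables are conventionally excluded from weight decay.
optimizer = tfa.optimizers.LAMB(
    learning_rate=1e-3,
    weight_decay_rate=0.01,
    exclude_from_weight_decay=["bias", "LayerNorm"],
)
```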

class LazyAdam: Variant of the Adam optimizer that handles sparse updates more efficiently.
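
LazyAdam only touches the moment estimates of rows that actually receive gradients, so it mainly pays off with embedding lookups; a sketch (layer sizes are illustrative):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Embedding lookups yield sparse gradients; LazyAdam skips the
# moment updates for rows that were not used in the batch.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=10000, output_dim=64),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=tfa.optimizers.LazyAdam(1e-3), loss="mse")
```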

class Lookahead: This class allows extending optimizers with the lookahead mechanism.
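
The wrapper accepts any Keras optimizer as the inner (fast) optimizer; a minimal sketch using the constructor defaults:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Every sync_period steps the slow weights move slow_step_size of
# the way toward the fast weights, and the fast weights are reset.
optimizer = tfa.optimizers.Lookahead(
    tf.keras.optimizers.Adam(1e-3), sync_period=6, slow_step_size=0.5)
```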

class MovingAverage: Optimizer that computes a moving average of the variables.
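
A sketch: train with the wrapper, then copy the averaged values back into the model with assign_average_vars before evaluating or saving (the toy data and model are illustrative):

```python
import numpy as np
import tensorflow as tf
import tensorflow_addons as tfa

optimizer = tfa.optimizers.MovingAverage(
    tf.keras.optimizers.SGD(1e-2), average_decay=0.99)

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer=optimizer, loss="mse")
model.fit(np.random.rand(32, 4), np.random.rand(32, 1), verbose=0)

# Replace the raw weights with their exponential moving averages.
optimizer.assign_average_vars(model.variables)
```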

class MultiOptimizer: Multi-optimizer wrapper for discriminative layer training.
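
A sketch of discriminative layer training: each (optimizer, layer) pair routes that layer's gradients to its own optimizer (the two-layer model and learning rates are illustrative):

```python
import tensorflow as tf
import tensorflow_addons as tfa

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, input_shape=(8,)),  # e.g. pretrained body
    tf.keras.layers.Dense(1),                     # e.g. fresh head
])

# Slow learning rate for the body, fast for the head.
optimizers_and_layers = [
    (tf.keras.optimizers.Adam(1e-4), model.layers[0]),
    (tf.keras.optimizers.Adam(1e-2), model.layers[1]),
]
model.compile(
    optimizer=tfa.optimizers.MultiOptimizer(optimizers_and_layers),
    loss="mse")
```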

class NovoGrad: Optimizer that implements NovoGrad.

class ProximalAdagrad: Optimizer that implements the Proximal Adagrad algorithm.

class RectifiedAdam: Variant of the Adam optimizer whose adaptive learning rate is rectified so as to have a consistent variance.
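
RectifiedAdam also bundles an optional warmup; a sketch (the step counts and rates are illustrative):

```python
import tensorflow_addons as tfa

# With total_steps > 0, the learning rate warms up over the first
# warmup_proportion of training, then decays toward min_lr.
optimizer = tfa.optimizers.RectifiedAdam(
    learning_rate=1e-3,
    total_steps=10000,
    warmup_proportion=0.1,
    min_lr=1e-5,
)
```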

class SGDW: Optimizer that implements the Momentum algorithm with decoupled weight decay.
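
As with AdamW, the decay coefficient is the first constructor argument; a minimal sketch:

```python
import tensorflow_addons as tfa

# Decoupled decay: weights shrink by weight_decay each step,
# independently of the momentum update.
optimizer = tfa.optimizers.SGDW(
    weight_decay=1e-4, learning_rate=1e-2, momentum=0.9)
```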

class SWA: This class extends optimizers with Stochastic Weight Averaging (SWA).
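
A sketch: snapshots of the weights are averaged every average_period steps once start_averaging is reached, and assign_average_vars copies the result back, as with MovingAverage (toy data and model are illustrative):

```python
import numpy as np
import tensorflow as tf
import tensorflow_addons as tfa

# Average the weights every 10 steps, starting immediately.
optimizer = tfa.optimizers.SWA(
    tf.keras.optimizers.SGD(1e-2), start_averaging=0, average_period=10)

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer=optimizer, loss="mse")
model.fit(np.random.rand(32, 4), np.random.rand(32, 1), verbose=0)

# Load the averaged weights into the model before evaluation.
optimizer.assign_average_vars(model.variables)
```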

class Triangular2CyclicalLearningRate: A LearningRateSchedule that uses a triangular cyclical schedule whose amplitude is halved each cycle.

class TriangularCyclicalLearningRate: A LearningRateSchedule that uses a triangular cyclical schedule.
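
The triangular variants fix scale_fn for you; a sketch of the plain triangular policy:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Linearly ramps from the initial to the maximal learning rate and
# back, repeating every 2 * step_size optimizer steps.
clr = tfa.optimizers.TriangularCyclicalLearningRate(
    initial_learning_rate=1e-4,
    maximal_learning_rate=1e-2,
    step_size=2000,
)
optimizer = tf.keras.optimizers.SGD(learning_rate=clr)
```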

class Yogi: Optimizer that implements the Yogi algorithm in Keras.

Functions

extend_with_decoupled_weight_decay(...): Factory function returning an optimizer class with decoupled weight decay.
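
A sketch of the factory pattern; this is the same mechanism that produces AdamW and SGDW:

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Build a weight-decaying variant of any Keras optimizer class;
# the returned class gains a leading weight_decay argument.
MyAdamW = tfa.optimizers.extend_with_decoupled_weight_decay(
    tf.keras.optimizers.Adam)
optimizer = MyAdamW(weight_decay=1e-3, learning_rate=1e-3)
```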


Last updated 2023-05-25 UTC.