
Add neural network optimizers module to enhance training capabilities #13662

Open
Labels
enhancement (This PR modified some existing files)
@Adhithya-Laxman

Description

Feature description

The current machine_learning directory in TheAlgorithms/Python lacks implementations of neural network optimizers, which are fundamental to training deep learning models effectively. To fill this gap with educational, reference-quality implementations, I propose creating a new module, neural_network/optimizers, and adding the following optimizers in sequence:

  • Stochastic Gradient Descent (SGD)
  • Momentum SGD
  • Nesterov Accelerated Gradient (NAG)
  • Adagrad
  • Adam
  • Muon (a recent optimizer built on Newton-Schulz orthogonalization; a sketch of that step appears below)

This order introduces the optimizers in increasing order of complexity and practical adoption, which makes incremental contributions and review easier. Each optimizer will ship with well-documented code, clear usage examples, type hints, and comprehensive doctests or unit tests (the SGD sketch below illustrates the intended style).
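As a rough illustration of that style rather than a final API, a minimal SGD entry might look like the sketch below; the class name `SGD` and the `update` signature are placeholders chosen for this example.

```python
import numpy as np


class SGD:
    """
    Plain stochastic gradient descent: params -= learning_rate * gradients.

    >>> optimizer = SGD(learning_rate=0.25)
    >>> optimizer.update(np.array([1.0, 2.0]), np.array([4.0, -8.0])).tolist()
    [0.0, 4.0]
    """

    def __init__(self, learning_rate: float = 0.01) -> None:
        self.learning_rate = learning_rate

    def update(self, params: np.ndarray, gradients: np.ndarray) -> np.ndarray:
        # One SGD step: move the parameters against the gradient direction.
        return params - self.learning_rate * gradients
```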

This multi-step approach ensures maintainable growth of the module and benefits learners by covering the optimizers most commonly used in practice.
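For Muon, the core step is orthogonalizing the (momentum-smoothed) gradient matrix. The sketch below uses the classic cubic Newton-Schulz iteration purely as an illustration; Muon's reference implementation uses a tuned quintic polynomial, so the function name, constants, and step count here are assumptions for this example, not the proposed code.

```python
import numpy as np


def newton_schulz_orthogonalize(grad: np.ndarray, steps: int = 10) -> np.ndarray:
    """
    Approximate the orthogonal factor U @ V.T of ``grad``'s SVD using the
    cubic Newton-Schulz iteration X <- 1.5 * X - 0.5 * X @ X.T @ X.

    >>> g = np.array([[3.0, 0.0], [0.0, 0.5]])
    >>> np.allclose(newton_schulz_orthogonalize(g, steps=30), np.eye(2), atol=1e-3)
    True
    """
    # Scale so all singular values are <= 1; the iteration needs this to converge.
    x = grad / (np.linalg.norm(grad) + 1e-12)
    for _ in range(steps):
        # Each step pushes every singular value of x toward 1.
        x = 1.5 * x - 0.5 * x @ x.T @ x
    return x
```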

Feedback and suggestions on this plan are welcome before implementation begins.
