Add optimizers from PySwarms #639


Open

spline2hg wants to merge 49 commits into optimagic-dev:main from spline2hg:pyswarms-optimizer (base: main)

Conversation

spline2hg (Collaborator) commented Aug 27, 2025

This PR introduces the following optimizers from PySwarms:

  • pyswarms_global_best
  • pyswarms_local_best
  • pyswarms_general

spline2hg added 22 commits August 22, 2025 20:09

codecov bot commented Aug 27, 2025
Codecov Report

❌ Patch coverage is 98.59155% with 4 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/optimagic/optimizers/pyswarms_optimizers.py | 97.84% | 4 Missing ⚠️ |

| Files with missing lines | Coverage Δ |
|---|---|
| src/optimagic/algorithms.py | 88.51% <100.00%> (+0.33%) ⬆️ |
| src/optimagic/config.py | 100.00% <100.00%> (ø) |
| src/optimagic/optimization/algo_options.py | 100.00% <ø> (ø) |
| src/optimagic/optimizers/pyswarms_optimizers.py | 97.84% <97.84%> (ø) |

spline2hg (Collaborator, Author) commented

@janosg could you review this?

A few points:

  1. Could you check the failing run on Python 3.11? I’m not sure what’s causing the issue.
  2. PySwarms has stochastic algorithms but doesn’t expose a seed parameter. Because of that, they were being treated as deterministic in the tests. I’ve added a seed parameter with a random number generator for it to be treated as stochastic.
  3. The default value CONVERGENCE_FTOL_REL = 2e-9 is causing the algorithm to converge earlier than expected. How should we handle this?
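For point 2, the idea of seeding only the initial particle positions can be sketched as follows. This is a hypothetical helper, not the actual PR code; PySwarms optimizers can accept precomputed start positions through their `init_pos` argument:

```python
import numpy as np

def seeded_init_positions(seed, n_particles, lower, upper):
    """Draw reproducible initial particle positions inside the bounds.

    Illustrative sketch of the seed option discussed above (not the PR's
    actual implementation): a seeded Generator produces the start
    positions, which could then be handed to PySwarms via init_pos.
    """
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    rng = np.random.default_rng(seed)
    return rng.uniform(lower, upper, size=(n_particles, lower.size))

pos_a = seeded_init_positions(42, n_particles=5, lower=[-1, -1], upper=[1, 1])
pos_b = seeded_init_positions(42, n_particles=5, lower=[-1, -1], upper=[1, 1])
assert np.array_equal(pos_a, pos_b)  # same seed -> same start positions
```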

janosg (Member) commented Aug 30, 2025

> @janosg could you review this?
>
> A few points:
>
> 1. Could you check the failing run on Python 3.11? I’m not sure what’s causing the issue.

I also don't see the reason, but it seems completely unrelated to your PR, so you can ignore it for now.

> 2. PySwarms has stochastic algorithms but doesn’t expose a seed parameter. Because of that, they were being treated as deterministic in the tests. I’ve added a seed parameter with a random number generator for it to be treated as stochastic.

If I see correctly, you only use the rng for initial positions. Is this the only stochastic part of the algorithm, or are there other stochastic parts that make the optimization non-deterministic even if a seed is set?

> 3. The default value CONVERGENCE_FTOL_REL = 2e-9 is causing the algorithm to converge earlier than expected. How should we handle this?

You can deviate from this default if there are good reasons. Especially for a global optimizer I don't think it is problematic to set this to 0 because global optimizers are usually expected to run until maxiter or maxfun are reached. Does setting convergence_ftol_rel=0 fix the problem?
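The relative ftol criterion under discussion can be sketched generically like this (an illustrative formulation, not optimagic's exact code): stop when the improvement between iterations is small relative to the current function value.

```python
def ftol_rel_converged(f_old, f_new, ftol_rel):
    """Generic relative function-tolerance check (illustrative sketch,
    not optimagic's actual implementation)."""
    return abs(f_new - f_old) <= ftol_rel * abs(f_old)

# With a tight tolerance like 2e-9, tiny improvements already trigger
# convergence; with ftol_rel=0 the check never fires while the function
# value is still changing.
assert ftol_rel_converged(1.0, 1.0 + 1e-10, ftol_rel=2e-9)     # stops early
assert not ftol_rel_converged(1.0, 1.0 + 1e-10, ftol_rel=0.0)  # keeps going
```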

janosg (Member) left a comment


I don't have time to completely finish the review but this should give you some pointers to continue working on the PR.

spline2hg (Collaborator, Author) commented

> If I see correctly, you only use the rng for initial positions. Is this the only stochastic part of the algorithm, or are there other stochastic parts that make the optimization non-deterministic even if a seed is set?

Stochastic parts in PySwarms:

  1. Random topology and random bounds handling strategy.
  2. Random components (r1, r2) in the velocity update equation, making the movement inherently stochastic.
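The r1, r2 components mentioned in point 2 appear in the standard PSO velocity update; a minimal NumPy sketch of that update (illustrative, not PySwarms' actual code):

```python
import numpy as np

def pso_velocity_update(v, x, pbest, gbest, rng, w=0.9, c1=0.5, c2=0.3):
    """One standard global-best PSO velocity update (illustrative sketch).

    r1 and r2 are drawn fresh on every call, which is what makes particle
    movement stochastic even when the start positions are fixed.
    """
    r1 = rng.uniform(size=x.shape)
    r2 = rng.uniform(size=x.shape)
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)

rng = np.random.default_rng(0)
x = np.zeros((4, 2))      # particle positions
v = np.zeros((4, 2))      # particle velocities
pbest = np.ones((4, 2))   # per-particle best positions
gbest = np.ones(2)        # swarm-wide best position
v1 = pso_velocity_update(v, x, pbest, gbest, rng)
v2 = pso_velocity_update(v, x, pbest, gbest, rng)
assert not np.array_equal(v1, v2)  # fresh r1, r2 draws change the update
```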

> You can deviate from this default if there are good reasons. Especially for a global optimizer I don't think it is problematic to set this to 0 because global optimizers are usually expected to run until maxiter or maxfun are reached. Does setting convergence_ftol_rel=0 fix the problem?

If we set convergence_ftol_rel to 0, it disables early convergence and forces the algorithm to run for STOPPING_MAXITER iterations by default if stopping_maxiter is not set. However, running the optimization for STOPPING_MAXITER (i.e. 1_000_000) iterations takes a long time.

janosg (Member) commented Sep 3, 2025

> Stochastic parts in PySwarms:
>
> 1. Random topology and random bounds handling strategy.
> 2. Random components (r1, r2) in the velocity update equation, making the movement inherently stochastic.

Then we need a very big warning that the seed does not make the algorithm deterministic. Would setting a global numpy seed help, i.e. np.random.seed(123)? If so, the warning should explain this workaround, but we would not want to set a global seed in optimagic.
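The workaround suggested here relies on PySwarms drawing from numpy's legacy global random state, as the discussion implies. A small sketch of the reproducibility this gives, using plain global-state draws as a stand-in for PySwarms' internals:

```python
import numpy as np

def noisy_draws():
    # Stand-in for the legacy np.random.* calls an optimizer might make
    # internally (illustrative, not PySwarms' actual code).
    return np.random.uniform(size=5)

np.random.seed(123)   # global seed, set by the user, not by optimagic
run_a = noisy_draws()

np.random.seed(123)   # re-seeding replays the exact same random stream
run_b = noisy_draws()

assert np.array_equal(run_a, run_b)  # same global seed -> identical draws
```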

> You can deviate from this default if there are good reasons. Especially for a global optimizer I don't think it is problematic to set this to 0 because global optimizers are usually expected to run until maxiter or maxfun are reached. Does setting convergence_ftol_rel=0 fix the problem?
>
> If we set convergence_ftol_rel to 0, it disables early convergence and forces the algorithm to run for STOPPING_MAXITER iterations by default if stopping_maxiter is not set. However, running the optimization for STOPPING_MAXITER (i.e. 1_000_000) iterations takes a long time.

As I said, you can deviate from defaults where necessary. Of course, we would not leave maxiter at a million if this is a bad default. Maybe 1000 would be a good idea? We just can't have a variable called "STOPPING_MAXITER_GLOBAL" (analogous to "STOPPING_MAXFUN_GLOBAL") in the algo options because what an iteration is changes between optimizers.

janosg (Member) commented Sep 3, 2025

The failing test should be fixed after you update the branch.

spline2hg (Collaborator, Author) commented Sep 3, 2025

> Then we need a very big warning that the seed does not make the algorithm deterministic. Would setting a global numpy seed help, i.e. np.random.seed(123)? If so, the warning should explain this workaround, but we would not want to set a global seed in optimagic.

I meant that there are stochastic parts to it, but setting the global seed does make it deterministic. Yes, setting a global seed works. I will add a warning if a seed is set.

> As I said, you can deviate from defaults where necessary. Of course, we would not leave maxiter at a million if this is a bad default. Maybe 1000 would be a good idea? We just can't have a variable called "STOPPING_MAXITER_GLOBAL" (analogous to "STOPPING_MAXFUN_GLOBAL") in the algo options because what an iteration is changes between optimizers.

Yes, 1000 is a reasonable number for PySwarms. I had introduced STOPPING_MAXITER_GLOBAL earlier for this, but I will remove it and just default to 1000 iterations.

janosg (Member) left a comment

I'm super sorry for the late review. This looks great and we can merge.

spline2hg (Collaborator, Author) commented

Hi, no worries. It's been some time; give me some time to check again whether everything's good to merge.

janosg (Member) commented Oct 7, 2025

Sure! You can click on merge after checking!



Reviewers

janosg approved these changes
