
Support Python 3.14 #2079

Open
dpanici wants to merge 19 commits into master from dp/py314

Conversation

@dpanici
Collaborator

@dpanici dpanici commented Feb 4, 2026

Resolves #2070

Nothing major changed. The main things are:

  • Bumps the upper bounds for scipy to 1.17.0, matplotlib to 3.10.8, and numpy to 2.4.2, as those are the newest versions, which also have Python 3.14 support (though note that pip never complained when I installed with an upper bound for numpy that did not support 3.14, which was annoying...)
    • When I installed cleanly with these bounds, numpy got set to 2.3.5, so that seems to be the cap based on our other dependencies
  • scipy 1.16.1+ deprecates disp for L-BFGS-B, which was unused anyway, so I had to add a check to pop that from options in that case
  • I ran all of our tests on stellar with a fresh env following the CPU installation, and all passed
  • Adds 3.14 into the unit test rotation
  • Moves regression tests to 3.11 to prep for the eventual 3.10 deprecation at the end of the year
  • I have not updated any of the GPU installations on clusters yet, as I don't want to spend the time repeatedly installing and verifying (and I have no GPU locally), so if anyone happens to fresh install on Python 3.14 and can verify, feel free; otherwise I will just leave them as is
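The disp check mentioned above could be sketched as a small helper (hypothetical names; `pop_deprecated_disp` and the plain tuple version parse are illustrative, not the PR's actual code):

```python
def pop_deprecated_disp(options, scipy_version):
    """Drop the ``disp`` option for L-BFGS-B when scipy deprecates it.

    scipy 1.16.1+ deprecates ``disp`` for method="L-BFGS-B", so remove it
    from the options dict before calling ``scipy.optimize.minimize``.
    ``scipy_version`` is a plain version string like "1.17.0".
    """
    opts = dict(options)  # copy so the caller's dict is untouched
    version = tuple(int(part) for part in scipy_version.split(".")[:3])
    if version >= (1, 16, 1):
        opts.pop("disp", None)  # unused by DESC anyway
    return opts
```

In practice one would pass `scipy.__version__` and parse it with `packaging.version.Version` to handle pre-release strings; the tuple comparison here is a simplification.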

However, matplotlib does not explicitly claim Python 3.14 support until 3.10.5, and pip will happily install older versions on 3.14 anyway, which is what breaks the plot tests in CI (see the discussion below).

So a possible way forward, I guess, is to:

  • Regenerate our test images with matplotlib 3.10.8
  • Use matplotlib 3.10.8 in our CI
  • Accept that someone out there who installs DESC on Python 3.14 with an old matplotlib version will see broken plots, and hope they will understand that upgrading to a newer matplotlib fixes them

@dpanici dpanici requested review from a team, YigitElma, ddudt, f0uriest, rahulgaur104 and unalmis and removed request for a team February 4, 2026 20:39
@github-actions
Contributor

github-actions bot commented Feb 4, 2026

Memory benchmark result

| Test Name                            | %Δ      | Master (MB) | PR (MB)   | Δ (MB)  | Time PR (s) | Time Master (s) |
| ------------------------------------ | ------- | ----------- | --------- | ------- | ----------- | --------------- |
| test_objective_jac_w7x               | 1.76 %  | 3.813e+03   | 3.880e+03 | 67.07   | 39.76       | 36.95           |
| test_proximal_jac_w7x_with_eq_update | 2.59 %  | 6.533e+03   | 6.703e+03 | 169.32  | 162.39      | 163.91          |
| test_proximal_freeb_jac              | 0.17 %  | 1.323e+04   | 1.325e+04 | 22.92   | 85.71       | 85.93           |
| test_proximal_freeb_jac_blocked      | -0.70 % | 7.553e+03   | 7.500e+03 | -53.19  | 74.27       | 73.87           |
| test_proximal_freeb_jac_batched      | 0.54 %  | 7.469e+03   | 7.510e+03 | 40.64   | 73.58       | 74.08           |
| test_proximal_jac_ripple             | -1.80 % | 3.535e+03   | 3.471e+03 | -63.73  | 66.67       | 66.98           |
| test_proximal_jac_ripple_bounce1d    | 2.59 %  | 3.540e+03   | 3.632e+03 | 91.80   | 78.32       | 77.92           |
| test_eq_solve                        | -1.28 % | 2.041e+03   | 2.015e+03 | -26.15  | 95.72       | 94.62           |

For the memory plots, go to the summary of Memory Benchmarks workflow and download the artifact.

@dpanici
Collaborator Author

dpanici commented Feb 4, 2026

The diff between my pip list on stellar (where the failing test passes) and the one in the failing action is:

matplotlib    3.10.8    |    matplotlib    3.9.2
pip           25.3      |    pip           26.0

Matplotlib does not explicitly say it supports Python 3.14 until 3.10.5, and installing 3.9.2 on stellar and running the test again confirms it. Wonderful, this is not ideal.

pip apparently does not prevent 3.9.2 from installing on Python 3.14 despite there being issues, so how are we supposed to deal with this with simple requirements.txt files?
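One way this can be handled with plain requirements files is a PEP 508 environment marker, so Python 3.14 installs are forced onto a matplotlib that supports it (the version numbers below are illustrative, not the bounds this PR settles on):

```
matplotlib >= 3.7.0, <= 3.10.8; python_version < "3.14"
matplotlib >= 3.10.5, <= 3.10.8; python_version >= "3.14"
```

pip evaluates the markers at install time, so older Pythons keep the looser lower bound while 3.14 gets a floor that actually supports it.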

@YigitElma
Collaborator

Not sure why failing as it passes on my stellar install. The diff from my pip list on stellar (where the failing test passes) and the one in the action that is failing is:

For testing on CI, we install matplotlib 3.9.2 which doesn't support Python 3.14, I think that is the reason. I am not sure how to proceed about this though.

@dpanici
Collaborator Author

dpanici commented Feb 4, 2026

yea I am not sure either. I tried to google how to deal with this and am just finding no helpful info, but did find an interesting article on version capping lol which argues the opposite of what I feel https://iscinumpy.dev/post/bound-version-constraints/

@YigitElma
Collaborator

interesting article on version capping lol which argues the opposite of what I feel

I feel like the chance of next JAX release breaking something is higher than 75%, so we should be good :)

@dpanici
Collaborator Author

dpanici commented Feb 5, 2026

Update numpy and scipy to latest versions

YigitElma previously approved these changes Feb 5, 2026
@codecov

codecov bot commented Feb 5, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 94.49%. Comparing base (b29e191) to head (b4076d8).
⚠️ Report is 2 commits behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #2079      +/-   ##
==========================================
- Coverage   94.53%   94.49%   -0.04%     
==========================================
  Files         102      102              
  Lines       28785    28787       +2     
==========================================
- Hits        27211    27203       -8     
- Misses       1574     1584      +10     
| Files with missing lines         | Coverage Δ                  |
| -------------------------------- | --------------------------- |
| desc/optimize/_scipy_wrappers.py | 90.36% <100.00%> (-2.05%) ⬇️ |

... and 2 files with indirect coverage changes


@YigitElma
Collaborator

I think the infinite values in master data are due to the grid nodes on the magnetic axis. We compute equilibrium quantities on axis (to make sure the limits are correctly implemented) but some of these quantities are not well-defined and cause the issue. These are the problematic values:

e^theta for desc.equilibrium.equilibrium.Equilibrium is not finite.
e^vartheta for desc.equilibrium.equilibrium.Equilibrium is not finite.
e^theta_PEST for desc.equilibrium.equilibrium.Equilibrium is not finite.
J^theta_PEST for desc.equilibrium.equilibrium.Equilibrium is not finite.
J^vartheta for desc.equilibrium.equilibrium.Equilibrium is not finite.
J^theta for desc.equilibrium.equilibrium.Equilibrium is not finite.
grad(alpha) (periodic) for desc.equilibrium.equilibrium.Equilibrium is not finite.
grad(alpha) for desc.equilibrium.equilibrium.Equilibrium is not finite.
g^aa for desc.equilibrium.equilibrium.Equilibrium is not finite.
g^ra for desc.equilibrium.equilibrium.Equilibrium is not finite.
g^tt for desc.equilibrium.equilibrium.Equilibrium is not finite.
e^theta_r for desc.equilibrium.equilibrium.Equilibrium is not finite.
e^helical for desc.equilibrium.equilibrium.Equilibrium is not finite.
e^theta_rr for desc.equilibrium.equilibrium.Equilibrium is not finite.
|e^helical| for desc.equilibrium.equilibrium.Equilibrium is not finite.
g^rt for desc.equilibrium.equilibrium.Equilibrium is not finite.
g^tt_r for desc.equilibrium.equilibrium.Equilibrium is not finite.
|grad(theta)| for desc.equilibrium.equilibrium.Equilibrium is not finite.
g^rv for desc.equilibrium.equilibrium.Equilibrium is not finite.

@YigitElma
Collaborator

YigitElma commented Feb 12, 2026

  • try a different initial_trust_radius
  • maybe relax the bounds

# since eq grid has nodes on axis, some quantities are infinite
# so just use 1.0 as a scale for the tolerance
print(f"mean(|master data|) of {name} for {p} is not finite.")
mean = 1.0
Collaborator Author

we could also instead do

this_data = np.atleast_1d(np.abs(master_data[p][name]))
finite_data = this_data[np.isfinite(this_data)]
mean = np.nanmean(finite_data)

Maybe nanmean is not necessary here anymore, but this would keep the mean part. Maybe not needed though, if what you have works.

Collaborator

atol=1e-8 * mean + 1e-9 so I guess anything is fine
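Combining the two ideas in this thread (skip non-finite master entries, fall back to 1.0 on axis), the tolerance scaling could look something like this sketch (`scaled_atol` is a hypothetical name, not the test's actual code):

```python
import numpy as np

def scaled_atol(master_vals, rtol=1e-8, floor=1e-9):
    """Scale the absolute tolerance by the mean |master| value,
    ignoring non-finite entries (e.g. on-axis quantities that are
    infinite). Falls back to 1.0 when nothing finite remains."""
    data = np.atleast_1d(np.abs(master_vals))
    finite = data[np.isfinite(data)]
    mean = finite.mean() if finite.size else 1.0
    return rtol * mean + floor
```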

  # target force balance residuals with |F| <= 3e3 N
  obj = ObjectiveFunction(
-     ForceBalance(normalize=False, normalize_target=False, bounds=(-1e3, 1e3), eq=eq)
+     ForceBalance(normalize=False, normalize_target=False, bounds=(-3e3, 3e3), eq=eq)
Collaborator

On my laptop, 1e3 passes, 2e3 fails, 3e3 and higher pass. I guess there are some edge cases for bounds

Collaborator Author

For me locally, 1e3, 2e3, and 3e3 all pass this test just fine. It seems like something very sensitive to the local setup is going on?

Collaborator

Did you use the new versions of numpy and scipy? I tested this in a new environment with exactly the same setup as this PR

Collaborator Author

oh no I did not use the new versions

Collaborator Author

on stellar with Python 3.14 and scipy 1.17 and numpy 2.3.5 all three seem to pass for me. I did not check the rest of the environment for the PR though

Collaborator

maybe numpy 2.4.2 has some changes?

YigitElma previously approved these changes Feb 13, 2026
Collaborator

@YigitElma YigitElma left a comment

Someone else should also review

@dpanici
Collaborator Author

dpanici commented Feb 16, 2026

Add a python-version file to use 3.13; setup.py will still have 3.10 until we discontinue support for it at its end of life

Collaborator Author

@dpanici dpanici left a comment

I can't approve since I made the main changes, but I would: this looks good to me and hopefully fixes dependabot now

@dpanici dpanici requested a review from YigitElma February 18, 2026 15:19
@YigitElma YigitElma requested a review from f0uriest February 18, 2026 16:37

Development

Successfully merging this pull request may close these issues.

support Python 3.14
