
BUG: false positive in the heuristic for old Accelerate #25433


Closed
rgommers opened this issue Dec 19, 2023 · 4 comments · Fixed by #25450

@rgommers
Member

It was observed in spack/spack#41742 that this error was triggered despite the numpy build not being linked to Accelerate and no other packages being loaded:

RuntimeError: Polyfit sanity test emitted a warning, most likely due to using a buggy Accelerate backend.
If you compiled yourself, more information is available at:
https://numpy.org/doc/stable/user/building.html#accelerated-blas-lapack-libraries
Otherwise report this to the vendor that provided NumPy.
RuntimeWarning: invalid value encountered in divide

A run of the test suite after disabling the _mac_os_check check in numpy/__init__.py confirmed that the numpy install was basically fine, with the only failures of interest being:

FAILED .../numpy/polynomial/tests/test_classes.py::test_bad_conditioned_fit[Polynomial] - AssertionError: assert 'invalid valu...red in divide' == 'The fit may ...y conditioned'
FAILED .../numpy/polynomial/tests/test_classes.py::test_bad_conditioned_fit[Legendre] - AssertionError: assert 'invalid valu...red in divide' == 'The fit may ...y conditioned'
FAILED .../numpy/polynomial/tests/test_classes.py::test_bad_conditioned_fit[Chebyshev] - AssertionError: assert 'invalid valu...red in divide' == 'The fit may ...y conditioned'
FAILED .../numpy/polynomial/tests/test_classes.py::test_bad_conditioned_fit[Laguerre] - AssertionError: assert 'invalid valu...red in divide' == 'The fit may ...y conditioned'
FAILED .../numpy/polynomial/tests/test_classes.py::test_bad_conditioned_fit[Hermite] - AssertionError: assert 'invalid valu...red in divide' == 'The fit may ...y conditioned'
FAILED .../numpy/polynomial/tests/test_classes.py::test_bad_conditioned_fit[HermiteE] - AssertionError: assert 'invalid valu...red in divide' == 'The fit may ...y conditioned'

The numpy.polynomial tests are separate code from the _mac_os_check heuristic, but both do the same thing and rely on a full-rank check after a call to np.linalg.lstsq. This seems to be sensitive to the particular build config in the linked Spack issue, which is:

  • macOS 12.6.6, arm64
  • Apple Clang 13.1.6
  • OpenBLAS 0.3.24

This is a problem, because the heuristic failing makes NumPy unimportable with a misleading error message.
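For context, the failing check boils down to a pattern like the following. This is a minimal sketch, not NumPy's actual `_mac_os_check` implementation; the function name and the data are illustrative:

```python
import warnings

import numpy as np


def polyfit_sanity_check():
    """Sketch of a polyfit-based sanity check: run a small, well-behaved
    fit and treat *any* emitted warning as a sign of a broken BLAS/LAPACK."""
    x = np.linspace(0, 1, 10)
    y = x ** 3 - x + 1
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        np.polyfit(x, y, deg=3)  # exercises np.linalg.lstsq under the hood
    if caught:
        # This is the false-positive trap: an unrelated RuntimeWarning
        # (e.g. "invalid value encountered in divide") also lands here.
        raise RuntimeError(
            "Polyfit sanity test emitted a warning, most likely due to "
            "using a buggy Accelerate backend."
        )


polyfit_sanity_check()  # expected to pass silently on a healthy install
```

Because the check fires on any warning at all, any build quirk that produces a spurious floating-point warning during the fit is indistinguishable from the Accelerate bug it was written to catch.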

Finally, it's worth pointing out that the lstsq call uses np.errstate to suppress warnings exactly like the one that's making the heuristic fail:

numpy/numpy/linalg/_linalg.py, lines 2466 to 2468 in 0032ede:

    with errstate(call=_raise_linalgerror_lstsq, invalid='call',
                  over='ignore', divide='ignore', under='ignore'):
        x, resids, rank, s = gufunc(a, b, rcond, signature=signature)
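For comparison, here is a minimal standalone example of `np.errstate` suppressing exactly this class of warning (the arrays are illustrative):

```python
import numpy as np

a = np.array([0.0, 1.0])
b = np.array([0.0, 0.0])

# Under the default error policy, 0.0/0.0 emits "RuntimeWarning: invalid
# value encountered in divide" and 1.0/0.0 a divide-by-zero warning.
# Inside errstate, both are silenced.
with np.errstate(invalid='ignore', divide='ignore'):
    result = a / b  # [nan, inf], no warning emitted

print(result)
```

Since `lstsq` already silences these floating-point warnings internally, any `RuntimeWarning` that leaks out of the polyfit check must come from a different code path than the guarded `gufunc` call itself.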

The question is what to do about this. We should probably at least change the hard error to a warning.

@seberg
Member

seberg commented Dec 19, 2023

🤷, not sure if this serves too much purpose anymore. The test is a bit awkward since it isn't super precise (I have no idea what the actual failure is supposed to be).

The error looks a bit more like a problem with division raising spurious FPEs rather than what this was supposed to catch (how come nobody else sees that, though? Do they enable some fastmath?).
OTOH, I am not quite sure, because I am not quite sure how the test worked when it was "successful"...

@rgommers
Member Author

IIRC there were problems in older Apple Clang with floating-point errors, so that could explain it. But it could be many things, hard to untangle.

not sure if this serves too much purpose anymore

Agreed. It's not even clear to me what will still break on which macOS versions. I'd say we try to make the test more specific than "any warning" (I think it can be narrowed down to RankWarning), make it as unlikely as possible to pick up old Accelerate by accident, and convert the error to a warning.
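The suggested narrowing could look roughly like this. A sketch, not the actual patch: instead of failing on any recorded warning, filter by category so that only a RankWarning trips the check.

```python
import warnings

import numpy as np

# RankWarning moved to numpy.exceptions in newer releases; the fallback
# import below is an assumption about which versions need to be covered.
try:
    from numpy.exceptions import RankWarning
except ImportError:
    from numpy import RankWarning

x = np.linspace(0, 1, 10)
y = x ** 3 - x + 1
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    np.polyfit(x, y, deg=3)

# Only a RankWarning points at the old-Accelerate lstsq bug; anything
# else (e.g. a spurious FPE RuntimeWarning) is ignored.
rank_warnings = [w for w in caught if issubclass(w.category, RankWarning)]
if rank_warnings:
    raise RuntimeError("likely a buggy Accelerate backend")
```

With this shape, the spurious "invalid value encountered in divide" warning from the Spack build would no longer make NumPy unimportable.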

@seberg
Member

seberg commented Dec 20, 2023

So the original issue seems to have been a RankWarning. Fair, we can narrow the check to that for sure. The main bugs in Accelerate were of course vastly worse and not about FPEs, but about blatantly wrong results.
I am not sure we actually have any tests which triggered those very reliably, unfortunately.

rgommers added a commit to rgommers/numpy that referenced this issue Dec 21, 2023
This avoids triggering a hard import error on any warning; something
that may be due to other reasons than using old Accelerate (e.g.,
numpygh-25433 is for a specific combination of Clang, macOS and OpenBLAS).

Closes numpygh-25433
@rgommers
Member Author

I think it can be narrowed down to RankWarning, make it as unlikely as possible to pick up old Accelerate by accident, and convert the error to a warning.

I kept the error for the case where a RankWarning gets emitted, for now. This is the first time in well over a year that this check has failed, so I think it'll be okay. If we do get a new report of a false positive on a RankWarning, we should probably delete the check completely.

charris pushed a commit to charris/numpy that referenced this issue Dec 22, 2023