ENH: can we have musl-aarch64 prebuilt binary as well? #24934

Closed
tuananh opened this issue Oct 16, 2023 · 13 comments

Comments

@tuananh

tuananh commented Oct 16, 2023

Proposed new feature or change:

I'm noticing that even with the latest release, we don't have a musl-aarch64 wheel; only amd64.

Any technical reason for that?

@andyfaff
Member

No technical reasons. However, CI resources for building that target will be a factor. We tend to focus on the most popular targets for pre-built binaries. It's not clear how widespread musllinux_aarch64 is.
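
For context, one way to check which prebuilt binaries exist for a given platform tag is to ask pip to resolve wheels only (no sdist fallback). This is only an illustrative check; the tag and Python version below are example values:

# Succeeds when a matching wheel is published; fails with
# "No matching distribution found" when it is not (as for musllinux aarch64 here).
pip download numpy \
    --only-binary=:all: \
    --platform musllinux_1_1_aarch64 \
    --python-version 3.11 \
    --implementation cp \
    -d ./wheels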

@tuananh
Author

tuananh commented Oct 16, 2023

Multi-arch images are becoming the trend, so I think the demand for this will grow. Can I keep this issue open to gauge interest?

@rgommers
Member

I've indeed seen more frequent mentions of multi-arch images, and Docker seems to be making a push for those (see, e.g., this intro blog post). It's not yet clear to me whether any common groups of platform tags used together are emerging - maybe through higher-level build tooling. @tuananh what's the set of platforms you are trying to build for?

@tuananh
Author

tuananh commented Oct 17, 2023


We have a mixed-arch cluster at work and build multi-arch images by default (amd64 and arm64).
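
For reference, a typical multi-arch build with Docker Buildx looks roughly like this; the image name is a placeholder and the exact invocation differs per project:

# Build and push one manifest covering both architectures via Docker Buildx.
docker buildx build \
    --platform linux/amd64,linux/arm64 \
    -t registry.example.com/myapp:latest \
    --push .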

@rgommers
Member

Thanks. And the musl part is a custom choice? Or you're building both with glibc and musl?

@tuananh
Author

tuananh commented Oct 17, 2023


Some of our images use musl (Alpine) and some use glibc (Wolfi and others), so it would be nice to have both.

@andyfaff
Member

Just to give a clearer explanation of the cost:

  • Maintaining more wheels requires extra time and effort from the developers.
  • Building musllinux_aarch64 natively requires Cirrus CI, which comes at a monetary cost to the NumPy project (see the sketch after this list for the kind of wheel selection involved). It helps if companies are able to provide sponsorship to offset that cost.
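
A rough sketch of that wheel selection, using cibuildwheel's standard CIBW_* settings; this is illustrative only, not NumPy's actual CI configuration:

# Build only musllinux aarch64 wheels, natively on an arm64 runner
# (e.g. a Cirrus CI ARM container); the selectors here are example values.
export CIBW_BUILD="*-musllinux_aarch64"
export CIBW_ARCHS="aarch64"
python -m pip install cibuildwheel
python -m cibuildwheel --platform linux --output-dir wheelhouse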

@der-eismann

Same here, we normally use Alpine-based containers, but our colleagues on Apple Silicon machines can't build these images because of the missing musllinux wheels. Would be great if they could be added, since the difference in image size can be huge (127 MB on python-alpine vs 204 MB on python-slim).

@rgommers
Member

Thanks @der-eismann, that use case makes sense. I hadn't thought about this before, but Apple switching to arm64 makes musllinux more relevant for the "a Docker recipe should run the same across Linux, macOS and Windows machines" kind of requirement (and Windows on Arm may add to that).

I will note that on macOS there should be a workaround right now (from this SO question) - slower but should make things work:

export DOCKER_DEFAULT_PLATFORM=linux/amd64
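
The same thing can also be set per command rather than globally, e.g. (image name is a placeholder):

docker build --platform linux/amd64 -t myimage .
docker run --platform linux/amd64 myimage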

@der-eismann

Hey @rgommers, you're right, that workaround does help a bit, but really only for building the image. Actually running numpy workloads in an emulated environment is incredibly slow - we're talking about a factor of 20-100. So I understand that pushing this additional wheel costs money, but it would be highly appreciated πŸ™‚

@rgommers
Member

It seems like the case for this is reasonable, and it wouldn't be too hard to do (assuming that there aren't unexpected test failures of course) - WDYT about starting to add these wheels @andyfaff, @charris, @mattip?

@andyfaff
Member

Straightforward, I'll get onto it. It'll be on Cirrus.

@rgommers
Member

These are up on https://anaconda.org/scientific-python-nightly-wheels/numpy/files, so I'll close this issue.

Thanks for adding them @andyfaff, and for the input everyone else.
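
For anyone who wants to try them before the next release, installing from that nightly index should look roughly like this (index URL corresponds to the anaconda.org channel linked above; availability of specific wheels may vary):

pip install --pre --upgrade \
    --extra-index-url https://pypi.anaconda.org/scientific-python-nightly-wheels/simple \
    numpy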
