Cross compilation workaround #3332

Unanswered
asimeski-dj asked this question in Q&A

Hi folks,

We are using Bazel to build our application and are running into trouble with cross-compilation of our Python projects. Specifically, we are trying to set up Bazel so that we can run the projects on macOS and also build Debian-based Docker containers locally.

For reference, we are on Bazel version 7.6 and rules_python 1.6.3

I have read through #260 and the related issues, as well as the docs at:

https://rules-python.readthedocs.io/en/latest/howto/multi-platform-pypi-deps.html
https://rules-python.readthedocs.io/en/latest/pypi/download.html#bazel-downloader-and-multi-platform-wheel-hub-repository
https://rules-python.readthedocs.io/en/latest/api/rules_python/python/extensions/pip.html#pip.parse

The issue: when building containers, the Python interpreter is correctly chosen (Linux aarch64), but the dependency wheels are built for macOS.

The command we use to build the container:
bazel build //services/service:service_oci_image --platforms=@toolchains_llvm//platforms:linux-aarch64 --extra_toolchains=@llvm_toolchain_cc//:cc-toolchain-aarch64-linux

Load it:
bazel run //:service_load --platforms=@toolchains_llvm//platforms:linux-aarch64

And finally run it:
docker run service:latest

This fails due to the wrong wheel being chosen (in the container):
find /services/service -type f -name "*_pydantic_core*.so"
/services/service/service_bin.runfiles/rules_python~~pip~pip_312_pydantic_core/site-packages/pydantic_core/_pydantic_core.cpython-312-darwin.so
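The -darwin suffix in that filename is the CPython extension-module platform tag (PEP 3149), which is why the macOS build cannot load on Linux. A stdlib-only sketch of how to classify such suffixes (the helper name is mine, purely illustrative, not part of the build):

```python
# Illustrative: extract the platform tag from a CPython extension-module name.
# Filenames are the ones seen in the container; the helper is hypothetical.
def platform_of_ext_suffix(filename: str) -> str:
    """Return the platform part of a name like 'x.cpython-312-darwin.so'."""
    stem = filename.rsplit(".so", 1)[0]   # drop the .so extension
    tag = stem.split(".cpython-", 1)[1]   # e.g. '312-darwin'
    return tag.split("-", 1)[1]           # e.g. 'darwin'

print(platform_of_ext_suffix("_pydantic_core.cpython-312-darwin.so"))
# → darwin (a macOS build, unusable in the Linux container)
print(platform_of_ext_suffix("_pydantic_core.cpython-312-aarch64-linux-gnu.so"))
# → aarch64-linux-gnu (what the container actually needs)
```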

Here's what we do:

MODULE.bazel

PYTHON_VERSION = "3.12.11"

pip = use_extension("@rules_python//python/extensions:pip.bzl", "pip")
pip.parse(
    hub_name = "pip",
    python_version = PYTHON_VERSION,
    requirements_lock = "//:requirements_lock.txt",
    requirements_by_platform = {
        "//:requirements_lock.txt": "linux_aarch64",
        "//:requirements_lock_darwin.txt": "osx_x86_64",
    },
)
use_repo(pip, "pip")

BUILD.bazel

oci_load(
    name = "service_load",
    image = "//services/service:service_oci_image",
    repo_tags = ["service:latest"],
    visibility = ["//visibility:public"],
)

services/service/BUILD.bazel

py_binary(
    name = "service_bin",
    srcs = ["__main__.py"],
    imports = ["../.."],
    main = "__main__.py",
    visibility = ["//:__subpackages__"],
    deps = [
        ...
        "@pip//pydantic",
    ],
)

PACKAGES = [
    #
]

py_oci_image(
    name = "service_oci_image",
    base = "@debian12",
    binary = ":service_bin",
    entrypoint = [
        "/bin/bash",
        "/services/service/service_bin",
    ],
    tars = select(
        {
            "@platforms//cpu:arm64": ["%s/arm64" % package for package in PACKAGES],
            "@platforms//cpu:aarch64": ["%s/arm64" % package for package in PACKAGES],
        },
    ),
    visibility = ["//:__subpackages__"],
)

Our requirements files are compiled on the respective platforms and contain package hashes.

Is there an interim solution we can use until #260 is complete?

Additionally, it would be nice if the solution played well with Gazelle.

Thank you!


Replies: 2 comments 2 replies


pip.parse setting both requirements_lock and requirements_by_platform

I'm not sure this is supported / well-defined behavior? @aignas

The requirements_lock attribute doesn't have a good understanding of multi-platform-ness. I think if there are env markers in the file it's doable, but if not, then it's not.

Useful debugging information: the output of bazel query --output=build @pip//... (this will be huge, so please attach it as a file). It will show the aliases and select()s used to resolve targets to the backing implementation.


The issue that is tracking this work is #260. Unless you use our experimental feature enabled via experimental_index_url, the docker images will not build correctly unless the host platform matches the docker target platform.

Using requirements_lock and requirements_by_platform together is OK: we fall back to requirements_lock for any platforms not covered by requirements_by_platform, so in this particular case it may be a no-op.
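If that fallback is the intent, the configuration above could equivalently list every platform explicitly and drop requirements_lock; a sketch, assuming the same two lock files (the linux_x86_64 entry is my addition for completeness):

```starlark
# MODULE.bazel (sketch): same behavior with the fallback made explicit.
# Any platform not listed here previously fell back to requirements_lock.
pip.parse(
    hub_name = "pip",
    python_version = PYTHON_VERSION,
    requirements_by_platform = {
        # Linux lock file, used for the container target platform.
        "//:requirements_lock.txt": "linux_aarch64,linux_x86_64",
        # Darwin lock file, used for local development on macOS.
        "//:requirements_lock_darwin.txt": "osx_aarch64,osx_x86_64",
    },
)
```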


Thank you for the response! experimental_index_url seems to work only when there is a prebuilt wheel for the platform, not when the wheel has to be built from an sdist. For example, pydantic_core has a prebuilt wheel for Linux aarch64, so it gets downloaded:

root@6ec877773043:/# find . -name "*pydantic_core*"
...
./services/service/service_bin.runfiles/rules_python~~pip~pip_312_pydantic_core_cp312_cp312_manylinux_2_17_aarch64_4e612061/site-packages/pydantic_core/_pydantic_core.cpython-312-aarch64-linux-gnu.so
...

But when no wheel is published for the target platform, it seems to fall back to building the sdist for macOS. For example, the jsonnet package:

root@6ec877773043:/# find . -name "*jsonnet*"
...
./services/service/service_bin.runfiles/rules_python~~pip~pip_312_jsonnet_sdist_7fe2865e/site-packages/_jsonnet.cpython-312-darwin.so
...

This is the current shape of pip.parse:

pip.parse(
    hub_name = "pip",
    python_version = PYTHON_VERSION,
    requirements_by_platform = {
        "//:requirements_lock.txt": "linux_aarch64",
        "//:requirements_lock_darwin.txt": "osx_aarch64,osx_x86_64",
    },
    experimental_index_url = "https://pypi.org/simple",
)

Is there any advice you can offer? Any logs I can provide you with?


This feature of cross-building sdists is tracked in #2410. Please see the workarounds documented there.
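One common pattern until that lands (my assumption, not an official recommendation from #2410) is to pre-build wheels for sdist-only packages on the target platform and serve them from an internal index alongside PyPI; index.internal.example is a placeholder, and the experimental_extra_index_urls attribute should be verified against your rules_python version's pip.parse API docs:

```starlark
# Sketch: resolve from an internal index hosting pre-built linux_aarch64
# wheels (e.g. for jsonnet), with PyPI covering everything else.
pip.parse(
    hub_name = "pip",
    python_version = PYTHON_VERSION,
    requirements_by_platform = {
        "//:requirements_lock.txt": "linux_aarch64",
        "//:requirements_lock_darwin.txt": "osx_aarch64,osx_x86_64",
    },
    experimental_index_url = "https://pypi.org/simple",
    # Placeholder URL; attribute availability depends on rules_python version.
    experimental_extra_index_urls = ["https://index.internal.example/simple"],
)
```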
