Building gRPC Python with Bazel has been one of the long-requested
additions to gRPC (#8079). Doing so had been made complex by the fact
that Bazel itself is still in active development.
There has been extensive work on building Cython code at TensorFlow,
which can be reused for gRPC's purposes as well.
Major changes required for building grpcio with Bazel are:
- Include Cython as a third-party Bazel package, to compile the Cython
parts of gRPC Python.
- Include rules for Python autoconfiguration so Python headers can be
detected by cygrpc.
- Include new rules from bazelbuild/rules_python, which allow for
importing pip dependencies with Bazel (see the WORKSPACE sketch after
this list).
- Make the Cython BUILD file visible to the WORKSPACE.
- Use a custom requirements file for Bazel Python; this covers
requirements previously installed only during build steps.
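A minimal WORKSPACE sketch of that wiring, assuming the historical
rules_python API (pip_import/pip_install); the repository name
grpc_python_dependencies and the requirements file path are
illustrative, not necessarily the exact ones in the tree:

    git_repository(
        name = "io_bazel_rules_python",
        remote = "https://github.com/bazelbuild/rules_python.git",
        commit = "...",  # pin a specific rules_python commit here
    )

    load("@io_bazel_rules_python//python:pip.bzl", "pip_import")

    # Import the pip packages listed in the custom requirements file, so
    # that requirements previously installed only during build steps are
    # available as Bazel targets.
    pip_import(
        name = "grpc_python_dependencies",
        requirements = "//:requirements.bazel.txt",
    )

    load("@grpc_python_dependencies//:requirements.bzl", "pip_install")
    pip_install()

Individual packages can then be consumed from BUILD files through the
requirement() helper that the generated requirements.bzl exposes.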
For the iOS build, bazel build needs to be invoked with
--ios_multi_cpus=armv7,arm64. This in turn gets picked up as the cpu,
based on the docs:

    --[no]experimental_enable_objc_cc_deps (default: "true")
    Allows objc_* rules to depend on cc_library and causes any objc
    dependencies to be built with --cpu set to "ios_<--ios_cpu>" for any
    values in --ios_multi_cpus.

This diff explicitly adds darwin config settings for cpu values
ios_x86_64, ios_armv7, ios_armv7s, and ios_arm64. We can't use ios_cpu
as the condition because it is set to x86_64 by default, which breaks
the Linux build.
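A sketch of those explicit settings, assuming they live in a BUILD file
of their own (the package location and anything selected on them are
illustrative):

    config_setting(
        name = "ios_x86_64",
        values = {"cpu": "ios_x86_64"},
    )

    config_setting(
        name = "ios_armv7",
        values = {"cpu": "ios_armv7"},
    )

    config_setting(
        name = "ios_armv7s",
        values = {"cpu": "ios_armv7s"},
    )

    config_setting(
        name = "ios_arm64",
        values = {"cpu": "ios_arm64"},
    )

Targets can then branch on these with select(), mapping each ios_*
setting to darwin-specific sources or copts while //conditions:default
keeps the Linux build unchanged.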
The WORKSPACE file now pulls third_party dependencies directly instead
of requiring a git submodule init and update. The git hashes pulled by
Bazel are exactly the same as those of the git submodules checked into
third_party.
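As an illustration of the pattern (the rule and URL shape are standard;
the zlib example and the commit placeholder stand in for whichever
dependency and pinned submodule hash apply):

    http_archive(
        name = "zlib",
        build_file = "third_party/zlib.BUILD",
        strip_prefix = "zlib-<submodule commit hash>",
        url = "https://github.com/madler/zlib/archive/<submodule commit hash>.tar.gz",
    )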
A tiny bit of complexity comes from the c-ares project's use of
ares_config.h.cmake and ares_build.h.cmake. The current solution to
this problem is to check in a valid ares_build.h and a few valid
per-platform ares_config.h files. The same general approach is still
used, but ares_config.h and ares_build.h are supplied to cares.BUILD
from the only remaining local_repository, which serves only to export
these two header files.
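A sketch of that arrangement, with illustrative names (the exact
repository name, paths, and labels may differ in the tree):

    # WORKSPACE: the sole remaining local repository, which only
    # re-exports the two checked-in headers.
    new_local_repository(
        name = "cares_local_files",
        path = "third_party/cares",
        build_file = "third_party/cares/cares_local_files.BUILD",
    )

    # cares_local_files.BUILD:
    exports_files(["ares_build.h", "ares_config.h"])

    # cares.BUILD: copy each exported header into the external c-ares
    # tree so that its #include lines resolve during compilation (an
    # analogous genrule supplies ares_config.h).
    genrule(
        name = "ares_build_h",
        srcs = ["@cares_local_files//:ares_build.h"],
        outs = ["ares_build.h"],
        cmd = "cp $< $@",
    )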