By popular demand, we'll now be offering separate py_grpc_library and
py_proto_library targets sharing the same interface as within google3.
This change necessitated some modifications to how we pull in our own
Python-level dependencies and how we make them available to projects
that consume gRPC via Bazel.
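As a rough sketch, a BUILD file using the new rules might look
something like the following (the load path and attribute names here
are illustrative and may differ slightly from the final rules):

    load(
        "@com_github_grpc_grpc//bazel:python_rules.bzl",
        "py_grpc_library",
        "py_proto_library",
    )

    proto_library(
        name = "helloworld_proto",
        srcs = ["helloworld.proto"],
    )

    # Message code only.
    py_proto_library(
        name = "helloworld_py_pb2",
        deps = [":helloworld_proto"],
    )

    # Service/stub code, layered on top of the message library.
    py_grpc_library(
        name = "helloworld_py_pb2_grpc",
        srcs = [":helloworld_proto"],
        deps = [":helloworld_py_pb2"],
    )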
There is now a grpc_python_deps() Bazel workspace rule that pulls in the
appropriate dependencies, which should be called from the client
project's WORKSPACE file. A test has been added to the bazel/test/
directory to verify that this behavior works as intended.
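Concretely, a consuming project's WORKSPACE might contain something
along these lines (the repository name, version pin, and exact load
paths are illustrative):

    load("@bazel_tools//tools/build_defs/repo:git.bzl", "git_repository")

    git_repository(
        name = "com_github_grpc_grpc",
        remote = "https://github.com/grpc/grpc.git",
        tag = "v1.xx.x",  # placeholder; pin to a concrete release
    )

    # C-core and general build dependencies.
    load("@com_github_grpc_grpc//bazel:grpc_deps.bzl", "grpc_deps")
    grpc_deps()

    # Python-level dependencies (six, etc.) needed by grpcio.
    load("@com_github_grpc_grpc//bazel:grpc_python_deps.bzl", "grpc_python_deps")
    grpc_python_deps()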
It's worth noting that the protobuf repository's usage of Starlark
bind() caused a great deal of trouble in ensuring that we could also
pull in six.
This change also required altering the way generated proto code is
imported in the channelz and health-check modules, as well as in their
associated tests. Previously, we imported those modules in two different
ways, both relative, which resulted in two distinct, incompatible module
objects being loaded into the process. I am not sure exactly what caused
this behavior to begin, as this should have been possible before this
PR. As a workaround, I simply try two different absolute imports and use
the one that works, as sketched below. This functions both inside and
outside of Bazel environments.
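The workaround boils down to a pattern like the one below; the module
paths shown are placeholders rather than the exact ones used in the
channelz and health-checking packages:

    # Placeholder sketch of the dual absolute-import workaround.
    try:
        # Absolute path under which the generated module is visible in
        # one environment (e.g. a pip installation).
        from grpc_channelz.v1 import channelz_pb2
    except ImportError:
        # Fallback absolute path used in the other environment
        # (e.g. under Bazel); this path is hypothetical.
        from src.python.grpcio_channelz.grpc_channelz.v1 import channelz_pb2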
This commit resolves #18331.
This commit resolves #18256.
This commit resolves... another TODO that apparently didn't have an
associated GitHub issue.
We swap out pubref's implementation of py_proto_library for our own,
which more closely mirrors the interface of the internal
py_proto_library and takes the descriptor file output of a
proto_library rule as its input.
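Conceptually, such a rule hands the serialized descriptor set produced
by proto_library to protoc instead of re-parsing the .proto sources. A
heavily simplified sketch follows; it is not the actual implementation,
and the attribute names and details are illustrative:

    # Conceptual sketch only; the real rule also handles import paths,
    # plugins, subdirectory staging, and more.
    def _py_proto_library_impl(ctx):
        proto_info = ctx.attr.deps[0][ProtoInfo]

        # Declare one <name>_pb2.py output per direct .proto source.
        outputs = [
            ctx.actions.declare_file(src.basename[:-len(".proto")] + "_pb2.py")
            for src in proto_info.direct_sources
        ]

        # Feed protoc the descriptor sets rather than raw .proto files.
        args = ctx.actions.args()
        args.add_joined(
            proto_info.transitive_descriptor_sets,
            join_with = ctx.configuration.host_path_separator,
            format_joined = "--descriptor_set_in=%s",
        )
        args.add("--python_out=" + ctx.bin_dir.path)
        args.add_all(proto_info.direct_sources)

        ctx.actions.run(
            executable = ctx.executable._protoc,
            arguments = [args],
            inputs = proto_info.transitive_descriptor_sets,
            outputs = outputs,
            mnemonic = "ProtoPythonCompile",
        )
        return [DefaultInfo(files = depset(outputs))]

    py_proto_library = rule(
        implementation = _py_proto_library_impl,
        attrs = {
            "deps": attr.label_list(providers = [ProtoInfo]),
            "_protoc": attr.label(
                default = Label("@com_google_protobuf//:protoc"),
                executable = True,
                cfg = "exec",
            ),
        },
    )

The important property is that the rule consumes proto_library's
descriptor set output directly, so it composes with ordinary
proto_library targets and their transitive dependencies.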
One minor change in behavior was introduced for simplicity. When a
py_proto_library depends on a proto_library whose source proto file
lives in a subdirectory of the Bazel package, the import module of the
resulting Python library reflects the Bazel package only, *not* the full
directory of the proto file (the package plus its subdirectories), as
pubref's rule did previously. This also more closely mirrors
Google-internal behavior.
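As a purely hypothetical example, given a proto_library in Bazel
package a/b whose source proto is a/b/sub/c.proto:

    # New behavior: the module path reflects the Bazel package.
    from a.b import c_pb2

    # Old pubref behavior: the module path also included the subdirectory.
    # from a.b.sub import c_pb2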
This commit also introduces a slightly more stringent Bazel format
script. Buildifier on its own will not take care of long lines, but by
running yapf first, we end up with a more legible file. At the moment,
there is no sanity check associated with this formatter.
Building gRPC Python with Bazel has been one of the long-requested
additions to gRPC (#8079). Doing so has been made complex by the fact
that Bazel itself is still under active development.
There has been extensive work on building Cython code in the TensorFlow
project, which can be reused for gRPC's purposes as well.
The major changes required for building grpcio with Bazel are:
- Include Cython as a third-party Bazel package, to compile the Cython
parts of gRPC Python.
- Include rules for Python autoconfiguration so that Python headers can
be detected by cygrpc.