Continues #35412, addressing feedback in https://github.com/grpc/grpc/pull/35412#issuecomment-1875980755
I'm unable to test the workspace with the newly added py_test, so I'm hoping a CI run here will tell me if the test works.
```console
~/grpc/test/distrib/bazel/python$ bazel --nohome_rc --nosystem_rc test //...
Loading:
Loading:
Loading: 0 packages loaded
Analyzing: 37 targets (0 packages loaded, 0 targets configured)
ERROR: /usr/local/foobar/home/sloretz/.cache/bazel/_bazel_sloretz/7f83b4f00f370e7c52a5cc586445673c/external/com_google_protobuf/upb_generator/BUILD:266:21: @com_google_protobuf//upb_generator:protoc-gen-upb_toolchain: no such attribute 'output_files' in 'proto_lang_toolchain' rule
ERROR: /usr/local/foobar/home/sloretz/.cache/bazel/_bazel_sloretz/7f83b4f00f370e7c52a5cc586445673c/external/com_google_protobuf/upb_generator/BUILD:305:21: @com_google_protobuf//upb_generator:protoc-gen-upb_minitable_toolchain: no such attribute 'output_files' in 'proto_lang_toolchain' rule
ERROR: /usr/local/foobar/home/sloretz/.cache/bazel/_bazel_sloretz/7f83b4f00f370e7c52a5cc586445673c/external/com_google_protobuf/upb_generator/BUILD:338:21: @com_google_protobuf//upb_generator:protoc-gen-upbdefs_toolchain: no such attribute 'output_files' in 'proto_lang_toolchain' rule
ERROR: /usr/local/foobar/home/sloretz/.cache/bazel/_bazel_sloretz/7f83b4f00f370e7c52a5cc586445673c/external/com_google_protobuf/upb_generator/BUILD:305:21: Target '@com_google_protobuf//upb_generator:protoc-gen-upb_minitable_stage1' contains an error and its package is in error and referenced by '@com_google_protobuf//upb_generator:protoc-gen-upb_minitable_toolchain'
ERROR: /usr/local/foobar/home/sloretz/.cache/bazel/_bazel_sloretz/7f83b4f00f370e7c52a5cc586445673c/external/com_github_grpc_grpc/src/proto/grpc/gcp/BUILD:19:14: every rule of type proto_library implicitly depends upon the target '@com_google_protobuf//upb_generator:protoc-gen-upb_minitable_toolchain', but this target could not be found because of: Target '@com_google_protobuf//upb_generator:protoc-gen-upb_minitable_toolchain' contains an error and its package is in error
ERROR: Analysis failed
ERROR: Analysis of target '//namespaced/upper/example:no_import_no_strip_py_pb2_grpc' failed; build aborted:
INFO: Elapsed time: 0.171s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded, 0 targets configured)
ERROR: Couldn't start the build. Unable to run tests
```
Closes #35629
PiperOrigin-RevId: 606695865
The current `py_grpc_library` results in the wrong grpc proto Python
code path when grpc is third-party source code in a Bazel project.
This PR should fix it.
Fixes #31011
---------
Co-authored-by: Richard Belleville <rbellevi@google.com>
With some delay, this is a PR for
https://github.com/grpc/grpc/issues/32564 (and previously
https://github.com/grpc/grpc/pull/31791).
I looked into adding a regular `py_test` for this change [as
suggested](https://github.com/grpc/grpc/pull/31791#issuecomment-1423245116)
but I am not aware of any effect that the presence of a .pyi stub file
would have at runtime, nor of any way that type checking in a .py
script would be affected. Stub files are only for use by type checkers
and IDEs. I mean, something like this would work:
```python
import os
import unittest

import helloworld_pb2

class StubFileTest(unittest.TestCase):
    def test_pyi_stub_exists(self):
        py_file = helloworld_pb2.__file__
        pyi_file = py_file + 'i'
        self.assertTrue(os.path.exists(pyi_file))
```
But that seems really hacky to me. Instead, I created a simple rule test
for `py_proto_library` with Bazel Skylib, which tests the declared
outputs of an example `py_proto_library` target. Indirectly, this also
tests that the declared output files are actually generated. Please let
me know if this is sufficient.
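For reference, here is a minimal sketch of what such a Skylib rule test can look like, assuming bazel_skylib's `analysistest` and an example target that declares `helloworld_pb2.py` and `helloworld_pb2.pyi`; the names are illustrative, not the exact test added in this PR:
```bzl
load("@bazel_skylib//lib:unittest.bzl", "analysistest", "asserts")

def _declared_outputs_test_impl(ctx):
    env = analysistest.begin(ctx)
    target_under_test = analysistest.target_under_test(env)
    # Collect the basenames of every file the target declares as output.
    outputs = [f.basename for f in target_under_test[DefaultInfo].files.to_list()]
    asserts.true(env, "helloworld_pb2.py" in outputs)
    asserts.true(env, "helloworld_pb2.pyi" in outputs)
    return analysistest.end(env)

declared_outputs_test = analysistest.make(_declared_outputs_test_impl)
```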
Change was created by the release automation script. See go/grpc-release
---------
Co-authored-by: Yash Tibrewal <yashkt@google.com>
Co-authored-by: Esun Kim <veblush@google.com>
Co-authored-by: Mark D. Roth <roth@google.com>
Co-authored-by: Yousuk Seung <ysseung@google.com>
Co-authored-by: Craig Tiller <ctiller@google.com>
Co-authored-by: Richard Belleville <rbellevi@google.com>
Co-authored-by: gnossen <gnossen@users.noreply.github.com>
This change moves the logic for computing the package name part of an
import directory from python_rules.bzl to protobuf.bzl, and reinstates
the workspace root part of the directory. The effect is that it is now
possible for a py_proto_library rule to depend on a proto_library in a
different repository whose source files are generated.
* Fix all lint errors in repo.
* Use strict buildifier by default
* Whoops. That file does not exist
* Attempt fix to buildifier invocation
* Add missing copyright
* Move most py_proto_library macro contents into rule.
The _generate_pb2_src rule is modified to produce a valid PyInfo
provider that can be depended on directly by py_.* rules.
Keeping an intermediate macro in place increases complexity without
adding any real value.
* Move most py_grpc_library macro contents into rule.
The _generate_pb2_grpc_src rule is modified to produce a valid PyInfo
provider that can be depended on directly by py_.* rules.
Keeping an intermediate macro in place increases complexity without
adding any real value.
* Remove support for additional plugins.
It's easy enough for a plugin author to build their own rule set, and
plugin support adds extra unused (and untested) complexity to gRPC's rules.
* Rewrite py_proto_library to track dependencies correctly.
* Propagate Python dependencies of py_grpc_library rules to consumers.
* Add test coverage for aspect-based py_proto_library.
* WIP
* Add gevent test suite run under Bazel.
* Fix things up
* Yapf
* Fix up Bazel files
* Make py_grpc_test fancier
* Attempt to fix Windows RBE
* Attempt to kick GitHub
* Fix Python 2 runs
* Yet more fixes
* And the patch file too
* I am an idiot
* Mark gevent tests flaky
* Try to make rules_python more tolerant
* Typo
* Exclude reconnect test from gevent
* Remove unnecessary parts of patch
* Buildifier
* You saw nothing
* isort
* Move py_grpc_test to an internal-only file
* Review comments
* More reviewer comments
* Review
All Python imports currently are added to the runfiles under __main__,
which is only the default when no workspace name is provided. This
change supports both empty workspace names and specified workspace
names.
This also supports Python libs not anchored at the root.
`proto_library` targets are used as deps for `py_proto_library` and `py_grpc_library` rules. The `proto_library` targets can be configured using `import_prefix` and/or `strip_import_prefix` (which essentially move the original location of the proto file and, as a result, affect the import path within the proto files themselves and in the generated language-specific stubs).
The biggest question to answer when generating stubs from moved protos is where to put the result (this decision affects all downstream rules as well, because the location of a file affects its import path).
This PR tries to follow the same logic as the native `cc_proto_library` (created and maintained by the Bazel team). For example, if we have a `firestore.proto` file which is located under the `google/firestore/v1beta1` Bazel package and want to move it under `google/cloud/firestore_v1beta1/proto` (this is exactly what happens in the googleapis repository), it can be configured the following way:
```bzl
proto_library(
    name = "firestore_moved_proto",
    srcs = ["firestore.proto"],
    import_prefix = "google/cloud/firestore_v1beta1/proto",
    strip_import_prefix = "google/firestore/v1beta1",
)
```
The rule above will first generate virtual `.proto` files (under the new location) and only after that generate a binary descriptor from them.
Specifically, it will generate the following "virtual" file under the `_virtual_imports` subdirectory of the same package:
```
bazel-bin/google/firestore/v1beta1/_virtual_imports/firestore_moved_proto/google/cloud/firestore_v1beta1/proto/firestore.proto
```
When supplied to `cc_proto_library`, like the following:
```bzl
cc_proto_library(
    name = "firestore_moved_cc_proto",
    deps = [":firestore_moved_proto"],
)
```
The rule will generate .cc and .h files like the following:
```
bazel-bin/google/firestore/v1beta1/_virtual_imports/firestore_moved_proto/google/cloud/firestore_v1beta1/proto/firestore.pb.h
bazel-bin/google/firestore/v1beta1/_virtual_imports/firestore_moved_proto/google/cloud/firestore_v1beta1/proto/firestore.pb.cc
```
Notice that each generated `.cc` and `.h` file is prefixed with `_virtual_imports/<name_of_proto_library_target>`.
The Python rules try to do the same thing, so for the following py_proto_library rule:
```bzl
py_proto_library(
    name = "firestore_moved_py_proto",
    deps = [":firestore_moved_proto"],
)
```
It will generate the following file:
```
bazel-bin/google/firestore/v1beta1/_virtual_imports/firestore_moved_proto/google/cloud/firestore_v1beta1/proto/firestore_pb2.py
```
I.e., in the same path as cc_proto_library.
This all works, but an annoying part is that to use the generated library in some other py_library, one needs to specify the `_virtual_imports/<proto_target>` path in its `imports` parameter (and I can't make it better than that).
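For example, a consumer would look something like the following sketch (the library name and source file are hypothetical):
```bzl
py_library(
    name = "firestore_client",
    srcs = ["client.py"],
    # Make the generated *_pb2 modules importable under their moved path.
    imports = ["_virtual_imports/firestore_moved_proto"],
    deps = [":firestore_moved_py_proto"],
)
```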
Another option would be to skip the `_virtual_imports/<name_of_proto_library_target>` suffix for the generated Python stubs, which would make the path look like the following:
```
bazel-bin/google/firestore/v1beta1/google/cloud/firestore_v1beta1/proto/firestore_pb2.py
```
That would make using the generated stubs simpler and cleaner (no need to specify the `imports` argument), but it would make them inconsistent with the other rules (like cc_proto_library) and also more susceptible to naming conflicts (if there is already something under the generated path).
By popular demand, we'll now be offering separate py_grpc_library and
py_proto_library targets sharing the same interface as within google3.
This change necessitated some modifications to how we pull in our own
Python-level dependencies and how we make those available to those
pulling in our project via Bazel.
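As a rough sketch of the resulting interface (target and file names here are illustrative, not taken verbatim from the change):
```bzl
proto_library(
    name = "helloworld_proto",
    srcs = ["helloworld.proto"],
)

# Message stubs (helloworld_pb2).
py_proto_library(
    name = "helloworld_py_pb2",
    deps = [":helloworld_proto"],
)

# Servicer/stub code (helloworld_pb2_grpc), layered on the message stubs.
py_grpc_library(
    name = "helloworld_py_pb2_grpc",
    srcs = [":helloworld_proto"],
    deps = [":helloworld_py_pb2"],
)
```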
There is now a grpc_python_deps() Bazel workspace rule that pulls in the
appropriate dependencies, which should be called from the client
project's WORKSPACE file. A test has been added to the bazel/test/
directory to verify that this behavior works as intended.
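A minimal WORKSPACE sketch of that flow, assuming gRPC has been pulled in under the name `com_github_grpc_grpc` (the http_archive setup is elided):
```bzl
load("@com_github_grpc_grpc//bazel:grpc_deps.bzl", "grpc_deps")

grpc_deps()

load("@com_github_grpc_grpc//bazel:grpc_python_deps.bzl", "grpc_python_deps")

grpc_python_deps()
```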
It's worth noting that the protobuf repository's usage of Starlark
bind() caused a great deal of trouble in ensuring that we could also
pull in six.
This change also required a change in the way generated proto code is
imported in the channelz and health-check modules, as well as in their
associated tests. We were importing them in two different ways, each
relative. This resulted in two different module objects being imported
into the process, which were incompatible. I am not sure exactly what
caused this behavior to begin, as this should have been possible before
this PR. As a workaround, I am simply trying two different absolute
imports and using the one that works. This should function both inside
and outside of Bazel environments.
This commit resolves #18331.
This commit resolves #18256.
This commit resolves... another TODO that apparently didn't have an
associated github issue.
We swap out pubref's implementation of py_proto_library with our own,
which more closely mirrors the interface of the internal
py_proto_library, taking the descriptor file output of a proto_library
rule as input.
One minor change in behavior was introduced for simplicity. When a
py_proto_library depends on a proto_library with a source proto file in
a subdirectory of the Bazel package, the import module of the resultant
Python library will reflect only the Bazel package, *not* the full
directory of the proto file (the Bazel package plus the subdirectories),
as pubref did previously. This behavior also more closely mirrors Google
internal behavior.
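A hypothetical BUILD sketch of the case described above (the names are illustrative):
```bzl
# Package //mypkg, with the proto source in a subdirectory of the package.
proto_library(
    name = "thing_proto",
    srcs = ["protos/thing.proto"],
)

# Per the behavior above, the generated module is imported relative to the
# Bazel package rather than the full source directory; the protos/
# subdirectory is not part of the import path.
py_proto_library(
    name = "thing_py_proto",
    deps = [":thing_proto"],
)
```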
This commit also introduces a slightly more stringent Bazel format
script. Buildifier on its own will not take care of long lines, but by
running yapf first, we end up with a more legible file. At the moment,
there is no sanity check associated with this formatter.