"""Generates and compiles Python gRPC stubs from proto_library rules."""
load(
"//bazel:protobuf.bzl",
"get_include_directory",
"get_plugin_args",
"protos_from_context",
"includes_from_deps",
"get_proto_arguments",
"declare_out_files",
"get_out_dir",
)
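# Name templates for protoc outputs: foo.proto yields foo_pb2.py (message
# classes) and foo_pb2_grpc.py (gRPC stubs).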
_GENERATED_PROTO_FORMAT = "{}_pb2.py"
_GENERATED_GRPC_PROTO_FORMAT = "{}_pb2_grpc.py"
def _generate_py_impl(context):
protos = protos_from_context(context)
includes = includes_from_deps(context.attr.deps)
out_files = declare_out_files(protos, context, _GENERATED_PROTO_FORMAT)
tools = [context.executable._protoc]
out_dir = get_out_dir(protos, context)
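# Assemble the protoc command line: write the generated sources into the
# computed output directory and add each dependency's include directory plus
# the genfiles root to the proto search path.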
arguments = ([
"--python_out={}".format(out_dir.path),
] + [
"--proto_path={}".format(get_include_directory(i))
for i in includes
] + [
"--proto_path={}".format(context.genfiles_dir.path),
])
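# If a custom protoc plugin was supplied, add the corresponding protoc
# arguments and make the plugin executable available to the action.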
if context.attr.plugin:
arguments += get_plugin_args(
context.executable.plugin,
[],
out_dir.path,
False,
context.attr.plugin.label.name
)
tools.append(context.executable.plugin)
arguments += get_proto_arguments(protos, context.genfiles_dir.path)
context.actions.run(
inputs = protos + includes,
tools = tools,
outputs = out_files,
executable = context.executable._protoc,
arguments = arguments,
mnemonic = "ProtocInvocation",
)
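# When the generated files live under a _virtual_imports directory (i.e. the
# proto_library used import_prefix/strip_import_prefix), expose that directory
# on the Python import path of dependent targets.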
imports = []
if out_dir.import_path:
imports.append("__main__/%s" % out_dir.import_path)
return [
DefaultInfo(files = depset(direct = out_files)),
PyInfo(
transitive_sources = depset(),
imports = depset(direct = imports),
),
]
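# Private rule wrapped by py_proto_library below: invokes protoc once to
# produce the *_pb2.py message modules for a single proto_library dependency.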
_generate_pb2_src = rule(
attrs = {
"deps": attr.label_list(
mandatory = True,
allow_empty = False,
providers = [ProtoInfo],
),
"plugin": attr.label(
mandatory = False,
executable = True,
providers = ["files_to_run"],
cfg = "host",
),
"_protoc": attr.label(
default = Label("//external:protocol_compiler"),
providers = ["files_to_run"],
executable = True,
cfg = "host",
),
},
implementation = _generate_py_impl,
)
def py_proto_library(
name,
deps,
plugin = None,
**kwargs):
"""Generate python code for a protobuf.
Args:
name: The name of the target.
deps: A list of proto_library dependencies. Must contain a single element.
plugin: An optional custom protoc plugin to run in addition to the
standard protobuf code generation.
**kwargs: Additional arguments to be supplied to the invocation of
py_library.
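Example (illustrative; the target names below are hypothetical):

    py_proto_library(
        name = "helloworld_py_pb2",
        deps = [":helloworld_proto"],
    )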
"""
codegen_target = "_{}_codegen".format(name)
if len(deps) != 1:
fail("Can only compile a single proto at a time.")
_generate_pb2_src(
name = codegen_target,
deps = deps,
plugin = plugin,
**kwargs
)
native.py_library(
name = name,
srcs = [":{}".format(codegen_target)],
deps = [
"@com_google_protobuf//:protobuf_python",
":{}".format(codegen_target),
],
**kwargs
)
def _generate_pb2_grpc_src_impl(context):
protos = protos_from_context(context)
includes = includes_from_deps(context.attr.deps)
out_files = declare_out_files(protos, context, _GENERATED_GRPC_PROTO_FORMAT)
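# Flags for the gRPC Python plugin: the "grpc_2_0" code-generation mode plus
# any caller-provided prefixes to strip from imported *_pb2 module names.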
plugin_flags = ["grpc_2_0"] + context.attr.strip_prefixes
arguments = []
tools = [context.executable._protoc, context.executable._grpc_plugin]
out_dir = get_out_dir(protos, context)
arguments += get_plugin_args(
context.executable._grpc_plugin,
plugin_flags,
out_dir.path,
False,
)
if context.attr.plugin:
arguments += get_plugin_args(
context.executable.plugin,
[],
out_dir.path,
False,
context.attr.plugin.label.name
)
tools.append(context.executable.plugin)
arguments += [
"--proto_path={}".format(get_include_directory(i))
for i in includes
]
arguments += ["--proto_path={}".format(context.genfiles_dir.path)]
arguments += get_proto_arguments(protos, context.genfiles_dir.path)
context.actions.run(
inputs = protos + includes,
tools = tools,
outputs = out_files,
executable = context.executable._protoc,
arguments = arguments,
mnemonic = "ProtocInvocation",
)
imports = []
if out_dir.import_path:
imports.append("__main__/%s" % out_dir.import_path)
return [
DefaultInfo(files = depset(direct = out_files)),
PyInfo(
transitive_sources = depset(),
imports = depset(direct = imports),
),
]
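# Private rule wrapped by py_grpc_library below: invokes protoc with the gRPC
# Python plugin to produce the *_pb2_grpc.py service modules.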
_generate_pb2_grpc_src = rule(
attrs = {
"deps": attr.label_list(
mandatory = True,
allow_empty = False,
providers = [ProtoInfo],
),
"strip_prefixes": attr.string_list(),
"plugin": attr.label(
mandatory = False,
executable = True,
providers = ["files_to_run"],
cfg = "host",
),
"_grpc_plugin": attr.label(
executable = True,
providers = ["files_to_run"],
cfg = "host",
default = Label("//src/compiler:grpc_python_plugin"),
),
"_protoc": attr.label(
executable = True,
providers = ["files_to_run"],
cfg = "host",
default = Label("//external:protocol_compiler"),
),
},
implementation = _generate_pb2_grpc_src_impl,
)
def py_grpc_library(
name,
srcs,
deps,
plugin = None,
strip_prefixes = [],
**kwargs):
"""Generate python code for gRPC services defined in a protobuf.
Args:
name: The name of the target.
srcs: (List of `labels`) a single proto_library target containing the
schema of the service.
deps: (List of `labels`) a single py_proto_library target for the
proto_library in `srcs`.
strip_prefixes: (List of `strings`) If provided, each of these prefixes
is stripped from the beginning of the foo_pb2 module names imported by
the generated stubs. This is useful in combination with the `imports`
attribute of the `py_library` rule.
plugin: An optional custom protoc plugin to run in addition to the
standard gRPC code generation.
**kwargs: Additional arguments to be supplied to the invocation of
py_library.
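Example (illustrative; the target names below are hypothetical and assume
a matching py_proto_library target exists):

    py_grpc_library(
        name = "helloworld_py_pb2_grpc",
        srcs = [":helloworld_proto"],
        deps = [":helloworld_py_pb2"],
    )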
"""
codegen_grpc_target = "_{}_grpc_codegen".format(name)
if len(srcs) != 1:
fail("Can only compile a single proto at a time.")
if len(deps) != 1:
fail("Deps must have length 1.")
_generate_pb2_grpc_src(
name = codegen_grpc_target,
deps = srcs,
strip_prefixes = strip_prefixes,
plugin = plugin,
**kwargs
)
native.py_library(
name = name,
srcs = [
":{}".format(codegen_grpc_target),
],
deps = [
Label("//src/python/grpcio/grpc:grpcio"),
] + deps + [
":{}".format(codegen_grpc_target)
],
**kwargs
)
def py2and3_test(name,
py_test = native.py_test,
**kwargs):
"""Runs a Python test under both Python 2 and Python 3.
Args:
name: The name of the test.
py_test: The rule to use for each test.
**kwargs: Keyword arguments passed directly to the underlying py_test
rule.
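Example (illustrative; the test name and source file are hypothetical):

    py2and3_test(
        name = "server_test",
        srcs = ["server_test.py"],
        main = "server_test.py",
    )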
"""
if "python_version" in kwargs:
fail("Cannot specify 'python_version' in py2and3_test.")
names = [name + suffix for suffix in (".python2", ".python3")]
python_versions = ["PY2", "PY3"]
for case_name, python_version in zip(names, python_versions):
py_test(
name = case_name,
python_version = python_version,
**kwargs
)
suite_kwargs = {}
if "visibility" in kwargs:
suite_kwargs["visibility"] = kwargs["visibility"]
native.test_suite(
name = name,
tests = names,
**suite_kwargs
)