Using the presence of the `*_pb2_grpc` module, as opposed to the absence
of the build script (the `*_commands` module), is a problematic choice:
even if a generated file is present, the test infrastructure may want to
regenerate it under a different environment (e.g. a different
Python/proto package version). This change ensures that the protos
always get recompiled whenever a `*_commands` module is present,
signaling that we are in a build environment, thereby making the process
hermetic.
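A minimal sketch of the detection logic described above, assuming
illustrative module names (the real `*_commands` modules live in the
package source trees and are not named exactly as here):

```python
# Hedged sketch: key off the presence of the *_commands module, which
# ships only in the source tree, rather than the absence of generated
# *_pb2_grpc code.
try:
    import health_commands  # illustrative *_commands build module
except ImportError:
    health_commands = None

if health_commands is not None:
    # Build environment detected: always regenerate the *_pb2 and
    # *_pb2_grpc modules so the build stays hermetic.
    from grpc_tools import command
    command.build_package_protos('.')
```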
Previously, a secure server was configured with SSL credentials during
initialization, and those credentials were used for the lifetime of the
server. If the user wanted the server to use new credentials, the user
had to restart the server, resulting in server downtime. This change
enables the user to optionally configure the server with a "certificate
config fetcher," such that on every new client connection the server
will call the config fetcher before performing the handshake, allowing
the user application to optionally specify new certificate configuration
for the server to use (the fetcher can return "no change," in which case
the server continues to use its current certificate configuration).
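For illustration, a sketch of how this can look from the Python surface
API (assuming the grpc.dynamic_ssl_server_credentials and
grpc.ssl_server_certificate_configuration entry points; the file paths
are placeholders):

```python
import grpc

def _current_certificate_configuration():
    # Read the (possibly rotated) key/cert pair from disk.
    with open('server.key', 'rb') as key_file:
        private_key = key_file.read()
    with open('server.crt', 'rb') as chain_file:
        certificate_chain = chain_file.read()
    return grpc.ssl_server_certificate_configuration(
        ((private_key, certificate_chain),))

def _certificate_configuration_fetcher():
    # Called before every handshake. Returning a new configuration makes
    # the server pick it up; returning None means "no change".
    return _current_certificate_configuration()

server_credentials = grpc.dynamic_ssl_server_credentials(
    _current_certificate_configuration(),
    _certificate_configuration_fetcher,
    require_client_authentication=False)
```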
The previous packaging structure exhibited strangely slow behavior
when using pip to install grpcio-reflection or grpcio-health-checking
on a single command line with grpcio-tools.
The root cause seems to be the complicated interaction between
pip and setuptools and the fact that we ship a single .tar.gz
"source" archive for the `grpcio_reflection` and
`grpcio_health_checking` packages. `pip` tries to build this
"source" package, and our build process wants to generate
code for the `.proto` files in the package. However, we have
already processed the `.proto` files into `_pb2.py` files in
our artifact build process, and installing `grpcio_tools`
to get `grpcio_{reflection,health_checking}` seems excessive.
The behavior gets worse since `setuptools`, while building
the package from source, tries to fetch `grpcio_tools` from
source and build that too. This takes a while, since it
involves compiling a bunch of native code from `protobuf` and
`grpc` and requires a C compiler to boot.
This commit modifies the Python artifacts for the two packages so
that they do not include the raw `.proto` files in the distribution
uploaded to PyPI, nor do they contain the Python module that performs
the preprocessing code generation from the respective `.proto` files.
Instead, a specific code path is taken when the generated `_pb2_grpc`
Python module is not present in the package; that path provides the
code-generation functionality when building from the gRPC git
repository (and hence when building on our CI infrastructure).
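A hedged sketch of that fallback code path (the module and file names
here are illustrative, not the exact ones in the repository):

```python
# If the generated _pb2_grpc module is missing, we are building from the
# git tree, so generate it from the raw .proto file.
try:
    from grpc_reflection.v1alpha import reflection_pb2_grpc  # noqa: F401
    _needs_codegen = False
except ImportError:
    _needs_codegen = True

if _needs_codegen:
    from grpc_tools import protoc
    protoc.main([
        '',  # argv[0] placeholder
        '-I.',
        '--python_out=.',
        '--grpc_python_out=.',
        'grpc_reflection/v1alpha/reflection.proto',
    ])
```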
This commit is a rebase of an old contribution with minor formatting
edits (cf. https://github.com/grpc/grpc/pull/8663).
[Original Author]
Masood Malekghassemi <atash@google.com>
Date: Mon Nov 7 14:49:09 2016 -0800
The contributor is a Xoogler and the contribution is owned by Google
Inc. per the copyright assignment agreement with the original author,
since the work was drafted during their employment with Google Inc.
Beta code elements are not generated at all in _pb2_grpc.py files.
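For illustration, the rough shape of a generated `_pb2_grpc.py` module
after this change (the `Example` service and method names are
hypothetical); note the absence of any `beta_create_Example_server`- or
`beta_create_Example_stub`-style elements:

```python
import grpc

class ExampleStub(object):
    # Client-side stub: one callable per RPC method.
    def __init__(self, channel):
        self.UnaryCall = channel.unary_unary('/pkg.Example/UnaryCall')

class ExampleServicer(object):
    # Server-side service definition; methods are overridden by users.
    def UnaryCall(self, request, context):
        context.set_code(grpc.StatusCode.UNIMPLEMENTED)
        context.set_details('Method not implemented!')
        raise NotImplementedError('Method not implemented!')

def add_ExampleServicer_to_server(servicer, server):
    rpc_method_handlers = {
        'UnaryCall': grpc.unary_unary_rpc_method_handler(servicer.UnaryCall),
    }
    generic_handler = grpc.method_handlers_generic_handler(
        'pkg.Example', rpc_method_handlers)
    server.add_generic_rpc_handlers((generic_handler,))
```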
This duplicates a lot of the in-test code generation done in
_split_definitions_test. In a future clean-up we may want to
deduplicate the common behavior, put it in a module available to all
other tests, and do all of our testing of generated code with in-test
code generation.
I made this mistake in 2010985ab2, but
only with yesterday's release of six 1.11.0 did it start failing
("TypeError: metaclass conflict: the metaclass of a derived class must
be a (non-strict) subclass of the metaclasses of all its bases").
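For context, a minimal reproduction of this class of error (the classes
here are illustrative, not the ones actually involved):

```python
import abc

import six

class _CustomMeta(type):
    pass

class Base(six.with_metaclass(_CustomMeta)):
    pass

# Raises "TypeError: metaclass conflict ..." because abc.ABCMeta is not
# a (non-strict) subclass of _CustomMeta, the metaclass of Base.
class Derived(six.with_metaclass(abc.ABCMeta, Base)):
    pass
```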
We were mistaken before when we tested _pb2.py files being generated
in one directory and _pb2_grpc.py files being generated in another
directory. Sure, that was a thing our code generator could do, but
because of the way paths and packages work in Python it wasn't a
realistic use case for actual users.
This test now tests _pb2.py files and _pb2_grpc.py files being
generated either together or independently of one another, and if
independently, in either order. Looking forward to an eventual
py_grpc_library Bazel rule, that is the behavior that actually matters.
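Concretely, the generation modes under test look roughly like the
following grpc_tools invocations (file and directory names are
hypothetical):

```python
from grpc_tools import protoc

# Generate service_pb2.py and service_pb2_grpc.py together.
protoc.main(['', '-I.', '--python_out=gen', '--grpc_python_out=gen',
             'service.proto'])

# Or generate them independently, in either order.
protoc.main(['', '-I.', '--python_out=gen', 'service.proto'])
protoc.main(['', '-I.', '--grpc_python_out=gen', 'service.proto'])
```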
- Move it out of the "unit" package, as it's not itself a unit test.
- Suffix the test class with "Test" as we do with every other subclass
of unittest.TestCase.
- Add a larger-than-we'll-need-any-time-soon maxDiff so that failures
are fully described.
- Relax the assertion from assertListEqual to assertSequenceEqual,
since we don't actually care whether or not the sequences being
compared are list instances.
- Change the order of the assertion arguments to match the
"<expected>, <actual>" convention used in our assert*Equal calls
elsewhere throughout the test corpus (see the sketch after this list).
- Internal implementation simplifications.
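A hedged sketch of the resulting test shape (the class name and values
are hypothetical):

```python
import unittest

class SplitDefinitionsTest(unittest.TestCase):
    # Larger than we'll need any time soon, so that failures are fully
    # described.
    maxDiff = 65536

    def test_sequences_match(self):
        expected = ('alpha', 'beta')
        actual = tuple(sorted(['beta', 'alpha']))
        # assertSequenceEqual: list-ness doesn't matter, and arguments
        # follow the "<expected>, <actual>" convention.
        self.assertSequenceEqual(expected, actual)

if __name__ == '__main__':
    unittest.main()
```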
Were this not done, these would break when the default behavior of the
gRPC Python protoc plug-in is changed to be the
put-gRPC-code-elements-only-in-_pb2_grpc.py-files behavior that
currently happens only when the grpc_2_0 flag is passed.
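For reference, a sketch of opting into that behavior today (the
grpc_2_0 parameter syntax is an assumption here; file names are
hypothetical):

```python
from grpc_tools import protoc

protoc.main([
    '',  # argv[0] placeholder
    '-I.',
    '--python_out=gen',
    # grpc_2_0: put gRPC code elements only in the _pb2_grpc.py file.
    '--grpc_python_out=grpc_2_0:gen',
    'service.proto',
])
```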
The gRPC Core has two styles for passing metadata: as an integer count
along with a grpc_metadata* pointer, which is used for passing metadata
into the core, and as a grpc_metadata_array, which is used for passing
metadata out of the core. The Cython layer of gRPC Python was using a
single data structure wrapping grpc_metadata_array for both purposes,
but this was complex because the core manages the slices contained in
grpc_metadata_array objects (at least those of which it is aware), so
the Cython layer had to keep track of whether or not the core was aware
of the slices it was using (and it was also defective, leaking slices).
This is solved by realigning with the Cython layer's intended design of
mirroring the gRPC Core API as closely as possible in Python: we use
one structure for passing metadata into the core (what is now called
cygrpc.Metadata) and a second, different structure for receiving
metadata out of the core (what was called cygrpc.Metadata but is now
cygrpc.MetadataArray, reflecting that it wraps the core's
grpc_metadata_array).
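A hedged sketch of the split in plain Python (the class names follow
the commit message, but the bodies are illustrative, not the real
Cython implementation):

```python
class Metadata(object):
    """Metadata passed *into* the core (count + grpc_metadata* style).

    Owns its key/value pairs for its entire lifetime, so ownership of
    the underlying slices never transfers to the core.
    """

    def __init__(self, key_value_pairs):
        self._pairs = tuple(key_value_pairs)

class MetadataArray(object):
    """Metadata received *out of* the core (wraps grpc_metadata_array).

    The core manages the slices it places here, so this wrapper must
    never free them itself; it only reads them out.
    """

    def __init__(self):
        self._pairs = []
```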
All bug fixes should contain added tests preventing regression but this
doesn't because I don't know at this time how to write a does-not-leak
test for Python that fits well into our existing body of tests. Phooey.
Thanks to Dominik Janků (djanku@email.cz) for investigation and an
earlier draft of a solution.