Keying on the presence of the `*_pb2_grpc` module, rather than on
the absence of the build script (the `*_commands` module), is a
problematic choice: even if a generated file is present, the test
infrastructure may want to regenerate it under a different
environment (e.g. a different Python/proto package version).
Checking for the `*_commands` module instead ensures the protos
always get recompiled whenever that module is present, signaling
that we are in a build environment, and thereby makes the process
hermetic.
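
As a rough sketch of the intended check (the module name
`health_commands` and the use of `grpc_tools.command` below are
illustrative assumptions, not the exact wiring in the gRPC tree), a
setup-time hook could key on the build script like this:

```python
import importlib.util


def _in_build_environment():
    # The *_commands build script ships only in the gRPC source tree,
    # never in the PyPI artifact, so its presence signals a build
    # environment in which protos must always be recompiled, even if
    # stale *_pb2_grpc modules are already on disk.
    return importlib.util.find_spec('health_commands') is not None


if _in_build_environment():
    # grpcio-tools provides a helper that walks a package root and
    # compiles every .proto it finds into *_pb2.py / *_pb2_grpc.py.
    from grpc_tools import command
    command.build_package_protos('.')
```
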
The previous packaging structure made pip installs of
grpcio-reflection or grpcio-health-checking surprisingly slow when
they were requested on the same command line as grpcio-tools
(e.g. `pip install grpcio-tools grpcio-reflection`).
The root cause seems to be the complicated interaction between
pip and setuptools and the fact that we ship a single .tar.gz
"source" archive for `grpcio_reflection` and
`grpcio_health_checking` packages. `pip` tries to build this
"source" package, and our build process wants to generate
code for the `.proto` files in the package. However, we have
already processed the `.proto` files into `_pb2.py` files in
our artifact build process, and installing `grpcio_tools`
to get `grpcio_{reflection,health_checking}` seems excessive.
The behavior gets worse since `setuptools`, while building
the package from source, tries to fetch `grpcio_tools` from
source and build that too. This takes a while, since it
involves compiling a bunch of native code from `protobuf` and
`grpc` and requires a C compiler to boot.

This commit modifies the Python artifacts for the two packages so
that they no longer include the raw `.proto` files in the
distribution uploaded to PyPI, nor the Python module that performs
the code generation from those `.proto` files. Instead, a specific
code path is taken when the generated `_pb2_grpc` Python module is
not present in the package, providing that functionality when
building from the gRPC git repository (and hence from our CI
infrastructure).
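
A minimal sketch of that fallback, assuming the
grpcio-health-checking layout (the proto path and the inline use of
`grpc_tools.protoc` are illustrative; the real packages wire this up
through their build scripts):

```python
try:
    # The artifact uploaded to PyPI ships the pre-generated module, so
    # this import succeeds for normal installs and no .proto
    # processing happens at all.
    from grpc_health.v1 import health_pb2_grpc
except ImportError:
    # In the gRPC git checkout (and hence on CI) the generated module
    # is absent, so compile the .proto source on the spot before
    # retrying the import.
    from grpc_tools import protoc
    protoc.main([
        'grpc_tools.protoc',
        '--proto_path=.',
        '--python_out=.',
        '--grpc_python_out=.',
        'grpc_health/v1/health.proto',
    ])
    from grpc_health.v1 import health_pb2_grpc
```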