Fix minor typos (#37450)

Just thought I'd contribute some typo fixes I stumbled upon. Nothing controversial (hopefully), just 74 simple fixes.

Use the following command to get a quick and dirty summary of the specific corrections made:
```shell
git diff HEAD^! --word-diff-regex='\w+' -U0 \
  | grep -E '\[\-.*\-\]\{\+.*\+\}' \
  | sed -r 's/.*\[\-(.*)\-\]\{\+(.*)\+\}.*/\1 \2/' \
  | sort | uniq -c | sort -n
```
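
For reference, each line of that summary has the form `count old new`; a few illustrative lines (counts taken from the list below, not re-run) would look roughly like:

```
      2 wierdly weirdly
      3 expecially especially
      3 recieves receives
      8 satisifed satisfied
```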

FWIW, the top typos are:
* satisifed (8)
* uncommited (7)
* tranparent (7)
* expecially (3)
* recieves (3)
* correponding (2)
* slighly (2)
* wierdly (2)

Closes #37450

PiperOrigin-RevId: 665459663
Branch: pull/37545/head
Author: Nathan Baulch, committed by Copybara-Service
Parent: 62da6727ae
Commit: ccd250aa70
50 changed files:

1. bazel/grpc_build_system.bzl (6 changed lines)
2. bazel/protobuf.bzl (4 changed lines)
3. doc/environment_variables.md (2 changed lines)
4. doc/grpc_xds_bootstrap_format.md (2 changed lines)
5. doc/service_config.md (2 changed lines)
6. doc/xds-test-descriptions.md (6 changed lines)
7. examples/cpp/xds/README.md (2 changed lines)
8. include/grpc/credentials.h (2 changed lines)
9. include/grpc/event_engine/README.md (2 changed lines)
10. include/grpc/event_engine/internal/slice_cast.h (2 changed lines)
11. include/grpc/grpc_crl_provider.h (2 changed lines)
12. include/grpc/impl/channel_arg_names.h (2 changed lines)
13. include/grpcpp/ext/call_metric_recorder.h (2 changed lines)
14. include/grpcpp/security/credentials.h (4 changed lines)
15. include/grpcpp/server_builder.h (2 changed lines)
16. include/grpcpp/support/byte_buffer.h (2 changed lines)
17. include/grpcpp/support/config.h (2 changed lines)
18. include/grpcpp/support/interceptor.h (2 changed lines)
19. templates/tools/doxygen/Doxyfile.base (4 changed lines)
20. templates/tools/doxygen/Doxyfile.objc.include (4 changed lines)
21. tools/bazelify_tests/build_defs.bzl (18 changed lines)
22. tools/bazelify_tests/grpc_repo_archive.sh (12 changed lines)
23. tools/bazelify_tests/test/BUILD (4 changed lines)
24. tools/bazelify_tests/workspace_status_cmd.sh (2 changed lines)
25. tools/buildgen/plugins/transitive_dependencies.py (4 changed lines)
26. tools/codegen/core/gen_huffman_decompressor.cc (2 changed lines)
27. tools/distrib/check_protobuf_pod_version.sh (2 changed lines)
28. tools/distrib/fix_build_deps.py (2 changed lines)
29. tools/distrib/python/grpcio_tools/setup.py (2 changed lines)
30. tools/distrib/python/xds_protos/build.py (2 changed lines)
31. tools/docker_runners/examples/concurrent_bazel.sh (2 changed lines)
32. tools/docker_runners/examples/coredump_in_docker.sh (2 changed lines)
33. tools/docker_runners/examples/gdb_in_docker.sh (2 changed lines)
34. tools/docker_runners/examples/run_tests_c_in_docker.sh (2 changed lines)
35. tools/dockerfile/push_testing_images.sh (4 changed lines)
36. tools/doxygen/Doxyfile.c++ (4 changed lines)
37. tools/doxygen/Doxyfile.c++.internal (4 changed lines)
38. tools/doxygen/Doxyfile.core (4 changed lines)
39. tools/doxygen/Doxyfile.core.internal (4 changed lines)
40. tools/doxygen/Doxyfile.objc (4 changed lines)
41. tools/doxygen/Doxyfile.objc.internal (4 changed lines)
42. tools/doxygen/Doxyfile.php (4 changed lines)
43. tools/internal_ci/macos/grpc_ios_binary_size.sh (2 changed lines)
44. tools/run_tests/artifacts/build_artifact_ruby.sh (2 changed lines)
45. tools/run_tests/performance/scenario_config.py (2 changed lines)
46. tools/run_tests/python_utils/upload_rbe_results.py (2 changed lines)
47. tools/run_tests/run_grpclb_interop_tests.py (2 changed lines)
48. tools/run_tests/run_performance_tests.py (2 changed lines)
49. tools/run_tests/run_tests.py (2 changed lines)
50. tools/run_tests/run_xds_tests.py (4 changed lines)

@@ -170,7 +170,7 @@ def grpc_cc_library(
 srcs: The source files.
 public_hdrs: The public headers.
 hdrs: The headers.
-external_deps: External depdendencies to be resolved.
+external_deps: External dependencies to be resolved.
 defines: Build defines to use.
 deps: cc_library deps.
 select_deps: deps included conditionally.
@@ -529,7 +529,7 @@ def grpc_cc_test(name, srcs = [], deps = [], external_deps = [], args = [], data
 timeout: The test timeout.
 tags: The tags for the test.
 exec_compatible_with: A list of constraint values that must be
-satisifed for the platform.
+satisfied for the platform.
 exec_properties: A dictionary of strings that will be added to the
 exec_properties of a platform selected for this target.
 shard_count: The number of shards for this test.
@@ -660,7 +660,7 @@ def grpc_sh_test(name, srcs = [], args = [], data = [], uses_polling = True, siz
 timeout: The test timeout.
 tags: The tags for the test.
 exec_compatible_with: A list of constraint values that must be
-satisifed for the platform.
+satisfied for the platform.
 exec_properties: A dictionary of strings that will be added to the
 exec_properties of a platform selected for this target.
 shard_count: The number of shards for this test.

@@ -37,13 +37,13 @@ def well_known_proto_libs():
 return ["@com_google_protobuf//:" + b for b in _WELL_KNOWN_PROTOS_BASE]
 def is_well_known(label):
-# Bazel surfaces labels as their undelying identity, even if they are referenced
+# Bazel surfaces labels as their underlying identity, even if they are referenced
 # via aliases. Bazel also does not currently provide a way to find the real label
 # underlying an alias. So the implementation detail that the WKTs present at the
 # top level of the protobuf repo are actually backed by targets in the
 # //src/google/protobuf package leaks through here.
 # We include both the alias path and the underlying path to be resilient to
-# reversions of this change as well as for continuing compatiblity with repos
+# reversions of this change as well as for continuing compatibility with repos
 # that happen to pull in older versions of protobuf.
 all_wkt_targets = (["@com_google_protobuf//:" + b for b in _WELL_KNOWN_PROTOS_BASE] +
 ["@com_google_protobuf//src/google/protobuf:" + b for b in _WELL_KNOWN_PROTOS_BASE])

@@ -60,7 +60,7 @@ some configuration as environment variables that can be set.
 - NONE - won't log any.
 GRPC_VERBOSITY will set verbosity of absl logging.
 - If the external application sets some other verbosity, then whatever is set later will be honoured.
-- If nothing is set as GRPC_VERBOSITY, then the setting of the exernal application will be honoured.
+- If nothing is set as GRPC_VERBOSITY, then the setting of the external application will be honoured.
 - If nothing is set by the external application also, the default set by absl will be honoured.
 <!-- END_OPEN_SOURCE_DOCUMENTATION -->

@@ -8,7 +8,7 @@ gRPC expects the xDS bootstrap configuration to be specified as a JSON string.
 The xDS bootstrap file location may be specified using the environment variable
 `GRPC_XDS_BOOTSTRAP`. Alternatively, the bootstrap file contents may be
 specified using the environment variable `GRPC_XDS_BOOTSTRAP_CONFIG`. If both
-are specified, the former takes precendence.
+are specified, the former takes precedence.
 The xDS client inside of gRPC parses the bootstrap configuration specified by
 one of the above means when it is created to configure itself.

@@ -48,7 +48,7 @@ name, will return both the resolved addresses and the service config.
 The name resolver returns the service config to the gRPC client in JSON form.
 Individual resolver implementations determine where and in what format the
-service config is stored. If the resolver implemention obtains the
+service config is stored. If the resolver implementation obtains the
 service config in protobuf form, it must convert it to JSON.
 Alternatively, a resolver implementation may obtain the service config
 already in JSON form, in which case it may return it directly. Or it

@@ -640,17 +640,17 @@ There are four sub-tests:
 1. `app_timeout_exceeded`
 1. Test client configured to send UnaryCall RPCs with a 1s application
 timeout, and metadata of `rpc-behavior: sleep-2`.
-1. Test driver asserts client recieves ~100% status `DEADLINE_EXCEEDED`.
+1. Test driver asserts client receives ~100% status `DEADLINE_EXCEEDED`.
 1. `timeout_not_exceeded`
 1. Test client configured to send UnaryCall RPCs with the default
 application timeout (20 seconds), and no metadata.
-1. Test driver asserts client recieves ~100% status `OK`.
+1. Test driver asserts client receives ~100% status `OK`.
 1. `timeout_exceeded` (executed with the below test case)
 1. `timeout_different_route`
 1. Test client configured to send UnaryCall RPCs and EmptyCall RPCs with
 the default application timeout (20 seconds), and metadata of
 `rpc-behavior: sleep-4`.
-1. Test driver asserts client recieves ~100% status `OK` for EmptyCall
+1. Test driver asserts client receives ~100% status `OK` for EmptyCall
 and ~100% status `DEADLINE_EXCEEDED` for UnaryCall.
 ### api_listener

@@ -10,7 +10,7 @@ The client takes two command-line arguments -
 The server takes three command-line arguments -
 * port - Port on which the Hello World service is run. Defaults to 50051.
-* mantenance_port - If secure mode is used (see below), the [Admin](https://github.com/grpc/proposal/blob/master/A38-admin-interface-api.md) service is exposed on this port. If secure mode is not used, `maintenance_port` is unused, and the Admin service is just exposed on `port`. Defaults to 50052.
+* maintenance_port - If secure mode is used (see below), the [Admin](https://github.com/grpc/proposal/blob/master/A38-admin-interface-api.md) service is exposed on this port. If secure mode is not used, `maintenance_port` is unused, and the Admin service is just exposed on `port`. Defaults to 50052.
 * secure - Bool value, defaults to true. When this is set, [XdsServerCredentials](https://github.com/grpc/proposal/blob/master/A29-xds-tls-security.md) will be used with a fallback on `InsecureServerCredentials`. If unset, `InsecureServerCredentials` will be used.
 ## Running the example

@@ -103,7 +103,7 @@ GRPCAPI grpc_call_credentials* grpc_sts_credentials_create(
 /** Context that can be used by metadata credentials plugin in order to create
 auth related metadata. */
 typedef struct {
-/** The fully qualifed service url. */
+/** The fully qualified service url. */
 const char* service_url;
 /** The method name of the RPC being called (not fully qualified).

@@ -23,7 +23,7 @@ thresholds. To enable this feature, the memory allocation of read/write buffers
 within an EventEngine must be acquired in the form of Slices from
 SliceAllocators. This is covered more fully in the gRFC and code.
-### Documentating expectations around callback execution
+### Documenting expectations around callback execution
 Some callbacks may be expensive to run. EventEngines should decide on and
 document whether callback execution might block polling operations. This way,
 application developers can plan accordingly (e.g., run their expensive callbacks

@@ -24,7 +24,7 @@ namespace internal {
 // with `SliceCast`. Both ways need to be declared (i.e. if
 // SliceCastable<A,B> exists, you should declare
 // SliceCastable<B,A> too).
-// The type has no members, it's just the existance of the specialization that
+// The type has no members, it's just the existence of the specialization that
 // unlocks SliceCast usage for a type pair.
 template <typename Result, typename T>
 struct SliceCastable;

@@ -82,7 +82,7 @@ absl::StatusOr<std::shared_ptr<CrlProvider>> CreateDirectoryReloaderCrlProvider(
 } // namespace experimental
 } // namespace grpc_core
-// TODO(gtcooke94) - Mark with api macro when all wrapped langauges support C++
+// TODO(gtcooke94) - Mark with api macro when all wrapped languages support C++
 // in core APIs
 /**
 * EXPERIMENTAL API - Subject to change

@@ -259,7 +259,7 @@
 issued by the tcp_write(). By default, this is set to 4. */
 #define GRPC_ARG_TCP_TX_ZEROCOPY_MAX_SIMULT_SENDS \
 "grpc.experimental.tcp_tx_zerocopy_max_simultaneous_sends"
-/* Overrides the TCP socket recieve buffer size, SO_RCVBUF. */
+/* Overrides the TCP socket receive buffer size, SO_RCVBUF. */
 #define GRPC_ARG_TCP_RECEIVE_BUFFER_SIZE "grpc.tcp_receive_buffer_size"
 /* Timeout in milliseconds to use for calls to the grpclb load balancer.
 If 0 or unset, the balancer calls will have no deadline. */

@@ -33,7 +33,7 @@ namespace experimental {
 /// Records call metrics for the purpose of load balancing.
 /// During an RPC, call \a ServerContext::ExperimentalGetCallMetricRecorder()
-/// method to retrive the recorder for the current call.
+/// method to retrieve the recorder for the current call.
 class CallMetricRecorder {
 public:
 virtual ~CallMetricRecorder() = default;

@@ -247,7 +247,7 @@ class MetadataCredentialsPlugin {
 /// Type of credentials this plugin is implementing.
 virtual const char* GetType() const { return ""; }
-/// Gets the auth metatada produced by this plugin.
+/// Gets the auth metadata produced by this plugin.
 /// The fully qualified method name is:
 /// service_url + "/" + method_name.
 /// The channel_auth_context contains (among other things), the identity of
@@ -295,7 +295,7 @@ grpc::Status StsCredentialsOptionsFromJson(const std::string& json_string,
 /// Creates STS credentials options from the $STS_CREDENTIALS environment
 /// variable. This environment variable points to the path of a JSON file
-/// comforming to the schema described above.
+/// conforming to the schema described above.
 grpc::Status StsCredentialsOptionsFromEnv(StsCredentialsOptions* options);
 std::shared_ptr<CallCredentials> StsCredentials(

@@ -297,7 +297,7 @@ class ServerBuilder {
 // Creates a passive listener for Server Endpoint injection.
 ///
-/// \a PasiveListener lets applications provide pre-established connections
+/// \a PassiveListener lets applications provide pre-established connections
 /// to gRPC Servers. The server will behave as if it accepted the connection
 /// itself on its own listening addresses.
 ///

@@ -58,7 +58,7 @@ class GrpcByteBufferPeer;
 /// A sequence of bytes.
 class ByteBuffer final {
 public:
-/// Constuct an empty buffer.
+/// Construct an empty buffer.
 ByteBuffer() : buffer_(nullptr) {}
 /// Construct buffer from \a slices, of which there are \a nslices.

@@ -35,7 +35,7 @@ namespace grpc {
 // Using grpc::string and grpc::to_string is discouraged in favor of
 // std::string and std::to_string. This is only for legacy code using
-// them explictly.
+// them explicitly.
 using std::string; // deprecated // NOLINT(misc-unused-using-decls)
 using std::to_string; // deprecated // NOLINT(misc-unused-using-decls)

@@ -133,7 +133,7 @@ class InterceptorBatchMethods {
 /// means that the payload would be available in the serialized form only
 /// unless an interceptor replaces the payload with ModifySendMessage.
-/// Returns a modifable ByteBuffer holding the serialized form of the message
+/// Returns a modifiable ByteBuffer holding the serialized form of the message
 /// that is going to be sent. Valid for PRE_SEND_MESSAGE interceptions.
 /// A return value of nullptr indicates that this ByteBuffer is not valid.
 virtual ByteBuffer* GetSerializedSendMessage() = 0;

@@ -1407,7 +1407,7 @@ EXT_LINKS_IN_WINDOW = NO
 FORMULA_FONTSIZE = 10
-# Use the FORMULA_TRANPARENT tag to determine whether or not the images
+# Use the FORMULA_TRANSPARENT tag to determine whether or not the images
 # generated for formulas are transparent PNGs. Transparent PNGs are not
 # supported properly for IE 6.0, but are supported on all modern browsers.
 #
@@ -1982,7 +1982,7 @@ INCLUDE_FILE_PATTERNS =
 # recursively expanded use the := operator instead of the = operator.
 # This tag requires that the tag ENABLE_PREPROCESSING is set to YES.
-PREDEFINED = GRPC_FINAL= GRPC_OVERIDE=
+PREDEFINED = GRPC_FINAL= GRPC_OVERRIDE=
 # If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this
 # tag can be used to specify a list of macro names that should be expanded. The

@@ -1438,7 +1438,7 @@ EXT_LINKS_IN_WINDOW = NO
 FORMULA_FONTSIZE = 10
-# Use the FORMULA_TRANPARENT tag to determine whether or not the images
+# Use the FORMULA_TRANSPARENT tag to determine whether or not the images
 # generated for formulas are transparent PNGs. Transparent PNGs are not
 # supported properly for IE 6.0, but are supported on all modern browsers.
 #
@@ -2013,7 +2013,7 @@ INCLUDE_FILE_PATTERNS =
 # recursively expanded use the := operator instead of the = operator.
 # This tag requires that the tag ENABLE_PREPROCESSING is set to YES.
-PREDEFINED = GRPC_FINAL= GRPC_OVERIDE=
+PREDEFINED = GRPC_FINAL= GRPC_OVERRIDE=
 # If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this
 # tag can be used to specify a list of macro names that should be expanded. The

@@ -114,7 +114,7 @@ def grpc_run_tests_harness_test(name, args = [], data = [], size = "medium", tim
 timeout: The test timeout.
 tags: The tags for the test.
 exec_compatible_with: A list of constraint values that must be
-satisifed for the platform.
+satisfied for the platform.
 flaky: Whether this test is flaky.
 docker_image_version: The docker .current_version file to use for docker containerization.
 use_login_shell: If True, the run_tests.py command will run under a login shell.
@@ -163,7 +163,7 @@ def grpc_run_bazel_distribtest_test(name, args = [], data = [], size = "medium",
 timeout: The test timeout.
 tags: The tags for the test.
 exec_compatible_with: A list of constraint values that must be
-satisifed for the platform.
+satisfied for the platform.
 flaky: Whether this test is flaky.
 docker_image_version: The docker .current_version file to use for docker containerization.
 """
@@ -193,7 +193,7 @@ def grpc_run_cpp_distribtest_test(name, args = [], data = [], size = "medium", t
 timeout: The test timeout.
 tags: The tags for the test.
 exec_compatible_with: A list of constraint values that must be
-satisifed for the platform.
+satisfied for the platform.
 flaky: Whether this test is flaky.
 docker_image_version: The docker .current_version file to use for docker containerization.
 """
@@ -225,7 +225,7 @@ def grpc_run_simple_command_test(name, args = [], data = [], size = "medium", ti
 timeout: The test timeout.
 tags: The tags for the test.
 exec_compatible_with: A list of constraint values that must be
-satisifed for the platform.
+satisfied for the platform.
 flaky: Whether this test is flaky.
 docker_image_version: The docker .current_version file to use for docker containerization.
 """
@@ -260,10 +260,10 @@ def grpc_build_artifact_task(name, timeout = None, artifact_deps = [], tags = []
 artifact_deps: List of dependencies on artifacts built by another grpc_build_artifact_task.
 tags: The tags for the target.
 exec_compatible_with: A list of constraint values that must be
-satisifed for the platform.
+satisfied for the platform.
 flaky: Whether this artifact build is flaky.
 docker_image_version: The docker .current_version file to use for docker containerization.
-build_script: The script that builds the aritfacts.
+build_script: The script that builds the artifacts.
 """
 out_exitcode_file = str(name + "_exit_code")
@@ -284,7 +284,7 @@ def grpc_build_artifact_task(name, timeout = None, artifact_deps = [], tags = []
 cmd = "$(location //tools/bazelify_tests:grpc_build_artifact_task.sh) $(location //tools/bazelify_tests:grpc_repo_archive_with_submodules.tar.gz) $(location " + build_script + ") $(location " + out_exitcode_file + ") $(location " + out_build_log + ") $(location " + out_archive_name + ")"
-# for each artifact task we depends on, use the correponding tar.gz as extra src and pass its location as an extra cmdline arg.
+# for each artifact task we depends on, use the corresponding tar.gz as extra src and pass its location as an extra cmdline arg.
 for dep in artifact_deps:
 dep_archive_name = str(dep + ".tar.gz")
 cmd = cmd + " $(location " + dep_archive_name + ")"
@@ -324,7 +324,7 @@ def grpc_run_distribtest_test(name, artifact_deps = [], size = "medium", timeout
 timeout: The test timeout.
 tags: The tags for the test.
 exec_compatible_with: A list of constraint values that must be
-satisifed for the platform.
+satisfied for the platform.
 flaky: Whether this test is flaky.
 docker_image_version: The docker .current_version file to use for docker containerization.
 build_script: The script that runs the test.
@@ -341,7 +341,7 @@ def grpc_run_distribtest_test(name, artifact_deps = [], size = "medium", timeout
 "$(location " + build_script + ")",
 ]
-# for each artifact task we depends on, use the correponding tar.gz as extra data item and pass its location as an extra arg.
+# for each artifact task we depends on, use the corresponding tar.gz as extra data item and pass its location as an extra arg.
 for dep in artifact_deps:
 dep_archive_name = str(dep + ".tar.gz")
 args.append("$(location " + dep_archive_name + ")")

@@ -50,7 +50,7 @@ then
 exit 1
 fi
-GRPC_UNCOMMITED_PATCH_CHECKSUM_FROM_STABLE_STATUS=$(grep ^STABLE_GRPC_UNCOMMITED_PATCH_CHECKSUM bazel-out/stable-status.txt | cut -d' ' -f2)
+GRPC_UNCOMMITTED_PATCH_CHECKSUM_FROM_STABLE_STATUS=$(grep ^STABLE_GRPC_UNCOMMITTED_PATCH_CHECKSUM bazel-out/stable-status.txt | cut -d' ' -f2)
 GRPC_GIT_WORKSPACE_DIRTY_FROM_STABLE_STATUS=$(grep ^STABLE_GRPC_GIT_WORKSPACE_DIRTY bazel-out/stable-status.txt | cut -d' ' -f2)
 pushd ${ORIGINAL_BAZEL_WORKSPACE_ROOT} >/dev/null
@@ -65,11 +65,11 @@ fi
 mkdir -p ${ARCHIVES_DIR}/grpc
 git archive --format="${ARCHIVE_FORMAT}" HEAD >"${ARCHIVES_DIR}/grpc/$(git rev-parse HEAD).${ARCHIVE_FORMAT}"
-if [ "${GRPC_UNCOMMITED_PATCH_CHECKSUM_FROM_STABLE_STATUS}" != "" ]
+if [ "${GRPC_UNCOMMITTED_PATCH_CHECKSUM_FROM_STABLE_STATUS}" != "" ]
 then
-git diff HEAD >"${ARCHIVES_DIR}/grpc/grpc_uncommited_${GRPC_UNCOMMITED_PATCH_CHECKSUM_FROM_STABLE_STATUS}.patch"
+git diff HEAD >"${ARCHIVES_DIR}/grpc/grpc_uncommited_${GRPC_UNCOMMITTED_PATCH_CHECKSUM_FROM_STABLE_STATUS}.patch"
 # check that the actual checksum of the patch file is what we expect it to be
-echo "${GRPC_UNCOMMITED_PATCH_CHECKSUM_FROM_STABLE_STATUS} ${ARCHIVES_DIR}/grpc/grpc_uncommited_${GRPC_UNCOMMITED_PATCH_CHECKSUM_FROM_STABLE_STATUS}.patch" | sha256sum --quiet --check
+echo "${GRPC_UNCOMMITTED_PATCH_CHECKSUM_FROM_STABLE_STATUS} ${ARCHIVES_DIR}/grpc/grpc_uncommited_${GRPC_UNCOMMITTED_PATCH_CHECKSUM_FROM_STABLE_STATUS}.patch" | sha256sum --quiet --check
 fi
 # produce archive for each submodule
@@ -82,10 +82,10 @@ mkdir grpc
 tar -xopf "${ARCHIVES_DIR}/grpc/${GRPC_GIT_COMMIT_FROM_STABLE_STATUS}.${ARCHIVE_FORMAT}" -C grpc
 # apply the patch
-if [ "${GRPC_UNCOMMITED_PATCH_CHECKSUM_FROM_STABLE_STATUS}" != "" ]
+if [ "${GRPC_UNCOMMITTED_PATCH_CHECKSUM_FROM_STABLE_STATUS}" != "" ]
 then
 pushd grpc >/dev/null
-patch --quiet -p1 <"${ARCHIVES_DIR}/grpc/grpc_uncommited_${GRPC_UNCOMMITED_PATCH_CHECKSUM_FROM_STABLE_STATUS}.patch"
+patch --quiet -p1 <"${ARCHIVES_DIR}/grpc/grpc_uncommited_${GRPC_UNCOMMITTED_PATCH_CHECKSUM_FROM_STABLE_STATUS}.patch"
 popd >/dev/null
 fi

@@ -72,7 +72,7 @@ grpc_run_tests_harness_test(
 ],
 docker_image_version = "tools/dockerfile/test/ruby_debian11_x64.current_version",
 prepare_script = ":prepare_ruby.sh",
-use_login_shell = True, # ruby's docker image uses RVM which wierdly requires login shell
+use_login_shell = True, # ruby's docker image uses RVM which weirdly requires login shell
 )
 grpc_run_tests_harness_test(
@@ -83,7 +83,7 @@ grpc_run_tests_harness_test(
 ],
 docker_image_version = "tools/dockerfile/test/ruby_debian11_x64.current_version",
 prepare_script = ":prepare_ruby.sh",
-use_login_shell = True, # ruby's docker image uses RVM which wierdly requires login shell
+use_login_shell = True, # ruby's docker image uses RVM which weirdly requires login shell
 )
 # PHP

@@ -28,7 +28,7 @@ then
 echo "STABLE_GRPC_GIT_WORKSPACE_DIRTY false"
 else
 echo "STABLE_GRPC_GIT_WORKSPACE_DIRTY true"
-echo "STABLE_GRPC_UNCOMMITED_PATCH_CHECKSUM $(git diff HEAD | sha256sum | cut -f1 -d' ')"
+echo "STABLE_GRPC_UNCOMMITTED_PATCH_CHECKSUM $(git diff HEAD | sha256sum | cut -f1 -d' ')"
 fi
 # Since only --workspace_status_command is allowed by bazel, also include

@@ -37,9 +37,9 @@ def transitive_deps(lib_map, node):
 if next_node:
 recursive_helper(next_node)
 else:
-# For some deps, the corrensponding library entry doesn't exist,
+# For some deps, the corresponding library entry doesn't exist,
 # but we still want to preserve the dependency so that the build
-# system can provide custom handling for that depdendency.
+# system can provide custom handling for that dependency.
 result.append(dep)
 if node is not start:
 result.insert(0, node["name"])

@@ -148,7 +148,7 @@ SymSet AllSyms() {
 return syms;
 }
-// What whould we do after reading a set of bits?
+// What would we do after reading a set of bits?
 struct ReadActions {
 // Emit these symbols
 std::vector<int> emit;

@@ -23,7 +23,7 @@ pushd third_party/protobuf
 version1=$(git describe --tags | cut -f 1 -d'-')
 v1=${version1:1}
 # Protobuf has recently changed the versioning of the release branches/tags
-# and the same release commit can be tagged with mutliple tag names
+# and the same release commit can be tagged with multiple tag names
 # (e.g. v3.21.12 is also tagged as v21.12), which ultimately confuses
 # the output of "git describe --tags" and makes it non-deterministic.
 # The hack below converts the version number to always become 3.x.y

@@ -488,7 +488,7 @@ if args.whats_left:
 )
-# Keeps track of all possible sets of dependencies that could satify the
+# Keeps track of all possible sets of dependencies that could satisfy the
 # problem. (models the list monad in Haskell!)
 class Choices:
 def __init__(self, library, substitutions):

@@ -133,7 +133,7 @@ class BuildExt(build_ext.build_ext):
 # behavior in gcc and clang. The clang doesn't take --stdc++11
 # flags but gcc does. Since the setuptools of Python only support
 # all C or all C++ compilation, the mix of C and C++ will crash.
-# *By default*, macOS and FreBSD use clang and Linux use gcc
+# *By default*, macOS and FreeBSD use clang and Linux use gcc
 #
 # If we are not using a permissive compiler that's OK with being
 # passed wrong std flags, swap out compile function by adding a filter

@@ -56,7 +56,7 @@ EXCLUDE_PROTO_PACKAGES_LIST = tuple(
 )
 )
-# Compute the pathes
+# Compute the paths
 WORK_DIR = os.path.dirname(os.path.abspath(__file__))
 GRPC_ROOT = os.path.abspath(os.path.join(WORK_DIR, "..", "..", "..", ".."))
 ENVOY_API_PROTO_ROOT = os.path.join(GRPC_ROOT, "third_party", "envoy-api")

@@ -34,7 +34,7 @@ cd "$(dirname "$0")/../../.."
 # use the default docker image used for bazel builds
 export DOCKERFILE_DIR=tools/dockerfile/test/bazel
-# Bazel RBE uses application default credentials from localhost to authenticate with RBE servers. Use a trick to make the credentails accessible from inside the docker container."
+# Bazel RBE uses application default credentials from localhost to authenticate with RBE servers. Use a trick to make the credentials accessible from inside the docker container."
 APPLICATION_DEFAULT_CREDENTIALS_DIR="$HOME/.config/gcloud"
 export DOCKER_EXTRA_ARGS="-v=${APPLICATION_DEFAULT_CREDENTIALS_DIR}:/application_default_credentials:ro -e=GOOGLE_APPLICATION_CREDENTIALS=/application_default_credentials/application_default_credentials.json"

@@ -22,7 +22,7 @@ cd "$(dirname "$0")/../../.."
 # To use the correct docker image for your experiments,
 # note that every invocation of run_tests.py with "--use_docker"
 # prints the docker image used as a debug message at the end of the run.
-# This is expecially important when --compiler/--arch params are
+# This is especially important when --compiler/--arch params are
 # use, since they usually influence with docker image will be used
 # by run_tests.py
 export DOCKERFILE_DIR=tools/dockerfile/test/cxx_debian11_x64

@@ -22,7 +22,7 @@ cd "$(dirname "$0")/../../.."
 # To use the correct docker image for your experiments,
 # note that every invocation of run_tests.py with "--use_docker"
 # prints the docker image used as a debug message at the end of the run.
-# This is expecially important when --compiler/--arch params are
+# This is especially important when --compiler/--arch params are
 # use, since they usually influence with docker image will be used
 # by run_tests.py
 export DOCKERFILE_DIR=tools/dockerfile/test/cxx_debian11_x64

@@ -22,7 +22,7 @@ cd "$(dirname "$0")/../../.."
 # To use the correct docker image for your experiments,
 # note that every invocation of run_tests.py with "--use_docker"
 # prints the docker image used as a debug message at the end of the run.
-# This is expecially important when --compiler/--arch params are
+# This is especially important when --compiler/--arch params are
 # use, since they usually influence with docker image will be used
 # by run_tests.py
 export DOCKERFILE_DIR=tools/dockerfile/test/cxx_debian11_x64

@@ -37,7 +37,7 @@ cd -
 # $ gcloud auth login
 # Various check that the environment is setup correctly.
-# The enviroment checks are skipped when running as a sanity check on CI.
+# The environment checks are skipped when running as a sanity check on CI.
 if [ "${CHECK_MODE}" == "" ]
 then
 # Check that docker is installed and sudoless docker works.
@@ -178,7 +178,7 @@ do
 then
 echo "Running 'docker build' for ${DOCKER_IMAGE_NAME}"
 echo "=========="
-# Buliding a docker image with two tags;
+# Building a docker image with two tags;
 # - one for image identification based on Dockerfile hash
 # - one to exclude it from the GCP Vulnerability Scanner
 docker build \

@@ -1737,7 +1737,7 @@ EXT_LINKS_IN_WINDOW = NO
 FORMULA_FONTSIZE = 10
-# Use the FORMULA_TRANPARENT tag to determine whether or not the images
+# Use the FORMULA_TRANSPARENT tag to determine whether or not the images
 # generated for formulas are transparent PNGs. Transparent PNGs are not
 # supported properly for IE 6.0, but are supported on all modern browsers.
 #
@@ -2312,7 +2312,7 @@ INCLUDE_FILE_PATTERNS =
 # recursively expanded use the := operator instead of the = operator.
 # This tag requires that the tag ENABLE_PREPROCESSING is set to YES.
-PREDEFINED = GRPC_FINAL= GRPC_OVERIDE=
+PREDEFINED = GRPC_FINAL= GRPC_OVERRIDE=
 # If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this
 # tag can be used to specify a list of macro names that should be expanded. The

@@ -3744,7 +3744,7 @@ EXT_LINKS_IN_WINDOW = NO
 FORMULA_FONTSIZE = 10
-# Use the FORMULA_TRANPARENT tag to determine whether or not the images
+# Use the FORMULA_TRANSPARENT tag to determine whether or not the images
 # generated for formulas are transparent PNGs. Transparent PNGs are not
 # supported properly for IE 6.0, but are supported on all modern browsers.
 #
@@ -4319,7 +4319,7 @@ INCLUDE_FILE_PATTERNS =
 # recursively expanded use the := operator instead of the = operator.
 # This tag requires that the tag ENABLE_PREPROCESSING is set to YES.
-PREDEFINED = GRPC_FINAL= GRPC_OVERIDE=
+PREDEFINED = GRPC_FINAL= GRPC_OVERRIDE=
 # If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this
 # tag can be used to specify a list of macro names that should be expanded. The

@@ -1535,7 +1535,7 @@ EXT_LINKS_IN_WINDOW = NO
 FORMULA_FONTSIZE = 10
-# Use the FORMULA_TRANPARENT tag to determine whether or not the images
+# Use the FORMULA_TRANSPARENT tag to determine whether or not the images
 # generated for formulas are transparent PNGs. Transparent PNGs are not
 # supported properly for IE 6.0, but are supported on all modern browsers.
 #
@@ -2110,7 +2110,7 @@ INCLUDE_FILE_PATTERNS =
 # recursively expanded use the := operator instead of the = operator.
 # This tag requires that the tag ENABLE_PREPROCESSING is set to YES.
-PREDEFINED = GRPC_FINAL= GRPC_OVERIDE=
+PREDEFINED = GRPC_FINAL= GRPC_OVERRIDE=
 # If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this
 # tag can be used to specify a list of macro names that should be expanded. The

@@ -3460,7 +3460,7 @@ EXT_LINKS_IN_WINDOW = NO
 FORMULA_FONTSIZE = 10
-# Use the FORMULA_TRANPARENT tag to determine whether or not the images
+# Use the FORMULA_TRANSPARENT tag to determine whether or not the images
 # generated for formulas are transparent PNGs. Transparent PNGs are not
 # supported properly for IE 6.0, but are supported on all modern browsers.
 #
@@ -4035,7 +4035,7 @@ INCLUDE_FILE_PATTERNS =
 # recursively expanded use the := operator instead of the = operator.
 # This tag requires that the tag ENABLE_PREPROCESSING is set to YES.
-PREDEFINED = GRPC_FINAL= GRPC_OVERIDE=
+PREDEFINED = GRPC_FINAL= GRPC_OVERRIDE=
 # If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this
 # tag can be used to specify a list of macro names that should be expanded. The

@@ -1486,7 +1486,7 @@ EXT_LINKS_IN_WINDOW = NO
 FORMULA_FONTSIZE = 10
-# Use the FORMULA_TRANPARENT tag to determine whether or not the images
+# Use the FORMULA_TRANSPARENT tag to determine whether or not the images
 # generated for formulas are transparent PNGs. Transparent PNGs are not
 # supported properly for IE 6.0, but are supported on all modern browsers.
 #
@@ -2061,7 +2061,7 @@ INCLUDE_FILE_PATTERNS =
 # recursively expanded use the := operator instead of the = operator.
 # This tag requires that the tag ENABLE_PREPROCESSING is set to YES.
-PREDEFINED = GRPC_FINAL= GRPC_OVERIDE=
+PREDEFINED = GRPC_FINAL= GRPC_OVERRIDE=
 # If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this
 # tag can be used to specify a list of macro names that should be expanded. The

@@ -1513,7 +1513,7 @@ EXT_LINKS_IN_WINDOW = NO
 FORMULA_FONTSIZE = 10
-# Use the FORMULA_TRANPARENT tag to determine whether or not the images
+# Use the FORMULA_TRANSPARENT tag to determine whether or not the images
 # generated for formulas are transparent PNGs. Transparent PNGs are not
 # supported properly for IE 6.0, but are supported on all modern browsers.
 #
@@ -2088,7 +2088,7 @@ INCLUDE_FILE_PATTERNS =
 # recursively expanded use the := operator instead of the = operator.
 # This tag requires that the tag ENABLE_PREPROCESSING is set to YES.
-PREDEFINED = GRPC_FINAL= GRPC_OVERIDE=
+PREDEFINED = GRPC_FINAL= GRPC_OVERRIDE=
 # If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this
 # tag can be used to specify a list of macro names that should be expanded. The

@@ -1473,7 +1473,7 @@ EXT_LINKS_IN_WINDOW = NO
 FORMULA_FONTSIZE = 10
-# Use the FORMULA_TRANPARENT tag to determine whether or not the images
+# Use the FORMULA_TRANSPARENT tag to determine whether or not the images
 # generated for formulas are transparent PNGs. Transparent PNGs are not
 # supported properly for IE 6.0, but are supported on all modern browsers.
 #
@@ -2048,7 +2048,7 @@ INCLUDE_FILE_PATTERNS =
 # recursively expanded use the := operator instead of the = operator.
 # This tag requires that the tag ENABLE_PREPROCESSING is set to YES.
-PREDEFINED = GRPC_FINAL= GRPC_OVERIDE=
+PREDEFINED = GRPC_FINAL= GRPC_OVERRIDE=
 # If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then this
 # tag can be used to specify a list of macro names that should be expanded. The

@@ -32,7 +32,7 @@ then
 # running on PR, generate size diff
 DIFF_BASE="origin/${KOKORO_GITHUB_PULL_REQUEST_TARGET_BRANCH}"
 else
-# running as continous build, only generate numbers for the current revision
+# running as continuous build, only generate numbers for the current revision
 DIFF_BASE=""
 fi

@@ -41,7 +41,7 @@ ruby --version
 # log gem versions for easier debugging if things go wrong
 gem list || true
-# avoid polluting the global gem diretory
+# avoid polluting the global gem directory
 # by configuring "bundle install" to install all the gems
 # into a project-local directory
 export BUNDLE_PATH=bundle_local_gems

@@ -321,7 +321,7 @@ class CXXLanguage(Language):
 )
 # Scenario was added in https://github.com/grpc/grpc/pull/12987, but its purpose is unclear
-# (beyond excercising some params that other scenarios don't)
+# (beyond exercising some params that other scenarios don't)
 yield _ping_pong_scenario(
 "cpp_protobuf_async_unary_75Kqps_600channel_60Krpcs_300Breq_50Bresp",
 rpc_type="UNARY",

@@ -310,7 +310,7 @@ if __name__ == "__main__":
 "test_target": action["id"]["targetId"],
 "test_class_name": "N/A",
 "test_case": "N/A",
-"result": "UNPARSEABLE",
+"result": "UNPARSABLE",
 "timestamp": "N/A",
 },
 }

@@ -554,7 +554,7 @@ def wait_until_dns_server_is_up(dns_server_ip):
 time.sleep(0.1)
 raise Exception(
 "Failed to reach DNS server over TCP and/or UDP. "
-"Exitting without running tests."
+"Exiting without running tests."
 )

@@ -528,7 +528,7 @@ profile_output_files = []
 # Note the base names of perf text reports are used when creating and processing
 # perf data. The scenario name uniqifies the output name in the final
 # perf reports directory.
-# Alos, the perf profiles need to be fetched and processed after each scenario
+# Also, the perf profiles need to be fetched and processed after each scenario
 # in order to avoid clobbering the output files.
 def run_collect_perf_profile_jobs(
 hosts_and_base_names, scenario_name, flame_graph_reports

@@ -320,7 +320,7 @@ class CLanguage(object):
 self._cmake_architecture_windows = (
 "x64" if self.args.arch == "x64" else "Win32"
 )
-# when builing with Ninja, the VS common tools need to be activated first
+# when building with Ninja, the VS common tools need to be activated first
 self._activate_vs_tools_windows = activate_vs_tools
 # "x64_x86" means create 32bit binaries, but use 64bit toolkit to secure more memory for the build
 self._vs_tools_architecture_windows = (

@@ -123,7 +123,7 @@ def parse_test_cases(arg):
 test_cases = test_cases.union([arg])
 if not all([test_case in all_test_cases for test_case in test_cases]):
 raise Exception("Failed to parse test cases %s" % arg)
-# Perserve order.
+# Preserve order.
 return [x for x in all_test_cases if x in test_cases]
@@ -2179,7 +2179,7 @@ def test_circuit_breaking(
 """
 Since backend service circuit_breakers configuration cannot be unset,
 which causes trouble for restoring validate_for_proxy flag in target
-proxy/global forwarding rule. This test uses dedicated backend sevices.
+proxy/global forwarding rule. This test uses dedicated backend services.
 The url_map and backend services undergoes the following state changes:
 Before test:
