# Overview of performance test suite, with steps for manual runs

For design of the tests, see https://grpc.io/docs/guides/benchmarking.html.

## Pre-reqs for running these manually

In general, the benchmark workers and driver build scripts expect `linux_performance_worker_init.sh` to have been run already.

## To run benchmarks locally

  • Use the `run_performance_tests.py` top-level runner script described below.

## On remote machines, to start the driver and workers manually

The `run_performance_tests.py` top-level runner script can also be used with remote machines, but for tasks such as profiling the server, it can be useful to run workers manually.

  1. You'll need a "driver" machine and separate "worker" machines. For example, you might use one GCE "driver" machine and three other GCE "worker" machines in the same zone.

  2. Connect to each worker machine and start up a benchmark worker with a "driver_port".

### Commands to start workers in different languages

Running benchmark workers for C-core wrapped languages (C++, Python, C#, Node, Ruby):

  • These are simpler since they all live in the main grpc repo.

```shell
$ cd <grpc_repo_root>
$ tools/run_tests/performance/build_performance.sh
$ tools/run_tests/performance/run_worker_<language>.sh
```
Running benchmark workers for gRPC-Java:

```shell
$ cd <grpc-java-repo>
$ ./gradlew -PskipCodegen=true :grpc-benchmarks:installDist
$ benchmarks/build/install/grpc-benchmarks/bin/benchmark_worker --driver_port <driver_port>
```
Running benchmark workers for gRPC-Go:

```shell
$ cd <grpc-go-repo>/benchmark/worker && go install
$ # if profiling, it might be helpful to turn off inlining by building with "-gcflags=-l"
$ $GOPATH/bin/worker --driver_port <driver_port>
```

## Build the driver

  • Connect to the driver machine (if using a remote driver) and, from the grpc repo root, run:

```shell
$ tools/run_tests/performance/build_performance.sh
```

## Run the driver

  1. Get the scenario JSON for the scenario you want to run. Note that the scenario JSON configs are generated from `scenario_config.py`. The driver takes a list of these configs as a JSON string of the form `{scenario: <json_list_of_scenarios>}` in its `--scenarios_json` command argument. One quick way to get a valid JSON string to pass to the driver is to run `run_performance_tests.py` locally and copy the logged scenario JSON command argument.

  2. From the grpc repo root:

  • Set the `QPS_WORKERS` environment variable to a comma-separated list of worker machines. Note that the driver will start the "benchmark server" on the first entry in the list; the rest will be told to run as clients against the benchmark server.
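As a sketch of the capture step described in step 1 above (the `-l` flag and the exact log format are assumptions; check the runner's `--help` output and its actual log), one way to grab the logged scenario JSON argument is:

```shell
# Run the top-level runner locally and keep its log, then pull out the
# first --scenarios_json argument it logs. Flag names and log format
# are assumptions; verify against your checkout.
tools/run_tests/run_performance_tests.py -l c++ 2>&1 | tee perf_run.log
grep -o -- "--scenarios_json='[^']*'" perf_run.log | head -1
```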

Example of running the driver, e.g. when profiling a go benchmark server:

```shell
$ export QPS_WORKERS=<host1>:10000,<host2>:10000,<host3>:10000
$ bins/opt/qps_json_driver --scenarios_json='<scenario_json_scenario_config_string>'
```
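The `QPS_WORKERS` list above can also be assembled programmatically; a minimal sketch, assuming placeholder hostnames and port 10000 for every worker:

```shell
# Build QPS_WORKERS from a host list; the driver starts the benchmark
# server on the first host and runs the rest as clients.
HOSTS="10.0.0.1 10.0.0.2 10.0.0.3"   # placeholder hostnames
QPS_WORKERS=""
for h in $HOSTS; do
  QPS_WORKERS="${QPS_WORKERS:+$QPS_WORKERS,}$h:10000"
done
export QPS_WORKERS
echo "$QPS_WORKERS"   # 10.0.0.1:10000,10.0.0.2:10000,10.0.0.3:10000
```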

## Example profiling commands

While running the benchmark, a profiler can be attached to the server.

Example to count syscalls in a grpc-go server during a benchmark:

  • Connect to the server machine and run:

```shell
$ netstat -tulpn | grep <driver_port>  # to get the pid of the worker
$ perf stat -p <worker_pid> -e syscalls:sys_enter_write  # stop after the test completes
```
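The two commands above can be chained into one small script; a sketch, assuming the PID/program name appears in column 7 of `netstat -tulpn` output (verify on your system) and that the worker listens on port 10000:

```shell
DRIVER_PORT=10000   # placeholder; match the worker's --driver_port
# Pull the worker pid out of the "12345/worker" PID/program column.
WORKER_PID=$(netstat -tulpn 2>/dev/null | grep ":${DRIVER_PORT} " \
  | awk '{print $7}' | cut -d/ -f1 | head -1)
# Count write syscalls for 30 seconds (perf typically needs root).
sudo perf stat -p "$WORKER_PID" -e syscalls:sys_enter_write -- sleep 30
```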

Example memory profile of a grpc-go server, with go tool pprof:

  • After a run is done on the server, see its alloc profile with:

```shell
$ go tool pprof --text --alloc_space http://localhost:<pprof_port>/debug/heap
```