Currently, when running Python tests on Linux, we mount
/tmp/xdg-cache-home into the docker container that runs the tests,
in an attempt to prevent unnecessary downloads of pip packages
(the theory being that more downloads lead to increased flakiness).
The idea is that while there is a new docker container for each
test suite, the XDG cache persists per-VM.
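Roughly what that looks like in the docker invocation (a sketch only;
DOCKER_IMAGE and TEST_CMD are placeholders, not the actual contents of
the docker_run scripts):

    # Shared per-VM cache directory that outlives individual containers.
    mkdir -p /tmp/xdg-cache-home
    # Point pip's cache at the bind-mounted directory so downloaded
    # packages can be reused across test suites on the same VM.
    docker run \
      -e XDG_CACHE_HOME=/tmp/xdg-cache-home \
      -v /tmp/xdg-cache-home:/tmp/xdg-cache-home \
      "$DOCKER_IMAGE" \
      $TEST_CMD
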
This approach no longer seems to be useful:
* It turns out that the XDG cache doesn't work reliably when multiple
docker containers use it concurrently (a concurrent run can
see corrupt files). We use concurrent docker containers in our
multilang test suite to speed up execution, and we are currently
seeing flakes caused by this.
* Support for caching makes our docker_run scripts more complicated,
and we really don't want that.
* Since we migrated to Kokoro, the caching is of limited value anyway:
each run gets a fresh VM, so packages need to be downloaded
for every build regardless (and that actually seems to cause far
less flakiness than the problems with concurrent XDG caching).
-) Don't re-clone from GitHub. We already have the checked-out directory on the host, so just bind-mount it and copy it inside the docker container (see the first sketch below).
-) Let's properly set up our environment for asan (see the second sketch below).
-) Let's split the docker "run_jenkins" part into its own separate script.
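
A minimal sketch of the "bind instead of re-clone" idea; the mount
points, DOCKER_IMAGE and TEST_CMD are illustrative assumptions, not the
actual paths used by the scripts:

    # Expose the already-checked-out tree to the container read-only,
    # then copy it to a writable location inside the container and run
    # the tests from there, instead of cloning from GitHub again.
    docker run \
      -v "$(pwd)":/var/local/jenkins/grpc:ro \
      "$DOCKER_IMAGE" \
      bash -c "cp -r /var/local/jenkins/grpc /var/local/git/grpc && cd /var/local/git/grpc && $TEST_CMD"
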
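And a sketch of what setting up the environment for asan could look
like; the option values are illustrative defaults, not necessarily what
the scripts will end up using:

    # Pass the sanitizer configuration into the container explicitly,
    # rather than relying on whatever the image happens to have set.
    docker run \
      -e ASAN_OPTIONS=detect_leaks=1:color=always \
      -e LSAN_OPTIONS=report_objects=1 \
      "$DOCKER_IMAGE" \
      $TEST_CMD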