python, docker, subprocess, github-actions, pytest

run docker command within python subprocess within GitHub runner


I am running functional tests as CI in my GitHub runner. To properly execute my tests, I am building a docker image and running it (a slightly modified mysql container). Because I am building the container from a Dockerfile (not just pulling it from a registry), I don't use GitHub's service container capabilities (https://docs.github.com/en/actions/use-cases-and-examples/using-containerized-services/about-service-containers).

Instead, I have some dedicated steps in my GH workflow to build the container and run it, like this:

  - name: build docker container
    run: |
      cd tests/functional/olbp/mocked_data/
      docker build --no-cache -f my_service.Dockerfile -t my_service_test:latest .
  - name: run container
    run: docker run --name my_service -d -p 127.0.0.1:3306:3306 my_service_test --sql_mode="NO_ENGINE_SUBSTITUTION"
  - name: wait for MySQL to start
    run: |
      until mysqladmin ping -h 127.0.0.1 --silent; do
        echo 'Waiting for MySQL to start...'
        sleep 1
      done

Finally, I am running my functional tests (pytest) with Python:

  - name: run the functional tests
    run: python -m pytest tests/functional -vv -s --log-cli-level=DEBUG

This is working fine; my tests can interact with the mysql container as expected.

However, I am struggling to interact with Docker in this context. I would like to generate a mysqldump as part of my tests, so I wrote this inside my tests:

dck_exc_output = subprocess.run(
    [
        "docker",
        "exec",
        "my_service",
        "mysqldump",
        "--skip-triggers",
        "--skip-extended-insert",
        "--compact",
        "--no-create-info",
        "-uroot",
        "-psupersecret",
        "my_db",
        "my_table",
        ">",
        live_data_dump,
    ],
    capture_output=True,
    shell=True,
)

logging.debug(dck_exc_output)

It works fine when executed from my computer, but whenever I try to run it inside the GH runner (this docker command, or even a simple docker version), I end up with this output (here, the output of a simple docker version run via subprocess):

DEBUG    root:my_test.py:246 CompletedProcess(args=['docker', 'version'], returncode=0, stdout=b'', stderr=b'
Usage:  docker [OPTIONS] COMMAND

...
')

I have already confirmed that the subprocess and the GH runner commands use the same user, the same permissions, and the same environment variables.
I cannot understand why I am receiving this output. The docker executable seems to be found (I am receiving the help page, as if I had executed an invalid command), but it does not behave the way it does when I run the test locally.


Solution

  • You have an error in your Python code. You're attempting to use shell i/o redirection in your arguments to subprocess.run(), and you're combining shell=True with a list of arguments. On a POSIX system (such as the GitHub runner), when shell=True is set and the arguments are a list, only the first element is used as the command string; the remaining elements become additional arguments to the shell itself. Your call is effectively /bin/sh -c 'docker' 'exec' 'my_service' ..., so only a bare docker runs, which is exactly why you get the usage page back even for a simple docker version.

    If you want to use shell i/o redirection (like somecommand > somefile), you must pass a single string as the argument to subprocess.run() and set shell=True:

    subprocess.run(
        "docker exec my_service mysqldump --skip-triggers --skip-extended-insert "
        "--compact --no-create-info -uroot -psupersecret my_db my_table "
        f"> {live_data_dump}",
        shell=True,
    )
    

    If the command line is not parsed by a shell, things like > are not interpreted as redirection; they are passed literally to the command being executed.
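
    To see the list-plus-shell pitfall in isolation, here is a small self-contained example that is not tied to the question's setup (echo simply stands in for docker; the behaviour shown is the POSIX behaviour you get on the GitHub runner):

    import subprocess

    # With shell=True and a list, only the first element becomes the shell's
    # command string; "hello" is merely a positional parameter of the shell.
    print(subprocess.run(["echo", "hello"], shell=True, capture_output=True).stdout)
    # b'\n'  -- echo ran with no arguments

    # With shell=True and a single string, the whole command line is parsed.
    print(subprocess.run("echo hello", shell=True, capture_output=True).stdout)
    # b'hello\n'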
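
    Alternatively, you can avoid the shell entirely. Below is a minimal sketch under the question's assumptions (same container name and credentials, and a live_data_dump path variable defined elsewhere in the test): drop shell=True, keep the argument list without the >, and write the captured stdout to the file from Python. The check=True flag is an addition so that a failing mysqldump raises instead of silently producing an empty dump.

    import subprocess

    # Run mysqldump inside the container without a shell; capture its stdout
    # instead of relying on shell redirection.
    result = subprocess.run(
        [
            "docker", "exec", "my_service",
            "mysqldump",
            "--skip-triggers",
            "--skip-extended-insert",
            "--compact",
            "--no-create-info",
            "-uroot",
            "-psupersecret",
            "my_db",
            "my_table",
        ],
        capture_output=True,
        check=True,
    )

    # Write the dump (bytes) to the target file ourselves.
    with open(live_data_dump, "wb") as f:
        f.write(result.stdout)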