I'm trying to get a better understanding of how tox and GitLab CI (with docker runners) would work together, as they seem to have a bit of overlap in what each does. I think I may be missing something on exactly the purpose of tox as well.

Here's tox's stated purpose:

tox is a generic virtualenv management and test command line tool you can use for:

  • checking that your package installs correctly with different Python versions and interpreters
  • running your tests in each of the environments, configuring your test tool of choice
  • acting as a frontend to Continuous Integration servers, greatly reducing boilerplate and merging CI and shell-based testing.

That last item is what I want out of it when using GitLab CI. But if I'm using docker runners, the virtual env stuff seems extra, and redundant. I assume I'm not the first person to notice this, and there is a recommended way to configure tox to be less redundant. I wasn't able to find information on that so far, though.

What's confusing me is that both GitLab CI and tox set up and configure test environments, and then execute several different runners. I would like to use GitLab CI for most or all of that, as it enables better UI integration and allows using multiple job runners. I could even use different Python docker images with different versions, instead of virtualenvs. That makes me wonder whether tox gives any benefit at all... Perhaps another way to frame the question: "If using docker containers, does it make sense to also use virtualenvs?" Is that a good idea, or an anti-pattern?
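To make the per-image idea concrete, here's a sketch of the kind of `.gitlab-ci.yml` I have in mind (hypothetical: the job names, the `test` extra, and the test command are my assumptions, not from any real project):

```yaml
# Hypothetical .gitlab-ci.yml: one job per Python version, using the
# official python images instead of tox-managed virtualenvs.
test-py36:
  image: python:3.6
  script:
    - pip install ".[test]"   # assumes a "test" extra providing pytest etc.
    - pytest

test-py37:
  image: python:3.7
  script:
    - pip install ".[test]"
    - pytest
```

Each job gets the isolation of its own container, which is what makes the extra virtualenv layer feel redundant.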

The examples I've found so far just run tox in a single job per job type (unit, lint, etc.). (I may have missed more complex examples; that's what I'm looking for.)

So I want to better understand how to get these two tools to work together, and whether they even should: does GitLab CI already do pretty much what tox tries to do, or is there a way to use the unique strengths of both without too much redundancy? It would be great to see examples of projects that use both tools well, and how/why. Any case studies illustrating the issues that come up would also help. Finally, does the answer differ for application vs. library development? (I'm currently focused on library development, but do both.)

2 Answers

Our team uses poetry, with its configuration in pyproject.toml, for dependency management, and the Dockerfile looks like:

FROM some image
# Copy only the dependency manifests first, so this layer stays cached
# until the dependencies change.
COPY pyproject.toml .
COPY poetry.lock .
# --no-root: the package source isn't copied yet, so install dependencies only
RUN poetry install --no-root --no-interaction --no-ansi --no-dev
COPY our_application_code /app/our_application_code

The resulting image is then used in GitLab CI for testing.

The tox environment is defined in pyproject.toml as:

[tool.tox]
legacy_tox_ini = """

[tox]
isolated_build = true
envlist = py36,py37

[testenv]
install_command = pip install --index-url <local maven url> {opts} {packages}
deps =
    pytest
    pytest-cov
commands =
    pytest --cov-config=.coveragerc --cov=our_application_code tests
"""

During installation, tox generates a series of virtual environments and installs the dependencies for each one. Running tox can be viewed as roughly equivalent to:

virtualenv .tox/my_env
source .tox/my_env/bin/activate
(my_env) pip install some dependencies
(my_env) .tox/my_env/prepare_something.sh
(my_env) pytest .tox/my_env/tests_dir

With this config we can run tests in different (Python) environments like so:

poetry run tox -e <python environment, e.g. py37> \
    -- <path to test file>:TestClassName.test_method_name

Sometimes code and tests that pass for one person simply don't work for others because of incompatible dependencies or improper environment settings. With tox we can set up a proper CI process on top of the automated build system of our choice.
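For reference, wiring the image and tox config above into GitLab CI can be as simple as the following sketch (the registry path and job layout are placeholders, not our actual config):

```yaml
# Sketch of a GitLab CI job that runs the tox envs defined above
# inside the pre-built poetry image (registry path is a placeholder).
test:
  image: registry.example.com/our-app:latest
  script:
    - poetry run tox -e py36,py37
```

GitLab provides the container and reporting; tox handles the per-interpreter environments inside it.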

  • Good examples and overview. I'm still wondering though whether it is a good idea to use virtualenvs inside docker containers, as this example shows.... Commented Nov 20, 2020 at 21:26
  • My understanding is that tox creates separate virtualenvs to reduce the chance of dependency conflicts; simpler CI integration with the different virtualenvs is a side benefit of that.
    – lennon310
    Commented Nov 20, 2020 at 22:41
  • Right, and that essentially means that you don't get the benefit of using a docker container built with the right Python version (which would mean no virtualenv was strictly necessary, because the container itself is as isolated as a virtualenv). Commented Nov 24, 2020 at 22:58

I think you're asking two questions:

  1. Why use tox (or virtualenvs) rather than containers, and if I use both, isn't this unnecessary duplication?
  2. How do I best integrate GitLab runners and tox?

For 1: unless you expect to run your local tests inside docker as well, tox makes it much easier to test your code locally. To keep your CI as close to your local tests as possible, you should therefore use tox in your CI too.

For 2: tox has a plugin system. I'm not sure whether anyone has written a plugin for GitLab runners, but there are plugins for AppVeyor and GitHub Actions (and possibly other CI services) that you could use as inspiration for writing one.
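Even without a dedicated plugin, GitLab's parallel:matrix keyword gives a reasonable mapping: one job definition fans out into one runner per tox environment. A sketch, assuming the env names match your tox envlist and that an official python image exists for each version:

```yaml
# Sketch: fan one GitLab CI job out over tox environments, pairing each
# env with a python image that has the matching interpreter.
test:
  parallel:
    matrix:
      - PY: ["3.6"]
        TOXENV: ["py36"]
      - PY: ["3.7"]
        TOXENV: ["py37"]
  image: python:${PY}
  script:
    - pip install tox
    - tox -e ${TOXENV}
```

Each matrix entry appears as a separate job in the pipeline UI, so you keep tox's environment definitions while getting GitLab's per-version reporting.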

  • I think you split it up nicely. Thanks for that :). I would like more detail on 1 though, if you can expand on the pros and cons, and talk about the duplication issue... Commented Dec 7, 2020 at 1:10
