My tests take 2 minutes to run:
$ py.test
================================================= test session starts =================================================
platform linux2 -- Python 2.7.8 -- py-1.4.24 -- pytest-2.5.2
plugins: cov, xdist
collected 2249 items
«lots of file names»
====================================== 2242 passed, 7 skipped in 120.01 seconds =======================================
…so I thought I'd try the xdist plugin to run them in parallel. So, I did:
$ pip install pytest-xdist
$ py.test -n 2
================================================= test session starts =================================================
platform linux2 -- Python 2.7.8 -- py-1.4.24 -- pytest-2.5.2
plugins: cov, xdist
gw0 [2249] / gw1 [2249]
scheduling tests via LoadScheduling
================================================== in 2.65 seconds ===================================================
2 seconds would be a marvelous speedup… though I suspect no tests were actually run: some dots would show up, wouldn't they? However, if I do a "parallel" run with just a single process…
$ py.test -n 1
================================================= test session starts =================================================
platform linux2 -- Python 2.7.8 -- py-1.4.24 -- pytest-2.5.2
plugins: cov, xdist
gw0 [2249]
scheduling tests via LoadScheduling
....«lots and lots of dots»........
====================================== 2242 passed, 7 skipped in 122.27 seconds =======================================
…then the time is back to normal.
How can I make the xdist plugin actually run the tests?
UPDATE:
An answer to Bruno Oliveira's question:
$ py.test -n 4 -vv
============================= test session starts ==============================
platform linux2 -- Python 2.7.8 -- py-1.4.24 -- pytest-2.5.2 -- /home/liori/proj/.ve/bin/python2
plugins: cov, xdist
[gw0] linux2 Python 2.7.8 cwd: /home/liori/proj/src
[gw1] linux2 Python 2.7.8 cwd: /home/liori/proj/src
[gw2] linux2 Python 2.7.8 cwd: /home/liori/proj/src
[gw3] linux2 Python 2.7.8 cwd: /home/liori/proj/src
[gw0] Python 2.7.8 (default, Aug 23 2014, 21:00:50) -- [GCC 4.9.1]
[gw1] Python 2.7.8 (default, Aug 23 2014, 21:00:50) -- [GCC 4.9.1]
[gw2] Python 2.7.8 (default, Aug 23 2014, 21:00:50) -- [GCC 4.9.1]
[gw3] Python 2.7.8 (default, Aug 23 2014, 21:00:50) -- [GCC 4.9.1]
gw0 [2254] / gw1 [2254] / gw2 [2254] / gw3 [2254]
scheduling tests via LoadScheduling
=============================== in 4.63 seconds ===============================
I haven't used randomized values for parametrizing my tests, as suggested by Marek. However, his suggestion pushed me towards checking another hypothesis, which turned out to be true: xdist requires that the parametrizations for the tests are always generated in the same order.
In my specific case, I generated my parametrizations by iterating over a set of strings. However, the iteration order of a set depends on the specific values the strings hash to, and with randomized hashing those values can differ between processes. So, although I was always generating exactly the same parametrizations, each worker process could see them in a different order.
A simple test case that shows the problem:
import pytest

my_names = {'john', 'kate', 'alfred', 'paul', 'mary'}

@pytest.mark.parametrize('name', list(my_names), ids=list(my_names))
def test_is_name_short(name):
    assert len(name) < 7
Run it with PYTHONHASHSEED=random py.test -n 4 to make sure randomized hashing for strings is triggered.
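To see the underlying mechanism without involving py.test at all, here is a small sketch of my own (not part of the original post): it runs the same set literal in several child interpreters, each with a different hash seed, and collects the iteration orders that come out.

```python
import os
import subprocess
import sys

# The same set literal used in the test case above.
code = "print(list({'john', 'kate', 'alfred', 'paul', 'mary'}))"

orders = set()
for seed in range(10):
    # Give each child interpreter its own string hash seed.
    env = dict(os.environ, PYTHONHASHSEED=str(seed))
    out = subprocess.check_output([sys.executable, '-c', code], env=env)
    orders.add(out.strip())

# More than one distinct order shows that different processes can enumerate
# the same set differently, which is exactly what reorders the
# parametrizations between xdist workers.
print(len(orders) > 1)
```

With ten different seeds you will almost certainly see several distinct orderings, even though every child process builds exactly the same set.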
A simple workaround is to enforce a specific ordering on the tests, e.g. by sorting them by some parameter:
my_names = sorted(my_names)
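Putting that workaround into the test case from above gives something like this (my own restatement of the fix):

```python
import pytest

my_names = {'john', 'kate', 'alfred', 'paul', 'mary'}
# Sorting pins down a single, deterministic parametrization order,
# regardless of each worker process's hash seed.
my_names = sorted(my_names)

@pytest.mark.parametrize('name', my_names, ids=my_names)
def test_is_name_short(name):
    assert len(name) < 7
```

Since sorted() returns a list, the explicit list() calls are no longer needed, and every xdist worker now generates the parametrizations in the same order.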
I have submitted a suggestion to py.test's bug tracker to make xdist sort parametrizations before comparing them, to avoid this problem.