pytest, pytest-dependency, pytest-markers

Run dependent tests with pytest


With pytest, I'm setting dependencies between tests using the pytest-dependency library, and I also add markers to those tests. Here is a minimal complete example:

# test_test.py
import pytest

@pytest.mark.dependency()
def test_a():
    assert True

@pytest.mark.category
@pytest.mark.dependency(depends=['test_a'])
def test_b():
    assert True

with the pytest.ini file to register the marker:

; pytest.ini
[pytest]
markers =
    category: category of tests.

When I run the tests filtered by the marker, test_b is skipped, because it depends on test_a, which doesn't have the category marker and is therefore not selected:

user@pc → [~/Documents/test] $ pytest -vv -k category
============================================== test session starts ===============================================
platform darwin -- Python 3.9.8, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /usr/local/opt/python@3.9/bin/python3.9
cachedir: .pytest_cache
rootdir: /Users/saigre/Documents/test, configfile: pytest.ini
plugins: dependency-0.5.1
collected 2 items / 1 deselected / 1 selected

test_test.py::test_b SKIPPED (test_b depends on test_a)                                                    [100%]

======================================== 1 skipped, 1 deselected in 0.05s ========================================

Is there a way to force test_a to run because of the dependency? One solution would be to add the marker to the first test, but that would be complicated for the case I'm working on...
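
For reference, that workaround would just duplicate the marker onto the dependency (a minimal sketch of the idea, not something I want to maintain at scale):

# test_test.py -- workaround: put the marker on the dependency too
import pytest

@pytest.mark.category  # duplicated here only so that test_a stays selected
@pytest.mark.dependency()
def test_a():
    assert True

@pytest.mark.category
@pytest.mark.dependency(depends=['test_a'])
def test_b():
    assert True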

EDIT for @MrBean Bremen: I made an example of the dependency scheme. If I want to add a marker to a test, I would have to put that marker on every branch leading to it, and the "root" test would end up carrying many markers. It is not that it is complicated to do, but it is tedious.
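
To illustrate (a hypothetical sketch with made-up category names; every category whose branch passes through the root forces one more marker onto it):

# hypothetical dependency scheme: the root accumulates one marker per category
import pytest

@pytest.mark.cat_a
@pytest.mark.cat_b  # ...and one more for every further category branching off
@pytest.mark.dependency()
def test_root():
    assert True

@pytest.mark.cat_a
@pytest.mark.dependency(depends=['test_root'])
def test_branch_a():
    assert True

@pytest.mark.cat_b
@pytest.mark.dependency(depends=['test_root'])
def test_branch_b():
    assert True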


Solution

  • You can try this (I have implemented something similar):

    Put all your tests from the same category in a class that inherits from a base class containing test_a. Put each class in a separate file, and pass that specific file to pytest. This way, you can get rid of markers as well.

    Consider this pseudo-example:

    # tests/test_a.py #
    import pytest
    
    class TestA:
    
        @pytest.mark.dependency(name="TEST-A")
        def test_a(self):
            print("TEST-A was called")
            assert False  # fails on purpose, to show that dependants get skipped
    
    
    # tests/test_b.py #
    import pytest
    from tests.test_a import TestA
    
    class TestB(TestA):  # inherits test_a, so the dependency runs with this file

        def test_b1(self):  # no dependency: runs regardless of TEST-A
            print("TEST-B1 was called")
            assert True

        @pytest.mark.dependency(depends=["TEST-A"])
        def test_b2(self):  # skipped whenever TEST-A fails
            print("TEST-B2 was called")
            assert True
    

    Now, on the CLI, just run: $ pytest tests/test_b.py

    tests/test_b.py::TestB::test_a <- tests/test_a.py FAILED                      [ 33%]
    tests/test_b.py::TestB::test_b1 PASSED                                        [ 66%]
    tests/test_b.py::TestB::test_b2 SKIPPED (test_b2 depends on TEST-A)           [100%]
    
    ===================================== FAILURES ======================================
    ___________________________________ TestB.test_a ____________________________________
    
    self = <tests.test_b.TestB object at 0x7fc21886db20>
    
        @pytest.mark.dependency(name="TEST-A")
        def test_a(self):
            print("TEST-A was called")
    >       assert False
    E       assert False
    
    tests/test_a.py:8: AssertionError
    ------------------------------- Captured stdout call --------------------------------
    TEST-A was called
    ====================================== PASSES =======================================
    ___________________________________ TestB.test_b1 ___________________________________
    ------------------------------- Captured stdout call --------------------------------
    TEST-B1 was called
    ============================== short test summary info ==============================
    PASSED tests/test_b.py::TestB::test_b1
    SKIPPED [1] ...: test_b2 depends on TEST-A
    FAILED tests/test_b.py::TestB::test_a - assert False
    ====================== 1 failed, 1 passed, 1 skipped in 0.19s =======================
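
    Note how test_a is inherited into TestB, so it runs even though only tests/test_b.py was passed: it fails (the deliberate assert False), which makes test_b2 skip through the dependency, while the independent test_b1 still passes. One assumption in this sketch: for the cross-file import from tests.test_a import TestA to resolve, the tests directory has to be importable as a package, e.g. with a layout like

    tests/
    ├── __init__.py   # assumed here, so that `tests` imports as a package
    ├── test_a.py
    └── test_b.py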