I know there are plugins for performance tests and profiling for py.test, but is there a way to generate arbitrary values that are reported or somehow accessible after the test?
Imagine I have a test like this:
    def test_minimum_learning_rate():
        """Make some fancy stuff and generate a learning performance value."""
        learning_rate = fancy_learning_function().rate
        pytest.report("rate", learning_rate)
        assert learning_rate > 0.5
The pytest.report(..) line is what I'd like to have (but it doesn't exist, does it?). I'd then like something like minimum_learning_rate[rate] written to the report along with the actual test results (or at least shown on the screen).
Really nice would be a Jenkins plugin that creates a chart from that data.
Is there a common term for this? I've been searching for "kpi", "arbitrary values", and "user defined values", but without any luck so far.
If you just want to output some debug values, a print call combined with the -s argument will already suffice:
    def test_spam():
        print('debug')
        assert True
Running pytest -s:

    collected 1 item

    test_spam.py debug
    .
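If plain print output is too crude, the same idea works with the logging module together with pytest's live-log output (a minimal sketch; --log-cli-level is part of pytest's built-in logging support):

    import logging

    logger = logging.getLogger(__name__)

    def test_spam():
        # shown live when running: pytest --log-cli-level=INFO
        logger.info('rate: %s', 123)
        assert True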
If you are looking for a solution that is better integrated into the pytest execution flow, write custom hooks. The examples below should give you some ideas.
    # conftest.py

    def pytest_report_teststatus(report, config):
        if report.when == 'teardown':  # you could also check the outcome here, e.g. to filter passed or failed tests only
            rate = getattr(config, '_rate', None)
            if rate is not None:
                terminalreporter = config.pluginmanager.get_plugin('terminalreporter')
                terminalreporter.ensure_newline()
                terminalreporter.write_line(f'test {report.nodeid}, rate: {rate}', red=True, bold=True)
Tests:
    def report(rate, request):
        request.config._rate = rate

    def test_spam(request):
        report(123, request)

    def test_eggs(request):
        report(456, request)
Output:
    collected 2 items

    test_spam.py .
    test test_spam.py::test_spam, rate: 123
    test_spam.py .
    test test_spam.py::test_eggs, rate: 456

    ===================================================== 2 passed in 0.01 seconds =====================================================
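Note that config._rate outlives the test that set it, so a test that never calls report() would be shown with the previous test's rate. One way to guard against that is to reset the attribute in the standard pytest_runtest_setup hook (a minimal sketch; the reset strategy is my suggestion, not part of the example above):

    # conftest.py

    def pytest_runtest_setup(item):
        # clear the stashed value so one test's rate cannot leak into the next
        item.config._rate = None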
    # conftest.py

    def pytest_configure(config):
        config._rates = dict()

    def pytest_terminal_summary(terminalreporter, exitstatus, config):
        terminalreporter.ensure_newline()
        for testid, rate in config._rates.items():
            terminalreporter.write_line(f'test {testid}, rate: {rate}', yellow=True, bold=True)
Tests:
    def report(rate, request):
        request.config._rates[request.node.nodeid] = rate

    def test_spam(request):
        report(123, request)

    def test_eggs(request):
        report(456, request)
Output:
    collected 2 items

    test_spam.py ..
    test test_spam.py::test_spam, rate: 123
    test test_spam.py::test_eggs, rate: 456

    ===================================================== 2 passed in 0.01 seconds =====================================================
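The report helper can also be wrapped in a fixture, so tests don't have to pass request around themselves. A sketch building on the same conftest.py as above (the fixture name report is my choice):

    # conftest.py

    import pytest

    @pytest.fixture
    def report(request):
        def _report(rate):
            # store the value under the current test's node id, as before
            request.config._rates[request.node.nodeid] = rate
        return _report

Tests then simply request the fixture:

    def test_spam(report):
        report(123)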
Using the record_property fixture:
    def test_spam(record_property):
        record_property('rate', 123)

    def test_eggs(record_property):
        record_property('rate', 456)
Resulting report:
    $ pytest --junit-xml=report.xml
    ...
    $ xmllint --format report.xml
    <testsuite errors="0" failures="0" name="pytest" skipped="0" tests="2" time="0.056">
      <testcase classname="test_spam" file="test_spam.py" line="12" name="test_spam" time="0.001">
        <properties>
          <property name="rate" value="123"/>
        </properties>
      </testcase>
      <testcase classname="test_spam" file="test_spam.py" line="15" name="test_eggs" time="0.001">
        <properties>
          <property name="rate" value="456"/>
        </properties>
      </testcase>
    </testsuite>
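The recorded properties are then easy to post-process, e.g. to feed a chart in Jenkins. A minimal sketch using the standard library's xml.etree.ElementTree against the report.xml produced above (the print call is just a stand-in for whatever input your charting tool expects):

    import xml.etree.ElementTree as ET

    tree = ET.parse('report.xml')
    # walk every <property> recorded under any <testcase>
    for testcase in tree.iter('testcase'):
        for prop in testcase.iter('property'):
            print(testcase.get('name'), prop.get('name'), prop.get('value'))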