Pytest Integration

Applies to Zephyr Scale Cloud, last modified on April 02, 2021.

Intro

Pytest is a widely used test automation framework for Python. The integration with Zephyr Scale works by uploading a JUnit XML results file generated by pytest to Zephyr Scale. Once the file is uploaded, it is possible to generate reports and leverage all the other capabilities that Zephyr Scale offers.

For more information on pytest, see the official pytest documentation.

Writing pytest tests

See below for an example of how to write test cases with pytest and Zephyr Scale.


class TestClass:

    # will match with test case with key DEV-T21
    def test_method_3_DEV_T21(self):
        assert 5 == 5

    # will match with test case named tests.test_sample.TestClass.test_method
    def test_method(self):
        x = 'this'
        assert 'h' in x


# will match with test case named tests.test_sample.test_method_1_without_class
def test_method_1_without_class():
    assert 1 == 1

Note the comments above each method describing how they map to existing test cases within Zephyr Scale. The mapping depends on whether a test case key is embedded in the method name or the test's full name is used, as shown above.

If a test case cannot be matched, whether by key or by name, an error is returned when the results are uploaded to Zephyr Scale. If you intend for test cases that don't yet exist to be created automatically, the flag autoCreateTestCases=true can be set when uploading test results to Zephyr Scale.

To summarize, the rules for matching a test method to a test case are:

  • Try to match by parsing a test case key from the test method name

  • If no test case is matched, then try to match a test case by using the test name following the pattern <package name>.<class name (if using classes)>.<method name>

  • If no test case is matched, then:

    • If the parameter autoCreateTestCases is true, create the test case and create a new test execution for it

    • If the parameter autoCreateTestCases is false or not present, return an error - no test cases have been matched
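The matching rules above can be sketched in Python. This is purely an illustrative sketch, not part of Zephyr Scale or any of its client libraries; the function names and the key pattern are assumptions based on the examples in this article:

```python
import re

# Hypothetical pattern for a test case key embedded at the end of a
# method name, e.g. "test_method_3_DEV_T21" -> key "DEV-T21".
TEST_CASE_KEY_PATTERN = re.compile(r"([A-Z][A-Z0-9]*_T\d+)$")


def parse_test_case_key(method_name):
    """Rule 1: try to parse a test case key from the test method name."""
    match = TEST_CASE_KEY_PATTERN.search(method_name)
    if match:
        # Python identifiers cannot contain hyphens, so DEV-T21 is
        # written as DEV_T21 in the method name; convert it back.
        return match.group(1).replace("_T", "-T")
    return None


def full_test_name(package, class_name, method_name):
    """Rule 2: build the fallback name <package>.<class (if any)>.<method>."""
    parts = [package] + ([class_name] if class_name else []) + [method_name]
    return ".".join(parts)
```

If neither rule produces a match, rule 3 applies on the server side: the test case is auto-created when autoCreateTestCases=true, otherwise the upload returns an error.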

Configuring pytest to output JUnit XML results file

No configuration is required beforehand. To instruct pytest to generate a JUnit XML results file, all that is required is to execute the tests with the --junitxml parameter followed by the XML file name. Here is an example:


pytest --junitxml=output/junitxml_report.xml
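As with any pytest command-line option, the flag can also be set once in a pytest configuration file so it applies to every run. This is standard pytest configuration, shown here only as a convenience:

```ini
# pytest.ini — equivalent to passing --junitxml on every invocation
[pytest]
addopts = --junitxml=output/junitxml_report.xml
```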

Uploading results to Zephyr Scale

Once the pytest tests have executed, the results can be uploaded to Zephyr Scale. There are two options for uploading test results:

  • Using a single JUnit XML results file generated by pytest

  • Using a zip file containing multiple JUnit XML results files generated by pytest

Both options work the same way and can be uploaded using the same API endpoint: https://support.smartbear.com/zephyr-scale-cloud/api-docs/#operation/createJUnitExecutions.

Below is a curl example of how to use the API endpoint to upload a single file:


curl -H "Authorization: Bearer ${TOKEN}" -F "file=@TestSuites.xml;type=application/xml" "https://api.adaptavist.io/tm4j/v2/automations/executions/junit?projectKey=JQA&autoCreateTestCases=true"

…and for uploading a zip file containing multiple XML files:


curl -H "Authorization: Bearer ${TOKEN}" -F "file=@TestSuites.zip;type=application/x-zip-compressed" "https://api.adaptavist.io/tm4j/v2/automations/executions/junit?projectKey=JQA&autoCreateTestCases=true"

Note the query parameters in the URL. The projectKey specifies which project will be used to create the test cycle. The autoCreateTestCases parameter, when true, creates a test case from the pytest test method name whenever no matching test case is found. Also note that the URL must be quoted so the shell does not interpret the & character.
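The same upload can be performed from Python. The sketch below uses the third-party requests library; the endpoint and query parameters come from the curl examples above, but the helper names themselves are illustrative, not part of Zephyr Scale:

```python
import os
import requests

# Endpoint from the curl examples above.
API_URL = "https://api.adaptavist.io/tm4j/v2/automations/executions/junit"


def build_upload_request(file_name, file_bytes, project_key, token,
                         auto_create_test_cases=True):
    """Prepare the multipart upload request without sending it."""
    request = requests.Request(
        "POST",
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        params={
            "projectKey": project_key,
            # The API expects a lowercase true/false string.
            "autoCreateTestCases": str(auto_create_test_cases).lower(),
        },
        files={"file": (file_name, file_bytes, "application/xml")},
    )
    return request.prepare()


def upload_junit_results(file_path, project_key, token):
    """Read the JUnit XML results file and send it to Zephyr Scale."""
    with open(file_path, "rb") as f:
        prepared = build_upload_request(
            os.path.basename(file_path), f.read(), project_key, token)
    with requests.Session() as session:
        response = session.send(prepared)
    response.raise_for_status()
    return response.json()
```

Splitting request construction from sending keeps the URL and header logic testable without hitting the API.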

Code samples

SmartBear/zephyr-scale-pytest-example 

Suggested Improvements

Is something missing? If you'd like to suggest an improvement to this integration, take a look at our Ideas Portal.

You can search for existing ideas or create your own. We review ideas regularly and will take them into account for future development. 
