Use the test runner to run functional tests from the command line.
You can find the runner in the <ReadyAPI>/bin directory. The file name is testrunner.bat (Windows) or testrunner.sh (Linux and macOS).
To configure the runner’s command line visually, start it from the ReadyAPI user interface. For more information, see Test Runner GUI.
General Syntax
The runner’s command line has the following format:
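The command line consists of the runner executable, the arguments you need, and the path to the project, with the project path typically placed last (this sketch follows the examples at the end of this article):
testrunner.bat <arguments> <test-project>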
Required arguments
test-project
The fully qualified path to the project that contains the functional tests to be run. If the file name or path includes spaces, enclose the entire argument in quotes.
Example:
C:\Work\composite-project
--accessKey, -K<Access Key>
Specifies the Access Key for authenticating the SmartBear account when running tests. This parameter is mandatory for environments that use SSO and for headless executions, such as TestRunners or CI tools. The key must be generated from the SmartBear Licensing Portal.
For instructions on how to get the access key, see Get Access Key in the Licensing Portal documentation.
Example:
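An illustrative call (the access key placeholder and project path are not real values):
testrunner.bat -K<your access key> "C:\Work\my-project.xml"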
Note: If you use SSO (Single Sign-On) to log in to your SmartBear account, you must use the access key. If you're unsure whether you use SSO, consult your license manager for assistance.
Note: SmartBear License Management's license authentication and request flows were revised in November 2023. Authentication using the username and password is no longer supported for test execution via TestRunners, Jenkins or Azure plugins, and SmartBear-hosted licenses. Username and password are limited to on-prem-hosted licenses only. Access Key is the required option for authentication if using SmartBear-hosted licenses headlessly.
Optional arguments
-a
Commands the runner to export all test results. Otherwise, it exports only information about errors.
-A
Commands the runner to organize files with test results into directories and subdirectories that match the test suites and test cases the runner executes. To specify the root directory, use the -f argument.
If you skip the -A argument, the runner will save all resulting files to the directory the -f argument specifies. The file names will include the test suite's and test case's names.
The runner can ignore this argument depending on the -R command-line argument value.
-c<test case>
Specifies the test case to be run. If you skip this argument, the runner will execute all test cases in your test project.
Note: If you need to specify several test cases to be run, use the -T argument.
Example:
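An illustrative call that runs a single test case (the test case name and project path are placeholders):
testrunner.bat "-cLogin TestCase" "C:\Work\my-project.xml"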
--clientId, -ci<Client ID>
Specifies the Client ID for the SLM On-Premise license server when it is configured for OIDC. See here for instructions on configuring SLM On-Premise for OIDC and accessing the credentials.
Note: SLM On-Premise 2.0 or later must be configured for the OIDC/Okta environment.
--clientSecret, -cs<Client Secret>
Specifies the Client Secret for the SLM On-Premise license server when it is configured for OIDC. See here for instructions on configuring SLM On-Premise for OIDC and accessing the credentials.
Note: SLM On-Premise 2.0 or later must be configured for the OIDC/Okta environment.
-D
Specifies the URL of the SLM On-Premise license server. See here for instructions on configuring SLM On-Premise for OIDC and accessing the credentials.
Usage:
Server address: -DlicenseApiHost=<SLM_License_Server_Address>
Server port: -DlicenseApiPort=<SLM_License_Server_Port>
-d<domain>
Specifies the domain that ReadyAPI will use for authorization.
This argument overrides the authorization domain you have specified for test steps in your project.
-D<args>
Specifies a value of a system property for the test run. This value will override the variable’s value during the run.
Usage: -D<variable>=<value>. If the value includes spaces, enclose the entire argument in quotes. To override several property values, specify the -D argument several times.
Example:
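An illustrative call that overrides two system properties (the property names and values are made up for the example):
testrunner.bat -Dapi.host=staging.example.com "-Dapi.greeting=Hello World" "C:\Work\my-project.xml"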
-e<endpoint>
Specifies the endpoint to be used in test requests. The specified endpoint should include the protocol part (for example, https://).
This argument overrides the endpoints you have specified for test steps in your project. See the -h argument description.
Note: The runner ignores this parameter if -E is specified. In this case, the endpoint is taken from the environment settings.
Example:
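An illustrative call (the endpoint and project path are placeholders):
testrunner.bat -ehttps://api.example.com "C:\Work\my-project.xml"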
-E<environment>
Specifies the project environment to be used for the test run. When this argument is specified, the runner takes the endpoint, credentials, and other settings from the selected environment and ignores the corresponding command-line arguments (see the notes for the -e, -p, and -u arguments).
-f<directory>
Specifies the root directory where the runner will save the test result files. If the directory does not exist, ReadyAPI will create it.
Tip: If the directory exists, ReadyAPI will overwrite report files in it.
Example:
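An illustrative call that saves the result files to a custom directory (the paths are placeholders):
testrunner.bat -fC:\Work\Results "C:\Work\my-project.xml"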
-F<args>
Specifies the format of the reports ReadyAPI exports.
Usage: -F<FormatName>. ReadyAPI supports the following formats: PDF, XLS, HTML, RTF, CSV, TXT and XML. If you have not specified the parameter, ReadyAPI will use the PDF format.
To export results in several formats, separate them with commas. For example: -FPDF,XML,CSV.
The runner can ignore the -F argument depending on the value of the -R argument. See the -R argument description.
-g
Commands the runner to create a coverage report (HTML format).
-G<args>
Specifies a value of a global property for the test run. This value will override the property's value during the run.
Usage: -G<variable>=<value>. If the value includes spaces, enclose the entire argument in quotes. To override several property values, specify the -G argument several times.
Example:
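An illustrative call that overrides a global property (the property name and value are made up for the example):
testrunner.bat -Gapi.key=12345 "C:\Work\my-project.xml"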
-h<host:port>
Specifies the host and port to be used in test requests.
Usage: -h<host>:<port>. You can specify the host by using its IP address or name.
This argument overrides the endpoints you have specified in the project file. See the -e argument description.
Example:
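An illustrative call (the host, port, and project path are placeholders):
testrunner.bat -hstaging.example.com:8080 "C:\Work\my-project.xml"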
-H<args>
Use this argument to add a custom HTTP header to all simulated requests.
Usage: -H<header>=<value>. To add several headers, specify the -H argument several times.
Example:
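An illustrative call that adds two custom headers (the header names and values are made up for the example):
testrunner.bat "-HX-Custom-Header=Value 1" -HAccept-Language=en-US "C:\Work\my-project.xml"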
-i
Commands the runner to enable UI components. Use this command-line argument if you use the UISupport class in your tests.
Note: We have identified a bug in the test runner when the -i argument is used: it currently fails to initialize a license and requires a fix. Please refrain from using the -i argument until we confirm the issue has been resolved.
-I
Commands the runner to ignore errors. If you add this argument to the command line, the test log will contain no information about errors that occurred during the test run.
If you skip this argument, the runner will stop the run after the first error occurs and will post full information about the error to the log.
-j
Commands the runner to create JUnit-compatible reports. This argument is similar to the "-RJUnit-Style HTML Report" command-line argument.
-J
Commands the runner to include test properties in the JUnit-style XML reports. For example:
<testcase ...>
  ...
  <properties>
    <property name="BusinessRequirementId" value="BusinessRequirement1"/>
  </properties>
</testcase>
-M
-o
Commands the runner to open the reports ReadyAPI created in your default web browser after the test run is over.
-O
Commands the runner not to collect or send usage statistics.
-P<args>
Specifies a value of a project property for the test run. This value will override the property's value during the run.
Usage: -P<variable>=<value>. If the value includes spaces, enclose the entire argument in quotes. To override several property values, specify the -P argument several times.
Example:
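An illustrative call that overrides a project property (the property name and value are made up for the example):
testrunner.bat "-PbaseUrl=https://staging.example.com" "C:\Work\my-project.xml"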
-p<password>
Specifies the password to be used for authorization during the run.
This argument overrides the authorization password settings you have specified in your project.
Note: The runner ignores this parameter if -E is specified. In this case, the password is taken from the environment settings.
-R<args>
Specifies the type of the report data.
Usage: -R<Report type>. The Report type value can be one of the following:
- Project Report: Generates a report in the format the -F argument specifies. The runner will save the report files to the directory that the -f argument specifies. Depending on the -A argument value, the runner can organize files into subdirectories.
- Test Suite Report: As above, but for test suites.
- Test Case Report: As above, but for test cases.
- JUnit-Style HTML Report: Generates a report as JUnit-style HTML files. See JUnit-Style HTML Reports For Automation. When you use this value, the runner ignores the -F and -A arguments.
- Data Export: Generates XML files with report data. See Data Export For Automation.
- Allure Report: Generates Allure results. Use the Allure framework to generate an actual report. See Allure Report. When you use this argument, the -F argument must be XML or not specified.
Use the -f argument to specify the directory where the runner will save the report files ReadyAPI creates.
Example:
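An illustrative call that generates a project report in XML format and saves it to a custom directory (the paths are placeholders):
testrunner.bat "-RProject Report" -FXML -fC:\Work\Results "C:\Work\my-project.xml"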
-r
Commands the runner to include a summary report in the test log.
-s<test suite>
Specifies the test suite to be run. If you skip this argument, the runner will execute all test suites in your project.
Note: If you need to specify several test suites to be run, use the -T argument.
Example:
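An illustrative call that runs a single test suite (the suite name and project path are placeholders):
testrunner.bat "-sSmoke TestSuite" "C:\Work\my-project.xml"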
-S
Commands the runner to save the test project after the test run finishes. Use this command-line argument if you store data within the project during the test.
-t<settings file>
Specifies the ReadyAPI settings file to be used during the test run. If you skip this command-line argument, the runner will use the default readyapi-settings.xml file, which you can find in the <User directory>/.readyapi directory.
Use this argument to specify another settings file for the run. It helps you use different proxy, SSL, HTTP and other settings without changing them in ReadyAPI.
Also, see the -v argument description.
-T<args>
Specifies the test suites or test cases to be executed by their tags.
Usage: -T"<Test Item> <Tags>", where:
- Test Item: The test item type for which you specify a condition. Possible values: TestCase or TestSuite.
- Tags: The expression that specifies the required tags.
To build a complex condition, use the following symbols:
- || – Logical OR. The test case or test suite must contain at least one of the specified tags.
- && or , – Logical AND. The test case or test suite must contain all the specified tags.
- ! – Logical NOT. The test case or test suite must not contain the specified tags.
It is possible to group conditions by using parentheses.
To specify both test suites and test cases for execution, use the -T argument twice.
The specified tags must be assigned to the test cases and test suites that you specify by using the -s and -c arguments. Otherwise, there will be no tests to execute.
Example:
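An illustrative call that selects test items by tags (the tag names and project path are made up for the example):
testrunner.bat -T"TestSuite regression" -T"TestCase smoke && !slow" "C:\Work\my-project.xml"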
-u<username>
Specifies the user name to be used in test request authorizations.
This argument overrides user names in your test project.
Note: The runner ignores this parameter if -E is specified. In this case, the user name is taken from the environment settings.
-v<password>
Specifies the password for your XML setting file.
Also, check the description of the -t argument.
-w<args>
Specifies the WSS password type.
Usage: -w<password type>, where <password type> can be one of the following:
- Text: Matches the PasswordText WSS password type.
- Digest: Matches the PasswordDigest WSS password type.
-W
Commands the test runner to post test results to Slack.
Usage: -W<access token>/<channel(s)>, where:
- <access token> – Your Slack bot authentication token.
  Tip: To get the token, open the Slack settings, switch to the OAuth & Permissions tab, and copy the Bot User OAuth Access Token.
- <channel(s)> – A comma-separated list of Slack channels in the #channel-name format, or the IDs of the users the test results will be sent to. You can specify both channels and user IDs.
To learn more, see Slack Integration.
-x<password>
Specifies the project password, if you have encrypted the entire project or some of its custom properties. See Protecting Sensitive Data.
Examples
- The following command runs all the functional tests from the ReadyAPI project you have specified:
  testrunner.bat "c:\my projects\my-project.xml"
- The following command runs the my TestCase test case in your project and creates HTML files in a JUnit-style format:
  testrunner.bat -FPDF "-RJUnit-Style HTML Report" "-cmy TestCase" "c:\my projects\my-project.xml"
Known Issues
SPNEGO-Kerberos authorization will cause tests you start from the test runner to fail. To avoid this issue, add the following code to the end of the first set JAVA_OPTS line in the ReadyAPI.bat file: