Applies to LoadNinja 1.0, last modified on March 13, 2019

Compare the results of multiple test runs to identify common errors that occur when working with the tested website, to find out how the website behaves under various load conditions, and so on.

Start comparing results

  1. Click your project name in Projects.

  2. Switch to the Test Runs tab.

  3. On this tab, in the left column, select the check boxes of up to 4 test runs whose results you want to compare:

    Test Runs tab: Compare runs button


    Tip: We recommend that you select the runs that are associated with the same web tests. It is possible to select runs for different web tests, but in this case, the comparison results will be of no use.
  4. Once ready, click Compare. The comparison report screen will appear:

    Comparison report screen


Comparison report screen

Switch between the tabs on the comparison report screen to retrieve information on different comparison aspects:

  • Summary – Shows basic information about each compared test run, including data on individual web tests.

  • Charts – Shows customizable comparison charts.

  • Response times – Shows the response time metrics – average, minimum, maximum, standard deviation, and think times. It also includes navigation timings.

  • Errors – Shows the overall number of errors for each test and metric. The errors are grouped by their type.

To learn more about each tab, see the Reference section below.

Download the comparison report

  1. Click Download Report in the top right corner of the screen:

    Comparison Report: Download button


  2. In the subsequent dialog, specify a name for the report file and select the preferred file format in Save Type:

    Comparison Report: Download dialog


    Tip: LoadNinja supports saving reports in the PDF, PNG, and JPG formats.
  3. Once ready, click Download to generate a report and save it on your device.

Reference

In this section, learn about the metrics and fields LoadNinja shows in comparison reports.

Summary tab

Comparison Report: Summary tab

Test run comparison panel

This panel shows the following information on each test run:


  • Test run name – The name of the scenario used for the test run.
  • Started test – The date and time the test run started.
  • Completed test – The date and time the test run ended.
  • Duration configured – The preconfigured duration of the test, in minutes. Used for duration-based tests.
  • Total run time – The total run time of the test, in minutes.
  • Ramp-up time – The “warm-up” period of the load test, in minutes.
  • Delay between iterations – The preconfigured pause between web test runs performed by a virtual user, in seconds.
  • Average response time – The average time it takes the website to return the results of a request to a virtual user.
  • VUs configured – The number of virtual users requested for the test run.
  • VUs run – The actual number of virtual users LoadNinja provided.
    Tip: Usually, this number differs from the VUs configured value if a ramp-up time is set (that is, the test starts with 1 virtual user, and the number of users increases over time), or if some error caused the test to end prematurely (see the sketch after this list).
  • Iterations configured – The preconfigured number of iterations (for iteration-based tests).
  • Iterations completed – The actual number of iterations completed by virtual users.
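
The effect described in the Tip above can be illustrated with simple arithmetic. The sketch below assumes a linear ramp-up; the target VU count, the ramp-up length, and the linear shape of the curve are hypothetical, as LoadNinja's actual ramp-up behavior is not documented here.

```typescript
// Illustration only: a linear ramp-up model with hypothetical values.
const vusConfigured = 100; // hypothetical target VU count
const rampUpMinutes = 5;   // hypothetical "warm-up" period

// Approximate number of active VUs t minutes into the test, assuming
// the count grows linearly from 1 to the configured target.
function activeVUs(tMinutes: number): number {
  if (tMinutes >= rampUpMinutes) return vusConfigured;
  return Math.round(1 + (vusConfigured - 1) * (tMinutes / rampUpMinutes));
}

// If an error ends the test 2 minutes in, only about 41 VUs ever ran,
// which is why "VUs run" can be lower than "VUs configured".
console.log(activeVUs(2)); // ≈ 41
```
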
Script comparison panel

This panel shows the following information on each web test:

  • Scripts run – The web tests selected for the test run.
  • Number of scripts – The number of web tests associated with the test run.
  • Number of steps – The overall number of steps in all the completed iterations of the web test.

Charts tab

This tab shows customizable charts for separate web tests or steps:

Comparison Report: Charts tab


To add a chart
  1. Click Add a chart.

  2. Select the needed web tests or steps to include in the comparison:

    Add a Chart dialog: Select web tests and steps


    Tip: Use the search box to find the needed item quickly.

    Optionally, change the chart’s name in Chart Title, then click Next: Select Metric.

  3. Select the metrics you want to include in the comparison, such as the number of virtual users, total response time, think time, error counts, and navigation timings:

    Add a Chart dialog: Select metrics


    Once ready, click Add chart.

To edit a chart
  1. Select  > Edit Chart next to the needed chart name on the Charts tab.

  2. Use the subsequent dialog box to modify the chart. The available options are the same as when adding the chart.

  3. Once ready, click Add a chart.

To remove a chart
  1. Select  > Remove Chart next to the needed chart name on the Charts tab.

  2. Confirm this action in the subsequent dialog.

Working with charts
  • To show only one graph, double-click its name in the chart legend.

  • To hide a graph, click its name in the chart legend once.

  • To zoom in on a segment of a chart, select that segment with the mouse.

  • To restore the default scaling of a chart, double-click its legend.

Response times tab

This tab shows the response time metrics and navigation timings, grouped by the web tests and steps associated with the compared runs:

Comparison Report: Response Times tab


Navigation timings

LoadNinja counts the following navigation timings, one after another:

Load test structure


Each timing is counted separately. The total time is the overall duration of the load test.

  • Average Response Time – The average time it took the server that hosts the website to provide the response within the web test or step, in seconds.
  • Minimum Response Time – The minimum time it took the server that hosts the website to provide the response within the web test or step, in seconds.
  • Maximum Response Time – The maximum time it took the server that hosts the website to provide the response within the web test or step, in seconds.
  • Standard Deviation Response Time – The standard deviation of response times (see the computation sketch after this list).
  • Think Time – The sum of pauses between actions within the web test or step.
  • Total Time – The total time it took to run the web test or step.
  • Redirect Time – If the tested website initially responded with an HTTP 301 or 302 redirect, this value indicates the time the virtual users spent following redirects until they reached the final URL.
  • DNS Time – The time spent performing a DNS lookup, that is, obtaining the IP address of the website from a DNS server.
  • Connect Time – The time spent performing a TCP handshake, that is, establishing a connection to the web server after the DNS lookup.
  • DOM Load Time – The total time it took to load and construct the DOM. The DOM is considered complete when the DOMContentLoaded event starts.
  • Response Time – The total time from the moment the client sent a composed request until the moment the page’s HTML code was downloaded.
  • First Byte Time – The time spent waiting for the first byte of the response from the server. It includes processing the request, accessing the server’s database, and selecting and generating the response.
  • Event Time – The total time it took to fully load the page, along with the required resources, into the Document Object Model (DOM).
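
The average, minimum, maximum, and standard deviation above are standard descriptive statistics over the individual response-time samples. A minimal computation sketch follows; the sample values are hypothetical, and whether LoadNinja uses the population or the sample form of the standard deviation is not documented here, so the population form is assumed.

```typescript
// Hypothetical per-request response times, in seconds.
const samples = [0.8, 1.2, 0.9, 2.4, 1.1];

const average = samples.reduce((sum, t) => sum + t, 0) / samples.length;
const minimum = Math.min(...samples);
const maximum = Math.max(...samples);

// Population standard deviation (an assumption; the sample form may be
// used instead).
const stdDev = Math.sqrt(
  samples.reduce((sum, t) => sum + (t - average) ** 2, 0) / samples.length,
);

console.log({ average, minimum, maximum, stdDev });
```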
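
The navigation timings above correspond to stages a browser itself reports through the standard Navigation Timing API. The sketch below shows how similar values can be read in a browser; the exact boundaries LoadNinja uses for each metric are not documented here, so the per-metric mappings are assumptions.

```typescript
// Read the navigation entry for the current page. Values are in
// milliseconds; the mappings below are assumptions.
const [nav] = performance.getEntriesByType(
  "navigation",
) as PerformanceNavigationTiming[];

const timingsMs = {
  redirectTime: nav.redirectEnd - nav.redirectStart, // HTTP 301/302 hops
  dnsTime: nav.domainLookupEnd - nav.domainLookupStart, // DNS lookup
  connectTime: nav.connectEnd - nav.connectStart, // TCP handshake
  firstByteTime: nav.responseStart - nav.requestStart, // waiting for byte 1
  responseTime: nav.responseEnd - nav.requestStart, // request sent -> HTML downloaded
  domLoadTime: nav.domContentLoadedEventStart - nav.startTime, // DOM constructed
  eventTime: nav.loadEventEnd - nav.startTime, // page fully loaded
};

console.table(timingsMs);
```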

Errors tab

This tab shows the following error metrics, grouped by the web tests and steps associated with the compared runs:

Comparison Report: Errors tab


  • Total errors – The number of errors (of all types) that occurred during the run of this script or step (see the aggregation sketch after this list).
  • Assertion errors – The number of errors triggered by validations.
  • Timeouts – The number of timeout errors that occurred during the run of this script or step.
  • Page errors – The number of page errors (for example, objects not found) that occurred during the run of this script or step.
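
As a hedged illustration of how the Errors tab aggregates data, the sketch below groups hypothetical error records by type and totals them; the record shape is an assumption, not LoadNinja's actual data format.

```typescript
// Hypothetical error records; the shape is an assumption.
type ErrorType = "assertion" | "timeout" | "page";
interface ErrorRecord { step: string; type: ErrorType; }

const errors: ErrorRecord[] = [
  { step: "Login", type: "timeout" },
  { step: "Login", type: "assertion" },
  { step: "Checkout", type: "page" },
];

// Group errors by type, the way the Errors tab presents them.
const byType = new Map<ErrorType, number>();
for (const e of errors) {
  byType.set(e.type, (byType.get(e.type) ?? 0) + 1);
}

// "Total errors" covers errors of all types.
const totalErrors = errors.length;

console.log(totalErrors, Object.fromEntries(byType));
```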

See Also

About Results Screen
Test Results
