Applies to LoadNinja 1.0, last modified on October 17, 2019

Use the results screen to analyze the results of your test runs, check virtual users during active test runs, debug errors, and gather statistics.

Toolbar

The toolbar contains:

  • The name of the scenario you used for this run.
  • The date and time when the run was performed.

In addition, a separate row shows the test's key metrics:

  • The time elapsed to run this test.

  • The maximum number of concurrent virtual users the test had at a time.

  • The number of errors that occurred during the run.

  • The average duration of a test step.

  • The average number of iterations for each test step.

Summary tab

The Summary tab shows a short overview of your test run:

Results: The Summary tab


  • In Goals, learn about the specific parameters that were tested during the test run.

  • In Key Findings, learn about the issues encountered during the test run and suggested improvements.


Summary per Script

This panel contains statistics on how successfully virtual users performed script runs.

  • Script: The name of the script the virtual users performed.
  • Load Distribution: The share of the load you set for this script in scenario settings.
  • Expected Iterations: For iteration-based tests. The number of iterations each virtual user was supposed to run.
  • Total Successful Steps: The number of test steps that the virtual users completed successfully.
  • Failed: The number of test steps that the virtual users failed.

Scenario information

This panel shows the scenario settings you specified for this load test run.

Charts tab

On the Charts tab, view charts that depict statistics on various aspects of the tested website.

LoadNinja offers the following charts:

Duration

Shows the average duration for running each script and step.

Navigation Timings

Shows the average time it took to move between the website’s pages, and locate and identify objects there.

Error Count

Shows the number of errors for each script and step.

Success Rate

Shows the number of successful iterations.

Statistics tab

On the Statistics tab, view metrics on timings and failure types. To get information on separate steps (URLs or synthetic steps) of a script, click the test in the list.

Statistics tab


The Durations and Errors table shows how long scripts and steps took to run on the tested service, and the number of failures of various types that occurred during the test run.

Low duration values are preferable, since the faster your service responds, the more requests per minute it can process.
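The relationship between response duration and throughput can be sketched with simple arithmetic. The figures below are hypothetical and assume each virtual user issues one request at a time:

```python
# Hypothetical figures: each virtual user can issue at most
# 60 / average_duration requests per minute when requests
# are sent sequentially.
average_duration = 1.5   # seconds per request (hypothetical)
virtual_users = 25       # concurrent virtual users (hypothetical)

requests_per_minute = virtual_users * 60 / average_duration
print(requests_per_minute)  # 1000.0
```

Halving the average duration would double the achievable request rate for the same number of virtual users.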

The table includes the following columns:

  • Script / Step Name: The name of the script or step.
  • Last duration: The time it took virtual users to run the script or step the last time, in seconds.
  • 90th percentile duration: The maximum duration for 90% of script or step runs, in seconds. For each of these runs, the actual time may be less than this value, but not greater.
  • 95th percentile duration: The maximum duration for 95% of script or step runs, in seconds. For each of these runs, the actual time may be less than this value, but not greater.
  • Average duration: The average time it took virtual users to perform a script or step on the tested service, in seconds.
  • Minimum duration: The minimum time it took virtual users to perform a script or step on the tested service, in seconds.
  • Maximum duration: The maximum time it took virtual users to perform a script or step on the tested service, in seconds.
  • Standard deviation: The standard deviation of durations, in seconds.
  • Total Iterations: The total number of iterations performed by all virtual users.
  • Total steps: The total number of steps performed in the script by all virtual users.
  • Total failures: The total number of errors all virtual users encountered during the test run.
  • Total assertion failures: The number of validation failures that occurred during the run.
  • Total timeouts: The number of timeout errors that occurred during the run.
  • Total page errors: The number of page errors (for example, missing objects) that occurred during the test run.
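The percentile and deviation metrics above can be sketched as follows. This is a minimal illustration using the nearest-rank percentile definition and hypothetical step durations, not LoadNinja's internal implementation:

```python
import math
import statistics

def percentile(durations, pct):
    """Nearest-rank percentile: the value that pct% of runs do not exceed."""
    ranked = sorted(durations)
    rank = math.ceil(pct / 100 * len(ranked))
    return ranked[rank - 1]

# Hypothetical step durations, in seconds
durations = [1.2, 1.4, 1.1, 2.8, 1.3, 1.5, 1.2, 3.9, 1.6, 1.3]

print(percentile(durations, 90))               # 90th percentile: 2.8
print(percentile(durations, 95))               # 95th percentile: 3.9
print(round(statistics.mean(durations), 2))    # average duration: 1.73
print(round(statistics.pstdev(durations), 2))  # standard deviation
```

Note how the two slow outliers (2.8 s and 3.9 s) barely move the average but dominate the high percentiles, which is why percentile columns are useful for spotting worst-case behavior.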

The Average Navigation Times table shows the average time it took virtual users to navigate between the website’s pages and locate and identify objects there.

Navigation timings

LoadNinja measures the following navigation timings, one after another:

Load test structure


Each timing is counted separately. The total time is the overall duration of the load test.

The table includes the following columns:

  • Script / Step Name: The name of the script or step.
  • Total time: The average time it took to run the load test.
  • Think time: The average time virtual users spent simulating pauses between user actions during the playback.
  • Redirect time: The average time virtual users spent following redirects to the final URL when the tested website responded with an HTTP 301 or 302 redirect.
  • DNS time: The average time virtual users spent performing a DNS lookup, that is, obtaining the IP address of the website from a DNS server.
  • Connect time: The average time virtual users spent performing a TCP handshake, that is, establishing a connection to the web server after the DNS lookup.
  • First byte time: The average time virtual users spent waiting for the first byte of the server's response. This duration includes processing the request, accessing the server's database, and generating the response; it depends mostly on server performance.
  • Response time: The average time from the moment the client (a virtual user) sent the request until the final byte of the response was received.
  • DOM load time: The average time it took to load and construct the DOM. The DOM is considered complete when the DOMContentLoaded event is triggered.
  • Event time: The average time it took to fully load the page and all the resources it references into the Document Object Model (DOM). The page is considered fully loaded when the onload event is triggered.
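Because the phases follow one another and each timing is counted separately, the total time for a step is effectively the sum of its phases. A minimal sketch with hypothetical per-phase averages:

```python
# Hypothetical average navigation timings for one step, in seconds.
# Each phase is measured separately; phases follow one another.
phases = {
    "think":      2.00,  # simulated pause between user actions
    "redirect":   0.05,  # following HTTP 301/302 redirects
    "dns":        0.02,  # DNS lookup
    "connect":    0.03,  # TCP handshake
    "first_byte": 0.40,  # waiting for the first response byte
    "response":   0.15,  # receiving the rest of the response
    "dom_load":   0.60,  # constructing the DOM (DOMContentLoaded)
    "event":      0.25,  # loading remaining resources (onload)
}

total_time = sum(phases.values())
print(f"Total time: {total_time:.2f} s")  # Total time: 3.50 s
```

A breakdown like this helps attribute slowness to a specific phase, for example a long first byte time points at the server, while a long connect or DNS time points at the network.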

VU Debugger

The VU Debugger shows the error log and information on errors. Use this tab to identify the source of an issue, view network traces, perform debugging, and so on. For more information, see Debugging.

VU Inspector

The VU Inspector shows the desktops of the remote cloud machines where virtual users are working.

Results: The VU inspector


Each frame contains the virtual user ID, the name of the script it runs, and the URL the user is on.

Loadgen Health

The Loadgen Health tab displays the performance metrics received from LoadNinja servers. It helps you identify cases where the test infrastructure is the cause of errors that appear in test results.

To learn more about the tab, see Loadgen Health Tab.

See Also

Debug Virtual Users
Scripts
Test Results
