BitBar is a flexible cloud-based mobile and web application testing platform that allows you to perform live (manual) app testing or automated testing with any framework against desktop browsers (Windows, macOS, and Linux) and real iOS and Android devices.
When you log in, the BitBar Dashboard is displayed, and you can return to this page from anywhere in the application via the left navigation.
For example, to navigate to live browser testing:
From the Dashboard, select Live Testing > Start a Browser Live Test from the left-hand side menu, or use the Start Browser Test button on the Live Browser Testing tile.
The Browser Live Testing page is displayed.
Click Start Test to start a live test, or click Dashboard on the Browser Live Testing page to navigate back to the main page.
In the top navigation, you can find the following options:
Turning the SecureTunnel on and off.
Adding or inviting other members.
Information about Billing, Usage Details, and the Security Center.
BitBar Help documentation.
Account settings, Subscriptions, and Access.
The left side menu contains the navigation links required to perform any actions within the application.
You can return to the dashboard, start automated or live testing sessions, view your projects, view all available devices, and view the number of parallels currently in use by your account with our easy-to-read gauge.
On the dashboard, you will find easy access to what you are most interested in: testing. You can directly navigate to the type of testing that you want to perform: Automation Testing (for both web and mobile apps), Live Mobile App Testing, and Live Browser Testing.
Automation Usage Statistics
When you scroll down the page, you will see the Automation Usage Statistics and data related to the usage and errors of your account:
Users – The total number of users associated with your account. The main user is the first (primary) user; any additional users are added under that account.
Devices / OSs – The number of devices used in your projects. This includes devices used by you and by sub-users attached to your account. Sub-users can only view the devices they have used themselves.
Projects – The total number of projects in which you and your sub-users have performed test runs during the past month. Sub-users can only view their own projects.
Test Runs – The total number of test runs performed by any user attached to your account. Sub-users can only view their own test runs.
Success Ratio – The cumulative test-case pass ratio from all the test runs done by you and your sub-users for the past month. Sub-users can only view the pass ratio of their own runs.
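As a rough illustration, a cumulative pass ratio like the one above is simply the number of passed test cases divided by the total number of test cases across all runs in the period. A minimal sketch, using made-up run data (not real BitBar output):

```python
# Hypothetical per-run results; each entry counts passed and total test cases.
runs = [
    {"passed": 18, "total": 20},
    {"passed": 20, "total": 20},
    {"passed": 15, "total": 20},
]

# Cumulative ratio: sum of passes over sum of test cases, across all runs.
passed = sum(r["passed"] for r in runs)
total = sum(r["total"] for r in runs)
success_ratio = passed / total
print(f"Success Ratio: {success_ratio:.0%}")  # prints "Success Ratio: 88%"
```

Note that this aggregates test cases, not runs: a run with many test cases weighs more than a small one, which matches a "cumulative test-case pass ratio".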
Automated Test Run Summary
The Automated Test Run Summary includes the following:
Execution Summary (from the last 20 Test Runs) – Displays information on the performance of the devices selected for the test runs. Some devices may be excluded from a run depending on the test case, for example, because their API level is too old or their processor is too slow.
Project Test Case Success Ratio – Displays the overall quality of the test executions per project. The success ratio includes sub-user test runs; however, sub-users can only view the test runs from their own projects.
Last 10 Automated Test Runs – Displays the test and device execution success ratios. Each test run shows its state and duration. This list shows how your tests are progressing and performing across the devices you test on, and indicates the most used devices or the OS platforms on which tests succeeded. It also helps you spot runs where the test status looks good but the run fails due to a poor device success ratio. Conversely, if device execution is good but the test success ratio is poor, there may be quality issues in your tests that should be investigated and fixed.
Usage Statistics – Automation
The Usage Statistics – Automation table shows the test run performance by device:
Device / OSs – The device name or the OS platform on which the test is run.
Usage (h) – The duration for which the device was used for testing.
Usage (count) – The number of times the device was used for testing.
Success Ratio – The overall device pass ratio from all the test runs done by you and your sub-users. Sub-users can only view the pass ratio of their own runs.
Failures – The number of times the device failed.
To download the data, click Download CSV.
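The downloaded CSV can be post-processed with any scripting language. The sketch below assumes column headers matching the table above (Device / OSs, Usage (h), Usage (count), Success Ratio, Failures) and uses invented sample rows; the actual export's headers and formats may differ:

```python
import csv
import io

# Hypothetical CSV content mirroring the Usage Statistics - Automation table;
# the real export's column names and value formats may differ.
data = """Device / OSs,Usage (h),Usage (count),Success Ratio,Failures
Pixel 7,12.5,40,0.95,2
iPhone 14,8.0,25,0.88,3
"""

reader = csv.DictReader(io.StringIO(data))
rows = list(reader)

# Total failures across devices and the device with the lowest pass ratio.
total_failures = sum(int(r["Failures"]) for r in rows)
worst = min(rows, key=lambda r: float(r["Success Ratio"]))
print(f"Total failures: {total_failures}")
print(f"Lowest success ratio: {worst['Device / OSs']}")
```

In practice you would read the downloaded file with `open(...)` instead of the inline `io.StringIO` string used here for a self-contained example.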
Device OS Usage Statistics
The Device OS Usage Statistics table shows the test run performance by operating system:
Device / OSs – The operating system name and the version on which the test is run.
Usage (h) – The duration for which the OS was used for testing.
Usage (count) – The number of times the OS was used for testing.
Success Ratio – The overall OS pass ratio from all the test runs done by you and your sub-users. Sub-users can only view the pass ratio of their own runs.
Failures – The number of times a device with the corresponding OS failed.