Test module

The Test dashboard presents an at-a-glance view of your test environment as measured by status and success.
In the Test dashboard, data can be filtered by two criteria:
  • Dashboards:
    Choose a dashboard to view data specific to its applications. If a user has more than one dashboard, more than one option is available to select.
  • Applications:
    Choose an application to view the build health of cloud applications such as Cost & Asset Management or Enterprise Marketplace.
Depending on your selections from these filters, the data displayed in the widgets varies. Some widgets may show the message “No Data Available”, which means no recent data is available for your selection.
The Test dashboard displays data in five widgets and a table view that together provide a view of the critical components of the Test phase. Each Test phase component is described below:
Test summary by application
This graph summarizes the tests performed on all configured applications based on the selection chosen; the selection can be the number of failed, passed, or skipped tests.
Top 5 technical services with less coverage
This graph depicts the top five technical services with the least coverage during the previous seven days.
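As a rough illustration of the ranking this widget performs, the following sketch sorts per-service coverage figures and keeps the five lowest values; the service names and coverage percentages are made-up sample data, not output from the dashboard.

```python
# Illustrative only: rank technical services by test coverage and keep the
# five with the lowest values, mirroring what the widget displays.
# The service names and coverage percentages below are made-up sample data.
coverage_by_service = {
    "orders-api": 82.4,
    "billing-worker": 45.1,
    "catalog-ui": 67.8,
    "auth-service": 91.0,
    "notifications": 38.6,
    "search-indexer": 74.2,
}

bottom_five = sorted(coverage_by_service.items(), key=lambda item: item[1])[:5]
for service, coverage in bottom_five:
    print(f"{service}: {coverage:.1f}% coverage")
```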
Overall test status
The Overall test status graph represents the number of tests passed, failed, and skipped over the selected time frame. Each status is described as follows:
  • Failed (Red):
    Tests that failed within the selected period.
  • Passed (Green):
    Tests that passed within the selected period.
  • Skipped (Yellow):
    Tests that were executed but not finalized, so their final result cannot be determined.
By hovering over the graph, the following data is presented for each test status:
  • Group:
    The test status, from the three categories above.
  • Date:
    The date the tests were executed.
  • Value:
    The total number of tests executed in the selected time frame.
The Overall test status widget presents two axes that indicate the tests that passed, failed, or were skipped within a specified time frame:
  • X-Axis (Days(Year)):
    The X-Axis corresponds to the dates within the activity's time frame.
  • Y-Axis (Total tests):
    The Y-Axis corresponds to the number of Passed, Failed, or Skipped tests for the selected time frame.
You can switch between the donut chart and bar chart views of the Overall test status widget by selecting the donut chart icon or the bar graph icon.
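As a rough sketch of the data behind this widget, the following example aggregates individual test results into the per-day, per-status totals that correspond to the Group, Date, and Value fields shown on hover; the records and field names are illustrative assumptions, not the dashboard's actual data format.

```python
# Illustrative only: aggregate raw test results into the per-day, per-status
# totals that the Overall test status widget plots (Group, Date, Value).
# The result records below are made-up sample data, not dashboard output.
from collections import Counter

results = [
    {"date": "2024-11-18", "status": "Passed"},
    {"date": "2024-11-18", "status": "Failed"},
    {"date": "2024-11-18", "status": "Passed"},
    {"date": "2024-11-19", "status": "Skipped"},
    {"date": "2024-11-19", "status": "Passed"},
]

# Key: (Group, Date) -> Value, i.e. the total tests for that status on that day.
totals = Counter((r["status"], r["date"]) for r in results)
for (group, date), value in sorted(totals.items(), key=lambda kv: kv[0][1]):
    print(f"{date}  {group}: {value}")
```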
Test type status
The Test type status widget is a graph showing the breakdown of tests by type. The widget displays the total numbers for each of the following:
  • Unit tests (Purple):
    Displays the results of automated tests that verify a section of an application meets its design and behaves as intended.
  • Functional tests (Blue):
    Displays the results of successfully executed Functional tests.
  • Other tests (Pink):
    Displays the results of other tests executed.
When Unit tests, Functional tests, or Other tests are checked or unchecked, the Overall test status widget is automatically updated based on the new parameters.
Top 5 technical services with code smells/bugs
Based on a 7-day sample, this graph depicts the top 5 technical services with the most code smells/bugs.
Code smells are the default selection; using the drop-down menu, you can switch the view to Bugs.

Compare releases page

The Compare releases page enables teams to evaluate how their test results have evolved over time. This comparison focuses on a single application and examines two different releases. From the data served by this page, development managers can understand how the team has progressed in building stable code from one release to the next. These trends help managers make data-driven decisions about resource allocation, identify areas needing improvement, and optimize the development workflow.
Understanding these trends is crucial for assessing and mitigating risks, ensuring that the software remains reliable and meets quality standards. This information is captured from SonarQube; a rough sketch of retrieving these metrics follows the parameter list below.
The page offers insights into different releases across the following parameters:
  • Bugs
  • Coverage
  • Affected technical services
  • Code smells
  • Failed tests
  • Skipped tests
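Since these figures are described as coming from SonarQube, the following is a rough sketch of retrieving bugs, code smells, and coverage for one component through the SonarQube Web API; the server URL, component key, and token are placeholders, and your instance may expose additional metrics.

```python
# Rough illustration: fetch bugs, code smells, and coverage for one component
# from a SonarQube server via its Web API. The URL, component key, and token
# are placeholders; adapt them to your own SonarQube instance.
import requests

SONAR_URL = "https://sonarqube.example.com"      # placeholder server
COMPONENT = "my-org:my-technical-service"        # placeholder project key
TOKEN = "replace-with-a-user-token"              # placeholder credential

response = requests.get(
    f"{SONAR_URL}/api/measures/component",
    params={"component": COMPONENT, "metricKeys": "bugs,code_smells,coverage"},
    auth=(TOKEN, ""),  # SonarQube accepts a token as the basic-auth username
    timeout=30,
)
response.raise_for_status()

for measure in response.json()["component"]["measures"]:
    print(f"{measure['metric']}: {measure['value']}")
```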
The duration for comparison is always 180 days; this option is set by default and cannot be changed.
You can navigate to the Compare releases page from the Test dashboard by clicking the Compare releases button. To see the comparison charts and their tabular representation, you must pick at least one application, two releases, and one environment on the Compare releases page. By default, all six parameters of a test run are shown as charts. However, you can remove any chart that does not meet your needs. Remember that if the selected charts are to become the default, they must be saved as custom views; otherwise, any changes in chart selection will be lost at the next login session.
The "Compare Release" page now includes an "Export to .CSV" functionality, allowing users to easily export comprehensive reports of product, software, or service comparisons over time. This feature enhances the analysis of changes, updates, and improvements between different releases for more efficient decision-making and tracking.
Deleted charts can be re-added from the Add components window by dragging and dropping the item. Charts can also be rearranged based on their importance. The table chart concisely summarizes all charts.

Technical service test table

The Technical service tests table is located at the bottom of the Test dashboard; it provides technical service test data in tabular form and enables a detailed view of each service. Each row in the table displays information for a specific service, separated into columns by information type:
  • Technical service:
    The name of the micro-service within the larger application.
  • Applications:
    The IBM cloud application, such as CAM or Enterprise Marketplace, associated with the technical service.
  • Test type:
    The type of Test.
  • Failed:
    The total number and percentage of Tests in the Failed category or group.
  • Skipped:
    The total number and percentage of Tests in the Skipped category or group.
  • Passed:
    The total number and percentage of Tests in the Passed category or group.
  • Total:
    The total number of Tests.
  • Bugs:
    The total number of bugs. (Hidden by default)
  • Code smells:
    The total number of code smells. (Hidden by default)
  • Coverage:
    The type of coverage. (Hidden by default)
  • Release:
    Release that is associated with the executed tests.
  • Environment:
    The instance of the application where tests were executed.
  • Duration:
    The total time the test took to execute.
  • Execution date:
    The date the Test was executed.
  • Tool engine:
    The Test tool source.
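As a small worked example of the count-and-percentage values shown in the Failed, Skipped, and Passed columns above, the following sketch derives the percentages from raw per-service counts; the numbers are made-up sample data.

```python
# Illustrative only: compute the "count (percentage)" values shown in the
# Failed, Skipped, and Passed columns from raw per-service totals.
# The counts below are made-up sample data.
failed, skipped, passed = 12, 3, 185
total = failed + skipped + passed

for label, count in (("Failed", failed), ("Skipped", skipped), ("Passed", passed)):
    print(f"{label}: {count} ({count / total:.1%})")
print(f"Total: {total}")
```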
The Technical service test table displays all data regardless of the time frame selected. All columns in this table can be sorted. Above this table, you will find a search box that allows searching technical services by name and a Settings icon that allows changing the table settings to show or hide pre-selected columns.
The table view also supports detailed views for each service. When you click the details view for a specific project, DevOps Intelligence displays the project’s performance along with a historical view of scan results and scan duration. Tracking scan durations over time enables a development manager to assess the efficiency of the scanning process.
DevOps Intelligence provides historical trends for individual projects, and analyzing the trends of scan test results in both unit testing and functional testing offers a comprehensive view of the software’s health and the development process. High and consistent pass rates in unit tests indicate stable and reliable code at a granular level. Consistent passing of functional tests ensures that the application complies with industry standards and regulations.
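As a rough illustration of tracking such trends, the following sketch computes the pass rate across a series of historical scan results; the dates and counts are made-up sample data, not values from DevOps Intelligence.

```python
# Illustrative only: track the test pass rate across historical scan results
# to see whether it stays high and consistent, as discussed above.
# The scan records are made-up sample data.
scans = [
    {"date": "2024-10-01", "passed": 180, "total": 200},
    {"date": "2024-10-15", "passed": 192, "total": 205},
    {"date": "2024-11-01", "passed": 198, "total": 210},
]

for scan in scans:
    rate = scan["passed"] / scan["total"]
    print(f"{scan['date']}: pass rate {rate:.1%}")
```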
To access details for a specific technical service, select the overflow menu to the far right of the table and select View details. For more information, see The dashboard table view details.