Since automated testware lies at the core of a test automation strategy, the testware can be extended to record information about its own usage. When abstraction is combined with structured testware, any extension to the underlying testware becomes available to all higher-level automated test scripts. For example, an extension of the underlying testware that records the start and end time of execution for one test may well apply to all tests.
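As a minimal sketch of this idea, the wrapper below (all names are illustrative, not from any specific tool) adds start/end timestamps in one place in the underlying testware, so every test that runs through it is timed automatically:

```python
import time

# Hypothetical sketch: a single extension point (run_test) in the underlying
# testware. Because all higher-level scripts call it, the timing extension
# applies to every test without changing the tests themselves.
def run_test(name, test_fn):
    start = time.time()
    try:
        test_fn()
        status = "passed"
    except AssertionError:
        status = "failed"
    end = time.time()
    return {"test": name, "status": status,
            "start": start, "end": end, "duration": end - start}

result = run_test("login_works", lambda: None)  # trivial passing test body
```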
Automation features to support measurement and reporting
The scripting languages of many test tools support measurement and reporting through functions that can be used to record and log information before, during, and after the execution of individual tests, test groups, and an entire test suite.
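A hedged sketch of such logging hooks, using only Python's standard logging module (the suite/group structure is an assumption for illustration):

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("suite")

# Illustrative sketch: log information before and after execution at the
# suite, group, and individual-test level.
def run_suite(groups):
    log.info("suite started")
    results = []
    for group_name, tests in groups.items():
        log.info("group %s started", group_name)
        for test_name, fn in tests:
            log.info("test %s started", test_name)
            try:
                fn()
                results.append((test_name, "passed"))
            except AssertionError:
                results.append((test_name, "failed"))
            log.info("test %s finished", test_name)
        log.info("group %s finished", group_name)
    log.info("suite finished")
    return results

results = run_suite({"smoke": [("always_ok", lambda: None)]})
```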
Reporting on each individual test run should include an analysis capability that considers the results of previous test runs in order to show trends (e.g., changes in the test success rate).
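For example, a success-rate trend across historical runs can be computed as follows (the history data is illustrative):

```python
# Hypothetical sketch: compare the latest run's success rate against
# earlier runs to surface a trend for the report.
def success_rate(run):
    return sum(1 for status in run if status == "passed") / len(run)

history = [
    ["passed", "failed", "passed", "passed"],   # run 1: 75 %
    ["passed", "passed", "passed", "failed"],   # run 2: 75 %
    ["passed", "passed", "passed", "passed"],   # run 3: 100 %
]
rates = [success_rate(run) for run in history]
trend = rates[-1] - rates[0]  # positive value means the success rate improved
```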
Test automation typically requires automating both test execution and test result verification, the latter being accomplished by comparing selected elements of the actual test result to a predefined expected result. This comparison is generally best performed by a testing tool. The information reported as the result of this comparison must be considered: it is important that the status of the test is determined correctly (e.g., passed, failed). In case of a failed status, further information about the cause of the failure is required (e.g., screenshots).
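The comparison and status determination can be sketched as below; the field names and artifact paths are assumptions for illustration, not from any real tool:

```python
# Illustrative sketch: determine test status by comparing selected elements
# of the actual result to the expected result, and attach failure details.
def verify(expected, actual, artifacts=None):
    diffs = {key: (expected[key], actual.get(key))
             for key in expected if actual.get(key) != expected[key]}
    if not diffs:
        return {"status": "passed"}
    return {"status": "failed",
            "differences": diffs,
            # e.g. paths to screenshots captured at the point of failure
            "artifacts": artifacts or []}

outcome = verify({"code": 200, "body": "ok"},
                 {"code": 500, "body": "ok"},
                 artifacts=["screenshots/login_500.png"])
```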
Identifying the expected differences between the actual and expected results of a test is not always trivial, but tool support can be very helpful in defining comparisons that ignore expected differences (e.g., dates and times) while highlighting unexpected ones.
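A minimal sketch of such a comparison, assuming simple regular-expression masks for dates and times (the mask set is illustrative):

```python
import re

# Hypothetical sketch: mask expected differences (dates, times) in both
# texts before comparing, so only unexpected differences cause a mismatch.
MASKS = [
    (re.compile(r"\d{4}-\d{2}-\d{2}"), "<DATE>"),
    (re.compile(r"\d{2}:\d{2}:\d{2}"), "<TIME>"),
]

def normalize(text):
    for pattern, placeholder in MASKS:
        text = pattern.sub(placeholder, text)
    return text

def compare_ignoring_timestamps(expected, actual):
    return normalize(expected) == normalize(actual)

same = compare_ignoring_timestamps(
    "Report generated 2023-05-01 at 09:15:00, total: 42",
    "Report generated 2024-11-30 at 17:45:10, total: 42")
```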
Integration with other third-party tools (spreadsheets, XML, documents, databases, reporting tools, etc.)
When information from the execution of automated test cases is used in other tools (e.g., for traceability and reporting, such as updating a traceability matrix), the information can be provided in a format appropriate for those third-party tools. This is often achieved through the existing features of the testing tools (e.g., export formats for reports) or by creating customized reports in a format compatible with other programs (".xls" for Excel, ".doc" for Word, ".html" for the web, etc.).
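As a sketch of a customized report export, the snippet below writes the same results as CSV (for spreadsheet or traceability tools) and as a minimal HTML table; the result data is illustrative:

```python
import csv
import io

# Illustrative sketch: export test results in formats third-party tools can
# consume. Result data is made up for the example.
results = [("TC-01", "passed"), ("TC-02", "failed")]

# CSV, e.g. for import into a spreadsheet or a traceability matrix
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["test_id", "status"])
writer.writerows(results)
csv_report = buf.getvalue()

# Minimal HTML table for a web report
rows = "".join(f"<tr><td>{tid}</td><td>{status}</td></tr>"
               for tid, status in results)
html_report = f"<table><tr><th>Test</th><th>Status</th></tr>{rows}</table>"
```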
Visualization of results (dashboards, charts, graphs, etc.)
Test results should be visualized in charts. Consider using traffic-light colors to indicate the progress of test execution/automation so that decisions can be made based on the reported information. Management is particularly interested in visual summaries that show the test result at a glance; if more information is needed, they can still dive into the details.
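A traffic-light indicator for a dashboard can be derived from the pass rate of the latest run; the thresholds below are illustrative assumptions, not a standard:

```python
# Hypothetical sketch: map a run's pass rate to a traffic-light color for an
# at-a-glance dashboard summary. Thresholds are illustrative only.
def traffic_light(passed, total):
    rate = passed / total
    if rate >= 0.95:
        return "green"
    if rate >= 0.80:
        return "amber"
    return "red"

light = traffic_light(passed=17, total=20)  # 85 % pass rate
```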