This topic describes how to view SOAtest results. In this section:

Filtering Results

By default, the Quality Tasks view shows cumulative results for all tested resources. For example, if you imported results from Team Server, then ran two tests from the GUI, the Quality Tasks view would show all imported tasks, plus all results from the subsequent two tests.

If you prefer to see only results from the last test session or from selected resources, you can filter results.

  1. Click the filters button in the Quality Tasks view’s toolbar.


  2. Set the desired filter options in the dialog that opens.

Matching Tasks to the Test Configuration

The Quality Tasks view links directly to the Test Configuration that produced the task. You can right-click a task and choose View Test Configuration to review the configuration; it opens directly to the Test Configuration controls related to the task. For instance, if a static analysis task is selected, the Static tab opens and the corresponding rule is highlighted.

This enables you to review or modify the Test Configuration settings, as well as verify which Test Configuration was used when results are imported from a server execution.

Customizing the Results Display

There are several ways to customize the results display to suit your preferences and needs when reviewing quality tasks and/or peer code review tasks.

Changing the Display Format and Contents

You can change the Quality Tasks view’s format and contents in the following ways:

Selecting Layout Templates

There are several available layout templates:

  • SOAtest Change Impact Layout: For reviewing tests that may have been affected by changes to the application under test.
  • SOAtest Default Layout: For functional testing.
  • SOAtest Static Analysis Layout: For running static analysis against source code (e.g., from the Scanning perspective).
  • SOAtest Static Analysis for Functional Tests Layout: For running static analysis by executing a test suite (e.g., a test suite that contains a Browser Testing tool or a Scanning tool).
  • Virtualize Change Impact Layout: This template is available if Virtualize is also installed. It shows tests that may have been affected by changes to the application under test.
  • Virtualize Default Layout: This template is available if Virtualize is also installed. For service virtualization.

To select the layout best suited to your current goal:

  1. Open the pull-down menu on the top right of the Quality Tasks view.

     

  2. Choose one of the available formats from the Show shortcut menu that opens.

Customizing Layout Templates

To customize one of these preconfigured layouts:

  1. Open the pull-down menu on the top right of the Quality Tasks view.
  2. Choose Configure Contents.
  3. In the dialog that opens, specify how you want that layout configured. Note that Comment shows the comments that were entered upon source control commit.

Adding New Layout Templates

To add a new layout template:

  1. Open the pull-down menu on the top right of the Quality Tasks view.
  2. Choose Configure Contents.
  3. Click the New button on the bottom left of the dialog that opens.
  4. Select (and rename) the added template, then specify how you want that layout configured. Note that Comments shows the comments that were entered upon source control commit.

Changing Categories from the Quality Tasks View

To reorder, hide, or remove categories directly from the Quality Tasks view:

  1. Right-click an item in the Quality Tasks view.
  2. Choose from the available Layout menu options.

Clearing Messages

You might want to clear messages from the Quality Tasks view to help you focus on the findings that you are most interested in. For example, if you are fixing reported errors, you might want to clear each error message as you fix the related error. That way, the Quality Tasks view only displays the error messages for the errors that still need to be fixed.

Messages that you clear will only be removed temporarily. If the same findings occur during subsequent tests, the messages will be reported again.

You can clear individual messages, categories of messages represented in the Quality Tasks view, or all reported messages.

Clearing Selected Messages

To clear selected messages shown in the Quality Tasks view:

  1. Select the message(s) or category of messages you want to delete. You can select multiple messages using Shift + left click or Ctrl + left click.
  2. Right-click the message(s) you want to delete, then choose Delete.

The selected messages will be removed from the Quality Tasks view.

Clearing All Messages

To clear all messages found:

  • Click the Delete All icon at the top of the Quality Tasks view.

Generating Reports

This topic explains how to generate HTML, PDF, or custom XSL reports for tests that you run from the GUI or command line.

Sections include:

Understanding Report Categories and Contents

Report categories and contents vary from product to product. For details on the reports generated by a specific Parasoft product, see that product’s user’s guide.

Troubleshooting Issues with Eclipse Report Display

A known bug in older versions of Eclipse may result in a crash when displaying reports. The following workarounds may help you overcome this issue:

  • Update to the latest version of Eclipse.
  • Install a fixed XULRunner plugin into Eclipse.
  • Use an EPF (Eclipse Preferences File) to configure Eclipse to use an external browser. You need to change the "browser-choice" option, for example: /instance/org.eclipse.ui.browser/browser-choice=1 (a minimal example file is sketched after this list).
  • If the problem occurs only at the end of the testing process, disable the Open in browser option in the Report and Publish dialog.
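
If you use the EPF approach, the imported file only needs to contain the preference you want to change. The following minimal sketch assumes the file_export_version header that Eclipse typically writes when exporting preferences; verify it against a file exported from your own installation. Per the workaround above, browser-choice=1 selects the external browser.

  file_export_version=3.0
  /instance/org.eclipse.ui.browser/browser-choice=1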

From the GUI

Generating the Report

To generate a report immediately after a test completes:

  1. After the test has completed, click the Generate Report button that is available in the Test Progress panel’s toolbar.



  2. Complete the Report dialog that opens. The Report dialog allows you to specify:
    • Preferences: Report preferences (by clicking the Preferences button and specifying settings as explained in Configuring Report Settings).

    • Options file: Any localsettings/options file that specifies reporting settings you want to use. These will override settings specified in the GUI’s Preferences panel. For details on configuring reports through localsettings, see Configuring Localsettings. A sample file is sketched after this procedure.

    • Report location: The location of the report file. The default is <USER>\AppData\Local\Temp\parasoft\reports (Windows) or <USER>/parasoft/reports (macOS).
    • Open in browser: Whether the file is automatically opened in a browser.
    • Delete on exit: Whether the report is deleted upon exit.
    • Generate reports: Whether a report should be created.
    • Publish reports: Whether the report should be uploaded to the Team Server (Server Edition only; requires Team Server).
    • Publish code reviews: Whether code review tasks/results should be uploaded to the Team Server (any edition; requires Team Server).
  3. Click OK to open the report. 
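
For reference, a localsettings/options file is a plain properties file. The sketch below is illustrative only: report.format and session.tag are common Parasoft reporting settings, but confirm the exact keys supported by your version in Configuring Localsettings.

  # Illustrative reporting settings; verify key names in Configuring Localsettings
  report.format=html
  session.tag=nightly-regression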

Uploading the Report to Team Server

To upload the report to Team Server (Server Edition only):

  • Follow the above procedure, but be sure to enable Publish reports before clicking OK.

How do I aggregate or separate results from multiple test runs?

Team Server uses the following criteria to identify unique reports:

  • Host name
  • User name
  • Session tag
  • Day (each day, only the last test run is used in the trend graph)

If your team performs multiple CLI runs per day and you want all reports included on Team Server, you need to use a different session tag for each run. You can do this in the Test Configuration’s Common tab (using the Override Session Tag option).
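
If your version also accepts the session tag through localsettings (check Configuring Localsettings), one approach for CLI runs is to give each run its own session.tag value, for example:

  # morning.properties
  session.tag=smoke-tests

  # evening.properties
  session.tag=full-regression

Pass the appropriate file to each run using the localsettings option described in Testing from the Command Line Interface - soatestcli.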


From the Command Line

To generate a report of command line test results, use the -report %REPORT_FILE% option with your soatestcli command. To upload the report to Team Server, also use the -publish option. See Testing from the Command Line Interface - soatestcli for details.
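
A representative invocation is sketched below. The workspace path, Test Configuration name, and resource are placeholders, and flags other than -report and -publish follow common soatestcli usage; consult Testing from the Command Line Interface - soatestcli for the options supported by your installation.

  soatestcli -data /path/to/workspace -config "user://Example Configuration" -resource TestAssets -report reports/report.xml -publish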

Configuring Detailed Reports

If you want your report to show details about every test that ran, including tests that did not fail, configure reporting preferences with the Only tests that failed and Only top-level test suites options disabled.

Reviewing SOAtest Results

In the Quality Tasks view, SOAtest results are presented as a task list that helps you determine how to proceed in order to ensure the quality of your system.

Functional Testing

Functional test results are organized by test suite. See Reviewing Functional Test Results for details.

Static Analysis

Static analysis results should be reviewed in one of the layouts designed especially for static analysis. For details on enabling these layouts and reviewing static analysis results, see Reviewing Static Analysis Results.
