
Parasoft Static Analysis Jobs are Inconsistent

choner22 Posts: 15

Hello,

I am running Static Analysis across an entire workspace's worth of projects, which has some key characteristics:

Eclipse: Kepler
Parasoft: 10.3.4
Number of Projects: ~160
Number of Files: ~1000 .c/.cpp and ~1000 .h
Configuration Properties:
1. No limit is set on the number of violations per rule
2. The "Static/Flow Analysis Advanced Settings/Compact Incremental Caches" checkbox is checked
3. Depth of analysis: Standard
4. Timeouts are off
5. Swapping of analysis data to disk is enabled

The following are facts about the configuration of my projects:

  1. Every project is created from an identical base project and is capable of having Parasoft run across it
  2. Every file exists in only one project
  3. Files are virtually linked into each project
  4. Runs are started from the Eclipse GUI by selecting all projects, right-clicking, and choosing the test configuration to run
  5. Every run is initiated identically
  6. I have a script that has been verified to accurately distill the information in the report.xml generated by each static analysis run into a deterministic, easily read text file (a simplified sketch of what it does is shown after this list)
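
For reference, here is a minimal sketch of the distill step, assuming each violation in report.xml carries rule, file, line, and message attributes. The element and attribute names used here (e.g. "rule", "locFile", "msg") are illustrative assumptions, not the exact Parasoft report schema; the point is that every violation is flattened to one line and sorted, so two runs can be compared with a plain text diff.

```python
#!/usr/bin/env python3
"""Distill a report.xml into a deterministic, diff-friendly text file.

Assumption: violations are XML elements with 'rule', 'locFile' (or
'file'), 'locStartln' (or 'line'), and 'msg' attributes. Adjust these
names to match the actual report schema if they differ.
"""
import sys
import xml.etree.ElementTree as ET


def distill(report_path, out_path):
    root = ET.parse(report_path).getroot()
    violations = []
    # Walk every element and keep anything that looks like a violation.
    for elem in root.iter():
        rule = elem.get("rule")
        path = elem.get("locFile") or elem.get("file")
        line = elem.get("locStartln") or elem.get("line") or "0"
        msg = elem.get("msg", "")
        if rule and path:
            violations.append(f"{path}:{line}:{rule}:{msg}")
    # Sorting makes the output deterministic regardless of report order.
    violations.sort()
    with open(out_path, "w", encoding="utf-8") as out:
        out.write("\n".join(violations) + "\n")


if __name__ == "__main__":
    distill(sys.argv[1], sys.argv[2])
```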

When I run static analysis using the configurations/processes listed above, I see inconsistency in the static analysis results.

When I perform 3 identical runs consecutively on the same workspace, on the same computer, with the same test configuration, and then compare the results, I find inconsistencies: each run reports violations that the other runs do not (the comparison is essentially the set difference sketched below). Shouldn't these results be consistent? There are certain violations we know exist and should be caught by the tool, yet they do not appear on every run.
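
For illustration, a minimal sketch of that comparison, assuming two distilled files with one violation per line as produced by the script above (the file names run_a.txt and run_b.txt are hypothetical):

```python
#!/usr/bin/env python3
"""Compare two distilled violation lists and report what differs."""


def load(path):
    # One violation per line; ignore blank lines.
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}


run_a = load("run_a.txt")
run_b = load("run_b.txt")

# Violations present in one run but not the other indicate non-determinism.
only_a = sorted(run_a - run_b)
only_b = sorted(run_b - run_a)

print(f"Only in run A ({len(only_a)}):")
print("\n".join(only_a))
print(f"Only in run B ({len(only_b)}):")
print("\n".join(only_b))
```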

Is there any reason for this, or something I may have missed during these runs, that would explain the differences in results?

Comments

  • Andrey Madan Posts: 388 ✭✭✭

    Do you see any setup problems in the report? Is the time of analysis different? I suggest collecting the logs for two separate runs where results are different and submitting a support ticket.

  • choner22 Posts: 15

    There are no setup problems in the report; everything appears to process normally. Because of the scope of the run, it takes anywhere from 1.5 to 2.5 hours, so it is hard to accurately measure a difference in run time. I will submit a support ticket.