

Review the Analysis Log and resolve or understand reported issues

To review the Analysis Log, navigate to each analysis execution item and then click the log link:

Review the more common warnings and errors as described in the table below.

Validate analysis boundary

The objective is to ensure that what has been analyzed and mapped into the Analysis Services as objects and links is consistent with the intent and scope of the analysis configuration.

For the purpose of this validation, the CAST Admin should try to generate a preliminary CAST Engineering Dashboard (taking a snapshot) as soon as all log issues are resolved. A few cursory checks can help to easily pinpoint issues with the source code discovery tool, or other configuration issues. The CAST Admin should, as a minimum, perform the following comparisons: 

  • Files by technology and Lines of Code (LOC) delivered vs files by technology and LOC analyzed
  • Use the Execution Unit report to check the number of artifacts produced vs those delivered and/or expected
  • Perform a cursory check of the CAST Engineering Dashboard results - you can do so by launching a temporary CAST Engineering Dashboard from the Application editor > Execute tab:

Remember that checking the CAST Engineering Dashboard results at this stage assumes you have generated a snapshot - if you have chosen to use the Run Analysis Only option as described in 2.2.1. Run the Analysis, then you cannot yet view the results in the CAST Engineering Dashboard.
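The delivered-vs-analyzed comparison above can be sketched as a short script. This is only an illustration: the CSV inventory files, their column names (`technology`, `loc`), and the tolerance threshold are hypothetical placeholders, not a CAST export format - adapt them to whatever inventory reports you actually have on both sides.

```python
# Illustrative sketch: compare a delivered source inventory with the
# analyzed one. File layout and column names are assumptions, not a
# CAST-defined format.
import csv
from collections import defaultdict

def load_inventory(path):
    """Read a per-file inventory CSV with 'technology' and 'loc' columns."""
    totals = defaultdict(lambda: {"files": 0, "loc": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            tech = row["technology"]
            totals[tech]["files"] += 1
            totals[tech]["loc"] += int(row["loc"])
    return totals

def compare(delivered, analyzed, loc_tolerance=0.05):
    """Flag technologies whose analyzed LOC deviates from delivered LOC
    by more than the given tolerance (5% by default)."""
    issues = []
    for tech, d in delivered.items():
        a = analyzed.get(tech, {"files": 0, "loc": 0})
        if d["loc"] and abs(a["loc"] - d["loc"]) / d["loc"] > loc_tolerance:
            issues.append((tech, d["loc"], a["loc"]))
    return issues
```

Any technology flagged by such a comparison is a candidate for a source code discovery or packaging problem and should be investigated before results are published.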

Viewing Analysis Unit content

To inspect the results of the analysis for each Analysis Unit, you can view the content of the Analysis Unit - see figure below:

This displays the following data, which you can compare with the information gathered during the application qualification and delivery validation steps:

Viewing Execution Unit content

To view the Execution Unit content, right click the Application name > Execute > Analysis > View execution unit:

This will open an execution report displaying the lines of code in the Analysis Service for each Analysis Unit. Compare this with the LOC count from tools such as SCLOC to validate the analysis configuration:
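When no dedicated counting tool is at hand, a rough per-extension LOC figure for the delivered source tree can be produced with a few lines of script for this cross-check. Note that this is only a first approximation: real counting tools such as SCLOC differ in how they treat blank and comment lines, and the extension list below is just an example.

```python
# Minimal LOC counter for cross-checking execution report figures.
# Counts non-blank lines only; comment handling is deliberately omitted,
# so expect small deviations from dedicated counting tools.
from pathlib import Path

def count_loc(root, extensions=(".java", ".sql", ".cs")):
    """Return {extension: total non-blank lines} under a source tree."""
    totals = {}
    for path in Path(root).rglob("*"):
        if path.suffix in extensions and path.is_file():
            lines = path.read_text(errors="ignore").splitlines()
            non_blank = sum(1 for line in lines if line.strip())
            totals[path.suffix] = totals.get(path.suffix, 0) + non_blank
    return totals
```

A large gap between such a count and the LOC shown in the execution report usually points at files that were delivered but excluded from (or never discovered by) the analysis configuration.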

Validating the quality of an analysis

The analysis can be validated in a variety of ways:

  • By reviewing the log files generated by the different tools invoked during the analysis
  • By checking if expected objects and links have been correctly saved - this can be achieved using CAST Enlighten

Various tools and SQL queries have been developed and are used by the CAST Admin to check the Analysis Services produced. A few notable tools that can simplify this task include:

  • SCCount - provides a detailed inventory of the source code delivered for analysis and partial discovery of frameworks
  • Inventory and run-time reporting tool - this tool may be suitable for large AI Centers. It reverse engineers the content of an Analysis Service and provides details about the number of jobs, run times, etc.

Validate module configuration 

To validate the Module configuration:
  • Navigate to the Modules tab in the Application editor (CAST Management Studio) and open each Module that is listed:

Review the content of the Module and any filters:

Note that these two options require:
  • up-to-date results in the Analysis Service
  • the tables and views used in any Explicit content filters to be up to date (e.g., if the query uses the CAST System Views (CSV), ensure the CSV are up to date - remember that the CSVs are only updated automatically when you generate a snapshot; otherwise you can update them manually using the Update CSV tool)


Back to: 2.2.2. Validate and fine tune the Analysis
