This documentation is no longer maintained and may contain obsolete information. You should instead refer to Application onboarding.
Introduction
This step should always be executed as part of the first-time analysis of an application performed during onboarding with CAST AIP. When re-analyzing an application, this step is optional: it is only required when the differences between subsequent source code deliveries highlight significant changes that call for a partial or full re-onboarding of the application.
It is nevertheless good practice to routinely re-validate the configuration of an application. The frequency of such a quality check depends on the frequency of re-analysis and may also be triggered by key events linked to the IT configuration change management process (e.g. refactoring, consolidation).
Once the analysis is complete, the generated log files need to be reviewed to verify that no error or warning that may impact the quality of the results has been reported. For each technology supported by CAST, the log manager produces a set of statistics that can be used to verify the analysis and compare it with the previous analysis. In addition, Dynamic Links need to be verified, as does the display of results in the CAST Engineering Dashboard; for both, supporting tools are available to assist the final technical validation.
Review the Analysis Log and resolve or understand reported issues
To review the Analysis Log, navigate to each analysis execution item and then click the log link:
Review the most common warnings/errors as described in the table below:
| Technology | References |
|---|---|
| C# and VB.NET | .NET - Analysis messages |
| J2EE | JEE - Analysis messages |
| Mainframe | Mainframe - Analysis messages |
| C / C++ / Pro*C | C and Cpp - Analysis messages |
| SAP ABAP | SAP ABAP - Analysis messages |
In addition, you can view all Analysis logs using the following option in the CAST Management Studio, Application editor:
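For applications that generate a large number of log files, a small script can speed up this review by tallying warnings and errors before you inspect them individually. The sketch below is a minimal example, assuming the logs have been exported as plain-text .log files under a local folder and that each message line contains a severity keyword such as "Warning" or "Error"; the folder path and keywords are assumptions, not fixed CAST conventions.

```python
# Minimal sketch: tally warning/error messages across exported analysis log files.
# Assumptions: logs are plain-text *.log files under LOG_DIR and the severity
# appears as a keyword in the message line; adjust to your actual log format.
from collections import Counter
from pathlib import Path

LOG_DIR = Path(r"C:\CAST\Logs")  # hypothetical export location


def tally_messages(log_dir: Path) -> Counter:
    counts = Counter()
    for log_file in log_dir.glob("**/*.log"):
        for line in log_file.read_text(errors="ignore").splitlines():
            lowered = line.lower()
            if "error" in lowered:
                counts[(log_file.name, "error")] += 1
            elif "warning" in lowered:
                counts[(log_file.name, "warning")] += 1
    return counts


if __name__ == "__main__":
    for (name, severity), count in sorted(tally_messages(LOG_DIR).items()):
        print(f"{name}: {count} {severity}(s)")
```

Any log file showing an unexpected spike in errors or warnings should then be opened and reviewed in full against the message references above.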
Validate analysis boundary
The objective is to ensure that what has been analyzed and mapped into the Analysis Services as objects and links is consistent with the intent and scope of the analysis configuration.
For the purpose of this validation, the CAST Admin should try to generate a preliminary CAST Engineering Dashboard (by taking a snapshot) as soon as all log issues are resolved. A few cursory checks can help quickly pinpoint issues with the source code discovery tool or other configuration issues. The CAST Admin should, as a minimum, perform the following comparisons:
- Files by technology and lines of code (LOC) delivered vs. files by technology and LOC analyzed (see the sketch after this list)
- Use the Execution Unit report to check the number of artifacts produced vs. those delivered and/or expected
- Perform a cursory check of the CAST Engineering Dashboard results - you can do so by launching a temporary CAST Engineering Dashboard from the Application editor > Execute tab:
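A simple script can line up the delivered inventory against the analyzed figures for the first comparison in the list above. The sketch below assumes both inventories have been exported to CSV files with technology, files and loc columns; the file names and column names are illustrative only.

```python
# Minimal sketch: compare delivered vs analyzed file counts and LOC per technology.
# The CSV file names and column names are assumptions for illustration only.
import csv


def load_inventory(path: str) -> dict:
    """Return {technology: (file_count, loc)} from a CSV export."""
    with open(path, newline="") as f:
        return {row["technology"]: (int(row["files"]), int(row["loc"]))
                for row in csv.DictReader(f)}


delivered = load_inventory("delivered_inventory.csv")
analyzed = load_inventory("analyzed_inventory.csv")

for tech in sorted(set(delivered) | set(analyzed)):
    d_files, d_loc = delivered.get(tech, (0, 0))
    a_files, a_loc = analyzed.get(tech, (0, 0))
    if (d_files, d_loc) != (a_files, a_loc):
        print(f"{tech}: delivered {d_files} files / {d_loc} LOC, "
              f"analyzed {a_files} files / {a_loc} LOC")
```

Significant gaps between the two inventories usually point to missing delivery content, files excluded by the discovery step, or an incorrect Analysis Unit configuration.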
Viewing Analysis Unit content
To inspect the results of the analysis for each Analysis Unit, you can view the content of the Analysis Unit - see figure below:
This displays the following data, which you can compare with the information gathered during the application qualification and delivery validation steps:
Viewing Execution Unit content
To view the Execution Unit content, right click the Application name > Execute > Analysis > View execution unit:
This will open an execution report displaying the lines of code recorded in the Analysis Service for each Analysis Unit. Compare this with the LOC count from tools such as SCLOC to validate the analysis configuration:
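If the cross-check is scripted, an external line counter can be run against the delivered source and compared with the figures exported from the execution report. The sketch below is one way to do this, assuming the report has been exported to a CSV file with analysis_unit and loc columns, that each Analysis Unit maps to a known delivery folder, and that the cloc utility is used in place of SCLOC; all of these are assumptions for illustration.

```python
# Minimal sketch: compare per-Analysis-Unit LOC from an exported execution report
# with a cloc count of the corresponding delivered source folders.
# The CSV layout, folder mapping and use of cloc are assumptions for illustration.
import csv
import json
import subprocess

# Hypothetical mapping of Analysis Units to their delivered source folders.
UNIT_FOLDERS = {
    "MyApp-JEE": r"C:\Delivery\MyApp\java",
    "MyApp-SQL": r"C:\Delivery\MyApp\sql",
}


def cloc_total(folder: str) -> int:
    """Return the total code line count reported by cloc for a folder."""
    result = subprocess.run(["cloc", "--json", folder],
                            capture_output=True, text=True, check=True)
    return json.loads(result.stdout)["SUM"]["code"]


with open("execution_unit_report.csv", newline="") as f:
    analyzed = {row["analysis_unit"]: int(row["loc"]) for row in csv.DictReader(f)}

for unit, folder in UNIT_FOLDERS.items():
    print(f"{unit}: delivered {cloc_total(folder)} LOC, "
          f"analyzed {analyzed.get(unit, 0)} LOC")
```

Note that different counting tools apply different rules (comments, blank lines, generated code), so small differences are expected; only large discrepancies need to be investigated.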
Validating the quality of an analysis
The analysis can be validated in a variety of ways:
- By reviewing the log files generated by the different tools invoked during the analysis
- By checking if expected objects and links have been correctly saved - this can be achieved using CAST Enlighten
Various tools and several SQL queries have been developed and are used by the CAST Admin to check the content of the Analysis Service. A few notable tools that can simplify this task are listed below; a minimal example of a SQL-based check is sketched after the list:
- SCCount — provides a detailed inventory of the source code delivered for analysis and partial discovery of frameworks
- Inventory and run-time reporting tool - this tool may be suitable for large AI Centers. It reverse engineers the content of an Analysis Service and provides details about number of jobs, run time, etc.
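As an illustration of the SQL-based checks mentioned above, the sketch below counts objects per type directly in the Analysis Service. It assumes the schema is hosted on a CAST Storage Service (PostgreSQL-compatible) reachable with psycopg2; the connection settings and the table and column names in the query are placeholders and must be replaced with the actual Analysis Service schema objects for your version of CAST AIP.

```python
# Minimal sketch: count objects per type in the Analysis Service schema.
# Connection settings and the table/column names in QUERY are placeholders;
# substitute the actual schema objects for your CAST AIP version.
import psycopg2

conn = psycopg2.connect(host="localhost", port=2280, dbname="postgres",
                        user="operator", password="CastAIP")  # placeholder settings

QUERY = """
    SELECT object_type, COUNT(*)          -- placeholder column
    FROM my_application_local.objects     -- placeholder schema and table
    GROUP BY object_type
    ORDER BY COUNT(*) DESC
"""

with conn, conn.cursor() as cur:
    cur.execute(QUERY)
    for object_type, count in cur.fetchall():
        print(f"{object_type}: {count}")
conn.close()
```

Comparing such counts between two consecutive analyses is a quick way to spot object types that have unexpectedly appeared or disappeared.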
Validate module configuration
- Navigate to the Modules tab in the Application editor (CAST Management Studio) and open each Module that is listed:
Review the content of the Module and any filters, checking that:
- results in the Analysis Service are up to date
- the tables and views used in any Explicit content filters are up to date (e.g. if using the CAST System Views (CSV) in the query, ensure they are up to date - remember that the CSV are only updated automatically when you generate a snapshot; otherwise you can update them manually using the Update CSV tool)