Summary: During a "normal" re-analysis of an application, this entire step should be skipped unless significant differences in the delivered source code, identified during delivery validation, trigger a new technical qualification of the application. In a re-analysis, the approach is typically reactive: warnings and/or errors reported in the analysis log are what trigger the critical review and fine-tuning of the analysis configuration.
The delivery process via the CAST Delivery Manager Tool streamlines the analysis set-up by suggesting a default analysis configuration when the delivery is validated and accepted. There are therefore two scenarios to consider, summarized in the sketch after the list below:
- If there were no defects (i.e. alerts and/or messages) before the delivery was validated and accepted, then the suggested analysis configuration should be ready to go.
- If there were defects present when the delivery was validated and accepted or a CAST AIP extension that does not include a specific "project discoverer" is being used, then the CAST AI Admin will need to manually build the Analysis Unit and related analysis settings.
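The choice between the two scenarios can be thought of as a simple check on the delivery validation results. The Python sketch below is purely illustrative (CAST Management Studio does not expose this decision as an API); the alerts, messages and discoverer coverage are assumptions you would take from your own review of the delivery:

```python
# Illustrative sketch only -- not a CAST API. The inputs (alerts, messages,
# discoverer coverage) are hypothetical and come from your own review of the
# delivery validation results.

def choose_scenario(alerts, messages, delivered_technologies, discovered_technologies):
    """Return 1 if the suggested configuration can be used as-is, otherwise 2."""
    has_defects = bool(alerts) or bool(messages)
    # Technologies delivered but not matched by any project discoverer will have
    # no Analysis Unit generated automatically.
    uncovered = set(delivered_technologies) - set(discovered_technologies)
    if not has_defects and not uncovered:
        return 1  # Scenario 1: proceed directly to "Run and validate the analysis"
    return 2      # Scenario 2: manually create/adjust Analysis Units

# Example: one technology (PL/SQL) was delivered without a matching project discoverer.
print(choose_scenario(alerts=[], messages=[],
                      delivered_technologies={"JEE", "PL/SQL"},
                      discovered_technologies={"JEE"}))  # -> 2
```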
Scenario 1 - no defects in the delivery
In this scenario, the suggested analysis configuration should be ready to go and you can proceed directly to Run and validate the analysis. If necessary, however, you can check the automatic configuration as follows:
- Review Technology and Dependency settings - Review Analysis Units and the configuration of the application-specific technology stacks.
- Run the analysis in test mode and inspect the analysis log - Running the analysis with the proposed configuration is the best way to confirm the analysis settings. Configuration changes should be considered only to fix errors or warnings reported in the analysis log (a log triage sketch follows this list).
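As a rough illustration of that log triage step, the sketch below scans a plain-text analysis log for lines flagged as errors or warnings. The severity markers and the example log path are assumptions; adjust them to whatever your CAST AIP version actually writes to its log files:

```python
# Minimal sketch of a log triage pass, assuming a plain-text analysis log in which
# severities appear as "ERROR" / "WARNING" tokens. The exact format depends on your
# CAST AIP version -- adjust the markers to match your own log files.
from pathlib import Path

def triage_analysis_log(log_path, markers=("ERROR", "WARNING")):
    """Return the log lines that should drive any fine-tuning of the configuration."""
    findings = {m: [] for m in markers}
    for line in Path(log_path).read_text(encoding="utf-8", errors="replace").splitlines():
        for marker in markers:
            if marker in line:
                findings[marker].append(line.strip())
    return findings

# Hypothetical usage (the log path is an example, not a fixed CAST location):
# findings = triage_analysis_log(r"D:\CASTMS\Logs\MyApp\analysis.log")
# for severity, lines in findings.items():
#     print(severity, len(lines))
```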
Scenario 2 - manual analysis configuration
If there were defects (alerts and/or messages) present when the delivery was validated and accepted, or a CAST AIP extension that does not include a specific "project discoverer" is being used, then the CAST AI Admin may need to manually configure some aspects of the analysis - for example, some Analysis Units may be missing and will have to be created by hand.
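One quick way to spot such gaps is to compare the technologies present in the deployment folder against the technologies already covered by Analysis Units. The sketch below is a hedged illustration only; the folder layout, the extension-to-technology mapping and the coverage list are assumptions for the example, not values read from CAST:

```python
# Hedged sketch: spot source code in the deployment folder that no Analysis Unit
# covers, e.g. when the extension in use has no project discoverer. The mapping,
# folder path and coverage set below are illustrative assumptions.
from pathlib import Path

EXTENSION_TO_TECH = {".sql": "SQL", ".py": "Python", ".cbl": "Cobol"}  # example mapping

def technologies_without_analysis_unit(deploy_root, covered_technologies):
    """List technologies found on disk that none of the existing Analysis Units cover."""
    found = set()
    for path in Path(deploy_root).rglob("*"):
        tech = EXTENSION_TO_TECH.get(path.suffix.lower())
        if tech:
            found.add(tech)
    return sorted(found - set(covered_technologies))

# Hypothetical call (example deployment folder and existing coverage):
# print(technologies_without_analysis_unit(r"D:\CASTMS\Deploy\MyApp", {"SQL"}))
```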
Info: If new problems arise in this step that require a new delivery, the CAST AI Admin can use the CAST Management Studio to reject the delivery, thus resetting the workflow to the source code delivery step.
Manually Create Analysis Units
To create your own Analysis Unit in the CAST Management Studio:
- In the Application editor, click the Current Version tab
- Select the Deployed package which has no corresponding Analysis Unit (1)
- Click the + button to add a new Analysis Unit (2)
- Select the type of Analysis Unit you want to create - this must correspond to source code of your technology type (3) - generally for CAST AIP extensions you should choose the "Add new Universal Analysis Unit" option.
- The selected Analysis Unit editor will then open, enabling you to define the Analysis Unit. For example, here we are creating an Analysis Unit for SQL technologies that will be analyzed using the SQL Analyzer extension:
- In the Source Settings tab (1)
- Tick the Universal Language technology type (2) if this Analysis Unit is a Universal Analyzer Analysis Unit
- Click the Add Source Folder button (3)
- Select the root source folder for your project. By default the CAST Management Studio will select the location of the deployed package in the Delivery folder:
- Finally, make any other configuration changes you require in the Source Settings tab, Analysis tab and Production tab where applicable.
- Your Analysis Unit is now defined; a summary sketch of the resulting settings is shown after this list.
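As a final checklist, the sketch below records the settings entered above in a small, purely illustrative structure; the field names and example values (unit name, technology label, source folder path) are assumptions and do not correspond to any CAST file format:

```python
# Illustrative record of the manually defined Analysis Unit -- useful as a checklist
# before running the analysis. Field names and values are examples only.
from dataclasses import dataclass, field

@dataclass
class UniversalAnalysisUnit:
    name: str
    technology: str                                      # e.g. "Universal Language"
    source_folders: list = field(default_factory=list)   # root folders under the deployed package

my_unit = UniversalAnalysisUnit(
    name="MyApp - SQL",
    technology="Universal Language",
    source_folders=[r"D:\CASTMS\Deploy\MyApp\sql"],       # hypothetical path
)
print(my_unit)
```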
You should now proceed to validate other settings, as shown below:
- Review Technology and Dependency settings - Review Analysis Units and the configuration of the application-specific technology stacks.
- Run the analysis in test mode and inspect the analysis log - Running the analysis with the proposed configuration is the best way to confirm the analysis settings. Configuration changes should be considered only to fix errors or warnings reported in the analysis log.
Finally, you can proceed directly to Run and validate the analysis.