This page will help you solve performance issues that occur during the analysis of the application source code. During this step, the analyzer parses the files and stores the parse results in memory.
Observed in CAST AIP
Observed on RDBMS: Microsoft SQL Server
Step by Step Scenario
Package and deliver the source code.
Set the delivered version as the current version of the application.
Run the analysis.
Open the log file: the parsing step takes a very long time between two lines, or does not advance at all.
Check if performance issue is:
During parsing of web descriptor files, application descriptor files, environment profiles, or property files
Check if all the files present in the source code folder are actually used by the relevant analysis unit. If not, then clean up the unused files by excluding them from the source code delivery.
Check for an RDBMS server performance issue: connect to the RDBMS server and check whether memory consumption is high and CPU usage is close to 100%. If so, upgrade your hardware according to the requirements in Deployment - sizing.
If performance issue occurs for a limited amount of files, then as a temporary fix, these files can be excluded from the analysis. This may have an impact on the analyzed objects and/or links created, so this solution should be evaluated for each case.
During parsing of application source files
Parsing of source files has two phases for most analyzers. The first phase (light parsing) normally takes a few seconds, so any parsing time over 15 seconds should be considered a performance issue. The second phase may take a few minutes per file, so any parsing time over 2 minutes should be considered a performance issue. Verify in the analysis log whether the performance issue occurs while parsing a specific subset of source code files or while parsing all source code files.
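Gaps between consecutive log entries can be spotted programmatically rather than by eye. The sketch below is a minimal helper assuming a "YYYY-MM-DD HH:MM:SS" timestamp at the start of each log line; the actual analysis log format may differ, so adjust the regular expression accordingly.

```python
import re
from datetime import datetime

# Assumed timestamp format at the start of each log line -- adjust the
# pattern to match the real analysis log.
TS = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})")

def find_slow_steps(lines, threshold_seconds=120):
    """Return (gap_seconds, previous_line) pairs exceeding the threshold."""
    slow, prev_time, prev_line = [], None, None
    for line in lines:
        m = TS.match(line)
        if not m:
            continue
        t = datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S")
        if prev_time is not None:
            gap = (t - prev_time).total_seconds()
            if gap > threshold_seconds:
                slow.append((gap, prev_line))
        prev_time, prev_line = t, line
    return slow

log = [
    "2024-01-01 10:00:00 Parsing fileA.java",
    "2024-01-01 10:00:05 Parsing fileB.java",
    "2024-01-01 10:10:05 Parsing fileC.java",  # 10-minute gap after fileB
]
print(find_slow_steps(log))
```

Each reported pair points at the log line after which the long gap occurred, i.e. the file whose parsing took unusually long.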
If performance is caused by a limited number of source code files:
Check if the source code files are empty (0 KB) and exclude them from the analysis.
Check if the performance issue is caused by source files larger than 10 MB, and evaluate the impact on the analysis of excluding these files.
If the analysis is stuck, the issue can be caused by the last analyzed file (for which there is a parsing message) or by the currently analyzed file (for which there is no parsing message yet):
Remove the last analyzed source file from the source code and rerun the analysis.
If the analysis continues past the previously second-to-last analyzed source file, then you have found the culprit file and can exclude it from the analysis. You may have to repeat the previous step for another file.
If the analysis does not advance past the previously second-to-last analyzed source file, then there is an issue with a source file for which there is no information in the analysis log. In this case you will have to identify the culprit file(s) by dichotomy (binary search over the file set).
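The dichotomy approach can be sketched as a binary search: split the candidate files in half, rerun the analysis on each half, and keep the half that still reproduces the hang until one file remains. In the sketch below, `analysis_hangs` is a hypothetical callback standing in for a manual analysis run on a subset; it assumes a single culprit file.

```python
# Binary search for a single culprit file. `analysis_hangs(subset)` is a
# placeholder for "deliver only these files and rerun the analysis";
# returns True if the hang reproduces.
def find_culprit(files, analysis_hangs):
    candidates = list(files)
    while len(candidates) > 1:
        half = len(candidates) // 2
        first = candidates[:half]
        # If the first half reproduces the hang, keep it; otherwise the
        # culprit must be in the second half.
        candidates = first if analysis_hangs(first) else candidates[half:]
    return candidates[0]

files = ["a.sql", "b.sql", "bad.sql", "d.sql"]
print(find_culprit(files, lambda subset: "bad.sql" in subset))
```

With n files this needs about log2(n) reruns instead of n, which matters when each analysis run is long.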
For specific technologies
Check for any CR/LF sequences present in the source files. During the parsing step, embedded CR or LF sequences are interpreted by the analyzer as new lines. This can affect the analysis run time and lead to performance issues.
In Notepad++, set the search path to the source code path, use the 'Extended' search mode, and search for \r and \n.
If such a pattern exists, matches will appear in the search results window.
Replace all occurrences of \r\n with a blank (no character).
Check whether the tabulation length in the source code file differs from 8 characters; this can cause the analysis to get stuck.
Open the source code file where the analysis got stuck in Notepad++ and verify the tabulation length of all lines.
Check if the analysis is getting stuck during the processing of a source code file.
Check if the processed file is a non-shell file that was renamed to a shell file by changing its extension, for example an XML file renamed to "filename.xml.ksh". If so, exclude it from the analysis.
Check if the processed file contains garbage characters, i.e. any special character that cannot be typed on a keyboard. Such files cannot be processed by the pre-processor and should be removed or excluded from the analysis.
If performance is caused while parsing all source code files:
Check the log file size. For log files larger than 1 GB, analysis performance can degrade, since opening and writing the log can take considerable time. A huge log indicates a problem in the analysis, since the log may contain many warning messages. In that case, go to CMS Snapshot Analysis - Run Analyzer - Warnings and resolve the warnings.
If the Deployment folder is on a remote server, verify the local network speed by copying a large file (1 GB) from the remote location to the analysis server and comparing the transfer speed to your theoretical LAN speed. As a solution, you can set the Deployment folder to a local folder or improve the network infrastructure.
Check the Analysis Server Performance
Check the memory consumption of the AnaRun process. If the AnaRun memory consumption is 4 GB and there is only 8 GB of RAM in the machine, this can explain the performance issue.
Check if the CPU usage has reached 100%.
If yes, other processes are currently running and consuming significant resources. Stop the unnecessary processes.
Check if the issue is due to the Inference Engine. Enabling the Inference Engine is an option that can be set for certain technologies, as described in the CMS - Technology editors pages.
Check whether the Inference Engine is enabled: in the log file, search for "Use Inference Engine : 1". If the option is set to 1, then the Inference Engine (I.E.) is used.
Check if the issue is linked to the Inference Engine.
As a test, disable the Inference Engine by un-checking the Inference Engine option in the Production tab of CAST Management Studio.
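The log check above can be scripted. This small helper scans log lines for the flag; the exact log line "Use Inference Engine : 1" is taken from the text above, so adjust the string if your log spells it differently.

```python
# Returns True if the Inference Engine flag is set to 1 in the log,
# False if set to 0, and None if the flag is not found at all.
def inference_engine_enabled(log_lines):
    for line in log_lines:
        if "Use Inference Engine : 1" in line:
            return True
        if "Use Inference Engine : 0" in line:
            return False
    return None

print(inference_engine_enabled(["... Use Inference Engine : 1 ..."]))
```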
During creating links with external objects in 'External Link Processing'
Check if the analysis is stuck while executing a stored procedure.
Open PGAdmin and select Server Status to observe all running queries and their durations. Queries that have been active for a long time are marked in orange. For example, during External Link Processing, a running query that can cause performance issues is CACHE_PROCESSID:
Select the query and stop it by clicking the red 'Terminate Backend' button.
The analysis log file will show the sub-procedure that was being executed and caused the performance issue.
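The same information that PGAdmin's Server Status window shows can be queried directly from its query tool. The statements below, held in Python strings for convenience, target PostgreSQL's standard pg_stat_activity view (the `query` column assumes PostgreSQL 9.2 or later); pg_terminate_backend is the SQL equivalent of the 'Terminate Backend' button.

```python
# List non-idle queries ordered by how long they have been running.
LONG_RUNNING_QUERIES = """
SELECT pid, now() - query_start AS duration, state, query
FROM pg_stat_activity
WHERE state <> 'idle'
ORDER BY duration DESC;
"""

# Equivalent of the 'Terminate Backend' button for a given pid.
# Substitute the pid reported by the query above.
TERMINATE_BACKEND = "SELECT pg_terminate_backend(<pid>);"

print(LONG_RUNNING_QUERIES)
```

Terminating a backend mid-transaction rolls it back, so as with the button, use it only on the confirmed stuck query.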