Purpose (problem description)
This page explains how to change the Inference Engine parameters.
Details
Observed in CAST AIP

Release    Yes/No
8.3.x      Yes
8.2.x      Yes

Observed on RDBMS

RDBMS                   Yes/No
Oracle Server           Yes
Microsoft SQL Server    Yes
CSS3                    Yes
CSS2                    Yes


Deactivate Inference Engine

To deactivate the Inference Engine (IE) in 7.2 / 7.3 / 8.0, you can switch it off from CAST-MS as follows:

  • You must be in expert mode

  • In the Production tab or in the Technology tab, select the technology and expand the process settings section.

  • To deactivate the Inference Engine, uncheck the "Use Inference Engine" field.


Modify Inference Engine Settings

In the same Production tab as before, in the Inference Engine section, you can update the technical settings available for the technology. The settings differ from one technology to another, for example:

  • .NET

  • J2EE

  • Mainframe


Variables to set when facing performance issues

Variable settings for J2EE and .NET:

  • String concatenation: This value limits the number of strings that will be found during the search of each object value. Note that limiting the number of strings can lead to incomplete results; however, performance is improved.
  • Procedure Call Depth: This value limits the number of intermediate values that the Inference Engine must resolve in order to obtain the value of the object that is being searched for. Note that limiting the number of intermediate values can lead to incomplete results; however, performance is improved. The lowest value you can enter is 1.
  • Local Procedure Complexity: When the Inference Engine is active, this value limits searches of large methods that have a high Cyclomatic Complexity level.

Please see the documentation for more information.

For J2EE on 8.1, refer to:  CMS - J2EE Technology options

For J2EE on 8.0, refer to:  CMS - J2EE Technology options

For .NET on 8.1, refer to:  CMS - .NET Technology options

For .NET on 8.0, refer to:  CMS - .NET Technology options

 

Variable settings for Mainframe:

See the documentation for more information on these options for mainframe. 

For CAST 8.1, refer to: Mainframe - Confirm analysis configuration

 For CAST 8.0, refer to:  Mainframe - Confirm analysis configuration


Impact of reducing the Inference Engine parameters


We measure the impact by counting the number of links that are missed.
If the analysis ends successfully, execute the queries below to check the resulting links in the Knowledge Base (KB) and keep the results:

Counting the number of links generated during the analysis by the computation of the Inference Engine:

SELECT count(DISTINCT o.IDOBJ)
FROM OBJDSC o, ObjPro p, anapro an
WHERE o.INFTYP = 110
  AND o.INFSUBTYP = 9
  AND o.IDOBJ = p.IDOBJ
  AND p.IDPRO = an.IDPRO
  AND an.PRONAM LIKE '%JOB_NAME%'

Counting the number of links generated by the analyzer by the computation of the virtualization:

SELECT count(*)
FROM ACC a,
     (SELECT IDKEY FROM KEYS WHERE OBJTYP IN (102, 988, 977)) JavaMethods,
     ObjPro p, anapro an
WHERE a.IDCLE = JavaMethods.IDKEY
  AND a.IDCLE = p.IDOBJ
  AND p.IDPRO = an.IDPRO
  AND an.PRONAM LIKE '%JOB_NAME%'

Once this is done, update the parameters to LimitSubTarget=300000 and LimitString=9000, then run the analysis again. Afterwards, execute the queries and compare the values with the previous results. This comparison tells you whether more links have been saved. If that is not the case, the issue only occurs because more resolution attempts are made while fewer results are found; there will therefore be no impact on the analysis results when using LimitSubTarget=300000, LimitString=8000 instead of the default LimitSubTarget=300000, LimitString=10000.
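
To make the before/after comparison easier, the two counts can also be collected in a single statement. The sketch below is simply a combination of the two queries above; 'JOB_NAME' remains a placeholder for your analysis job name.

-- Sketch: run both counts in one statement and label each row,
-- so the values are easy to record before and after changing the parameters.
-- 'JOB_NAME' is a placeholder for the analysis job name.
SELECT 'inference_engine_links' AS link_source, count(DISTINCT o.IDOBJ) AS nb_links
FROM OBJDSC o, ObjPro p, anapro an
WHERE o.INFTYP = 110 AND o.INFSUBTYP = 9
  AND o.IDOBJ = p.IDOBJ AND p.IDPRO = an.IDPRO
  AND an.PRONAM LIKE '%JOB_NAME%'
UNION ALL
SELECT 'virtualization_links', count(*)
FROM ACC a,
     (SELECT IDKEY FROM KEYS WHERE OBJTYP IN (102, 988, 977)) JavaMethods,
     ObjPro p, anapro an
WHERE a.IDCLE = JavaMethods.IDKEY
  AND a.IDCLE = p.IDOBJ AND p.IDPRO = an.IDPRO
  AND an.PRONAM LIKE '%JOB_NAME%'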

If the query results are different, this shows that some links are missed when lowering the Inference Engine parameter values.
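
If you want to quantify the difference, a minimal sketch is shown below; the two numbers are placeholders for the counts you recorded before and after lowering the parameters (on Oracle, append FROM DUAL).

-- Sketch: quantify the missed links from the two recorded counts.
-- 123456 and 120000 are placeholders for your own before/after results.
SELECT 123456 - 120000                    AS missed_links,
       (123456 - 120000) * 100.0 / 123456 AS missed_links_pct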