Release Updates

  • Added a feature to stop the currently executing job while a workflow step (analysis, snapshot, etc.) is in progress. Note that this feature is available in AIP 8.3.6 and above.
  • Added a field to the Jenkins job configuration to include the Snapshot Capture Date during a rescan.
  • Added a drive-mapping step to the user guide.
  • Cleaned up unused parameters in the CastBatchWebServices properties file.
  • Added a step for configuring the AAD plugin to the user guide.

Known Issues

  • None

CAST AIP Automation Solution


AIP Automation Solution is based on the Jenkins Continuous Integration system and has been tested against the AIP versions specified in Automation Tooling#FieldToolingCompatibilityTesting. Each application is automated individually through its own Jenkins job, which consists of eleven configurable tasks covering the CAST analysis process from delivery to database optimization.

  • Backup CAST Application Database Trio before ReScan
  • Execute Batch Script
  • Deliver Application Code
  • Accept Delivery
  • Perform the Application Analysis
  • Generate the Snapshot
  • Validate the Snapshot Results
  • Publish Snapshot Results to the Application Analytics Dashboard (AAD)
  • Database Optimization
  • Archive Delivery
  • Backup CAST Application Database Trio after ReScan



The Jenkins system exposes a single job for each automated application. These jobs can be configured to run on a predefined schedule via the Jenkins scheduler or any Windows-based scheduler, manually through the Jenkins web interface, or through the CAST Application Operations Console (AOP). The automation system can be configured to run the entire process completely hands-off or with some human intervention. In the event of a failure, the Jenkins job can be restarted from any of the steps listed above. This feature is currently leveraged by AOP.


NOTE: JAF works with a CAST AIP triplet that maintains only one application.



Execute Batch

Depending on the use case, the user may enable this checkpoint, which allows a batch script to be run within the workflow.

This script may also be used to:

  • organize backup storage and cleanup.
  • add any additional customized checks before starting the analysis.

The corresponding batch file should be created and placed in the CBWS installation directory (or any other suitable location), and its name and full path should be added to the batches.json file. This JSON file is already present in the CBWS installation directory.

Follow the same pattern to add a new batch script for any Jenkins job.

Note: The Jenkins job name must match the batch name listed in the JSON file. For example, for the Jenkins job Mproject, the corresponding batch file must be named Mproject.bat.
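The exact schema of batches.json is not reproduced here; the following is a hypothetical illustration only, assuming a simple mapping of Jenkins job names to batch file paths (the key names and the path shown are assumptions, not the documented format):

```json
{
  "batches": [
    {
      "name": "Mproject",
      "path": "C:\\CBWS\\Mproject.bat"
    }
  ]
}
```

Whatever the actual schema, the "name" value must match the Jenkins job name, per the note above.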

Code Extraction

The code extraction process is performed using both Jenkins and the CAST Delivery Management Tool (DMT).  Jenkins is responsible for extracting the code from the Software Configuration Management System (SCM) to the Jenkins workspace.  DMT is then configured to retrieve the code, from the file system, and deliver it to CAST. 

Code Delivery

There are two parts to the code delivery process: the baseline and the cloning process.  The baseline
is a one-time, per-application step, performed as the first step in the CAST automation process.  It is used to configure DMT to properly collect the code.  The project source code is collected from either the Jenkins workspace folder or directly from the SCM system. Only after the baseline has been successfully delivered and deployed can the delivery process be automated.

Once the application has been successfully baselined, it can be completely automated using the Jenkins CI system.  For each code delivery, Jenkins will clone the baseline delivery, package it, and deliver it to CMS.  It is recommended that a single delivery be designated and maintained as the baseline delivery.  If the application project structure is modified, the DMT baseline delivery should be modified accordingly.
Once the code has been delivered, a check is performed to determine whether any source code changes have been made. If this test fails, meaning no changes have been made in the current delivery, the job is terminated with a -10000 error.  A flag is available in the Cast Batch Web Services properties file to turn this feature off.

Restart
In the event of an error during the code delivery step, the job can be restarted using a "START_AT" parameter setting of 2. This instructs Jenkins to skip the backups and go directly to the code delivery step. If the delivery validation check has failed, the user can choose to continue the job from the next step, "Accept Delivery", by using a code of 3. This option should only be used after the code has been physically delivered and the delivery report has failed.
Warning: The "Accept Delivery" step will fail if DMT has not successfully completed the delivery process.
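The start-at codes referenced throughout this guide can be summarized as a simple lookup. A minimal sketch follows; note that code 1 (the pre-rescan backup) is inferred from the workflow order and is an assumption, not stated explicitly here:

```python
# START_AT codes used when restarting a failed Jenkins job.
# Code 1 (backup before rescan) is an assumption inferred from the
# workflow order; codes 2-7 are taken from the restart notes in this guide.
START_AT = {
    1: "Backup before ReScan",  # assumed
    2: "Code Delivery",
    3: "Accept Delivery",
    4: "Analysis",
    5: "Snapshot",
    6: "Snapshot Validation",
    7: "Publication",
}

def restart_step(code):
    """Return the workflow step a given START_AT code resumes from."""
    return START_AT[code]
```

For example, restarting with START_AT=3 resumes the job at the "Accept Delivery" step.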

Source Code Management (SCM) System

The table below lists each supported SCM system, its extract mechanism, and a description:

SCM System | Extract Mechanism | Description
Subversion (SVN) | DMT | Source code retrieval and delivery are done via the DMT.
Team Foundation Server (TFS) | DMT | Source code retrieval and delivery are done via the DMT.
LUW - UDB | DMT | Database structure retrieval and delivery are done via the DMT.
Concurrent Versions System (CVS) | Jenkins | Available as part of the Jenkins installation. User ID and password to be provided by the client after installation.
DB2 Mainframe | Jenkins | A sub-job runs the DB2 Extractor and retrieves the package into the Jenkins workspace.
GIT | Jenkins | Utilizes a Jenkins plugin. User ID and password to be provided by the client after installation.
Mainframe | Jenkins | A sub-job runs the Mainframe Extractor and retrieves the package into the Jenkins workspace.
Serena PVCS | Jenkins | PVCS is supported through the use of an ANT script.
Perforce | Jenkins | Utilizes a Jenkins plugin. User ID and password to be provided by the client after installation.

Analysis and Snapshot

Once the code has been delivered, the next step is to run the analysis and snapshot processes. This involves deploying the code on the CAST analysis server by accepting the delivery, setting it as the current version, analyzing it, and finally generating a snapshot.
During Jenkins job configuration there is an option to set a snapshot retention policy.  When this feature is enabled, only one snapshot is retained for each month.  Before a new snapshot is generated, the system checks whether the previous snapshot was generated in the current month and whether the total number of snapshots is at least two.  If both conditions hold, the previous snapshot is deleted before the new one is generated.  This limits the number of snapshots to a maximum of 12 per year.
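The retention check can be sketched as follows (an illustrative Python sketch of the logic as described, not CAST's actual implementation):

```python
from datetime import datetime

def should_delete_previous(snapshot_dates, now=None):
    """Monthly retention policy sketch: the previous snapshot is deleted
    only when the most recent snapshot falls in the current month AND at
    least two snapshots exist (assumed logic, per the description above)."""
    now = now or datetime.now()
    if len(snapshot_dates) < 2:
        return False
    last = max(snapshot_dates)
    return (last.year, last.month) == (now.year, now.month)
```

With this rule, at most one snapshot survives per month, i.e. at most 12 per year.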
Restart
An error during the "Analysis and Snapshot" phase can occur during the delivery acceptance, analysis, or snapshot steps. If the error occurred during the acceptance step, it is most likely caused by a Jenkins configuration issue, such as an invalid connection profile.  In this situation the Jenkins configuration should be corrected and the job restarted at the "Accept Delivery" step using a start-at code of 3.  An error during the analysis or snapshot steps can indicate a problem with the code delivery or the CMS configuration.  Depending on the cause, the job can be restarted at the "Code Delivery" step using a start-at code of 2, or from either the "Analysis" or "Snapshot" steps using codes 4 or 5. Although it is not recommended, the job can also be continued from the snapshot validation step using start-at code 6.

Snapshot Validation

Snapshot Validation is used to validate the data produced during the analysis and snapshot generation step.  For this process to work, the Validation Portal must be installed and accessible to Jenkins.  Validation consists of a series of tests that check the various components of the snapshot and return a go/no-go to Jenkins.  If any of these tests returns a "no go", the job stops, giving the system administrator a chance to check the severity of the error.
Restart
In the event of a failure, it is recommended that the user correct the issues and restart the job. Once corrections have been made, it is not necessary to restart the job from the beginning.  Instead, it can be restarted from delivery using a start-at code of 2, from analysis using a code of 4, or from snapshot using a code of 5.  If the system administrator determines it is OK to ignore the validation check, the job can be restarted from the next step, publication, using code 7.

Snapshot Publication

Once the snapshot is complete, the results should be published to the Application Analytics Dashboard.  The publication option first removes all snapshots for the application, then publishes all snapshots found in the application's central database to the measurement database.

CAST Database Optimization 

This is the last step in the process and is used to reduce the size of the CAST database.  This is accomplished by first backing up the application's management, local, and central schemas, then restoring them. This has the effect of performing a database vacuum and re-index of all tables, which reduces the size and increases overall database performance.
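The backup-and-restore cycle can be sketched as command-line construction against a PostgreSQL-based storage service. Everything here is an assumption for illustration (the schema suffixes, the port, and the database name are hypothetical); CAST's own tooling performs the real operation:

```python
def optimization_commands(prefix, host="localhost", port=2282):
    """Build pg_dump/pg_restore command lines for the application's
    management, local, and central schemas. Suffixes, port, and database
    name are illustrative assumptions, not documented CAST values."""
    cmds = []
    for suffix in ("mngt", "local", "central"):
        schema = "%s_%s" % (prefix, suffix)
        dump_file = schema + ".backup"
        # Dump only this schema in PostgreSQL custom format...
        cmds.append(["pg_dump", "-h", host, "-p", str(port),
                     "-n", schema, "-Fc", "-f", dump_file, "postgres"])
        # ...then restore it with --clean, dropping and recreating the
        # objects - effectively a vacuum and re-index of the schema.
        cmds.append(["pg_restore", "-h", host, "-p", str(port),
                     "--clean", "-d", "postgres", dump_file])
    return cmds
```

Restoring a freshly dumped schema rebuilds its tables and indexes from scratch, which is why the cycle shrinks the database and improves performance.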

CAST Automation Process Configuration

Once the Jenkins and CBWS installation is complete, the next step in the process is to setup the Jenkins job that will be used to run the automated re-scan. The job configuration includes details about the source delivery and analysis servers, and the database schema connection profile used to perform the re-scan. Once configured, the job can then be cloned to create jobs for all remaining applications.

Delivery Management Considerations

SCM Systems - File System Vs. DMT Plugin

When delivering code directly from the file system, the CAST user/admin is responsible for maintaining the code versions. This can result in manual copy errors, file path renaming issues and other major problems later on. Using the CAST DMT plugin takes away the hassle of having to manage the code manually, as it maintains all source versions delivered through the tool. By incorporating the DMT plugin in the automated code delivery process, one can mitigate the code integrity risks.
Source code for most technologies supported by CAST can be delivered using the DMT plugin. Before setting up automated extraction of code using the DMT plugin command line, ensure that the first version (v1) of the application has been manually created and is available. Always pick up code from a neutral folder path (i.e., one not bearing a version number) to avoid issues with added or deleted code. Clone a new version from the baseline for each subsequent delivery.

Mainframe

Normally, mainframe code is delivered using a series of JCL scripts and placed in a punch-card format. When automating this process, it is assumed that the JCL has been run and the code is in place before the Jenkins automation runs. For more information on this process, see the section Automating the Delivery of Mainframe Code to CAST AIP with JCL jobs and CAST DMT below.

Jenkins Job Configuration

Creating a new 'Freestyle' Job

To create a new job, navigate to the Jenkins home page and follow these steps:

  1. Click New Item on the left-hand menu.
  2. Type a job name in the "Item Name" field - this should typically correspond to the application name, but any name can be used.
  3. Select Freestyle project and press "OK".


 Application Job Configuration

On this page, start by adding a new "CAST AIP 0: Configuration" section to the job, then follow the instructions below, as shown in the figure:

  • The Delivery Web Service Address is the location where the CAST Batch Web Service has been registered. The default port number is 9898, but this can be changed in the CASTAIPWS.properties file. The Analysis Web Service should be populated if the analysis is configured to run on a different machine. Click "Test Connection" to ensure that the service can reach the web service addresses (see the Load balancing section on how load is balanced across analysis web service addresses).
  • The web service address always has the following format: http://<machine_where_CBWS_installed>:<port>/CastBatchWebService
  • The application drop-down lists all applications found in the AIC Portal pointed to in the properties file.
  • Select the correct CMS Connection Profile to use in order to connect to the service.
  • Select a baseline reference version from which the current version will be cloned. Leave the Version Name at the default "Version [TODAY]" value; this defaults the version name to today's timestamp.
  • Use the application code name as the schema prefix. This will be used for backups and snapshot publications.
  • Select the Measurement Schema to upload the results to.
  • Enter the Snapshot Capture Date in the format YYYY/MM/DD HH:MM, e.g. 2019/07/26 14:04 (24-hour format). Note that this is an optional field; if it is not entered, JAF uses the system date and time as the snapshot date.
  • Set "Stop the build in case of Error" to Yes.
  • The stop section should only be used when a particular process (analysis, snapshot, etc.) is taking a long time and the AIA admin has decided to kill the current process (see the Stop workflow section for more details).
  • Check that all the sections shown in the image to the right are filled in correctly. Remember to check "Use JNLP version of DMT?" only if the delivery folder is at a remote location.
  • Press Save and go back to the top of the page. Then select the job and test-run it.
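The Snapshot Capture Date format can be validated before saving the job. A small sketch of the expected parsing, assuming the stated YYYY/MM/DD HH:MM 24-hour format (the function name is illustrative, not part of JAF):

```python
from datetime import datetime

SNAPSHOT_DATE_FORMAT = "%Y/%m/%d %H:%M"  # YYYY/MM/DD HH:MM, 24-hour clock

def parse_capture_date(field_value):
    """Parse a Snapshot Capture Date field value. Returns None for a blank
    field, in which case JAF falls back to the system date and time."""
    field_value = field_value.strip()
    if not field_value:
        return None
    return datetime.strptime(field_value, SNAPSHOT_DATE_FORMAT)
```

For example, "2019/07/26 14:04" parses to 26 July 2019, 14:04, while an empty field yields None.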





If you have installed CAST AAD as well, add a new build step, "Cast AAD - Refresh", in the build section of the job.

Add the AAD URL and credentials and save them. This will refresh AAD upon job completion.

Load balancing

The Delivery Web Service Address can be the same as or different from the Analysis Web Service addresses, which gives the following three cases:

  • Only the delivery web service address is populated, and the analysis web service addresses are blank. The system copies the delivery web service address to the first analysis web service address. In this case the entire analysis workflow, from backup through archive delivery, happens on the delivery web service address itself (since they are the same).
  • The delivery web service address is populated, and only one analysis web service address is populated. For all applications, the first three workflow steps (backup, delivery, and accept delivery) occur on the delivery web service address, and the rest of the workflow, from run analysis through archive delivery, occurs on the analysis server.
  • The delivery web service address is populated, and more than one analysis web service address is populated. In this case too, the first three workflow steps (backup, delivery, and accept delivery) happen on the delivery web service address, but the rest of the workflow, from run analysis through archive delivery, occurs on the least loaded analysis server. The system decides which server is least loaded based on the number of applications each is currently running: the Jenkins plugin maintains a counter for each analysis server, incrementing it when an analysis starts and decrementing it when the job completes or errors out for any reason. The server with the lowest count (fewest concurrent running analyses) is selected; when all analysis servers have the same load, the first one is chosen. Once a least-loaded analysis web service address is allocated for an application, the entire remaining workflow, from run analysis through archive delivery, occurs on that particular analysis web server.
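The counter-based server selection described above can be sketched as follows (an illustrative Python sketch of the stated logic, not the actual Jenkins plugin code):

```python
class AnalysisDispatcher:
    """Least-loaded analysis server selection, as described above: one
    running-job counter per server, ties broken by configuration order."""

    def __init__(self, servers):
        self.order = list(servers)
        self.counts = {s: 0 for s in servers}

    def acquire(self):
        # Pick the server with the fewest running analyses; on a tie,
        # min() returns the first configured server.
        server = min(self.order, key=lambda s: self.counts[s])
        self.counts[server] += 1
        return server

    def release(self, server):
        # Called when the job completes or errors out for any reason.
        self.counts[server] -= 1
```

With two idle servers, the first configured server is chosen; a second concurrent job then lands on the other, and a released server becomes eligible again.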

Stop workflow

There are instances when a Jenkins workflow step (analysis, snapshot, etc.) has been running for a long time and the AIA admin wants to stop it. This section provides the option to stop the current process gracefully.

Please note this feature is only available in AIP version 8.3.6 and above.

Automating the Delivery of Mainframe Code to CAST AIP with JCL jobs and CAST DMT

To allow the CAST Delivery Management Tool (DMT) to package mainframe code, the code must first be placed into the application delivery folder. The files can either be placed into a folder structure or combined into PDS dump files. This document focuses on the process related to PDS dump files. For further information on the folder-structure methodology, and other mainframe-related information, please refer to the Mainframe - application qualification specifics page in the AIP Documentation at http://doc.castsoftware.com/.
Common Mainframe Files
Copybooks are files included in COBOL programs and can be used across multiple applications.  The collection process for copybooks is often inaccurate.  Collecting all copybooks into a single delivery, then referring to them in the Working Folder section of the Analysis tab in CAST Management Studio (C-MS), can make the delivery process more accurate.  It also produces more accurate analysis results, since there are never any missing copybooks.

For the purposes of the CAST analysis, COBOL copybooks are captured, delivered, and deployed to the CAST system as a whole into a "fake" application.  When the analysis is performed for an individual application, C-MS refers to the common copybook delivery location.

Code Preparation
There are three JCL jobs available to assist in the mainframe COBOL code preparation and delivery process.  They should be modified for use with each mainframe file type, JCL, PROC, and COBOL source. 

  • IEBCOPY

The first job to use is the IEBCOPY job.  This job copies all of your application members from the BISG libraries to your own library for staging.  Make sure to change the MYAPP symbolic to your application name (it can be anything you want to call it, but you will need to use the same value for the next two jobs).  Also, make sure to update the IEBCOPY SELECT MEMBER statements to include your applications.  They can use wildcards such as * or %.  For example, SELECT MEMBER=BRXA%%%Z copies all members matching that name, where each % matches any single character.


//&SYSUID1 JOB (BT00,0000),'IEBCOPY',CLASS=E,MSGCLASS=X,
// NOTIFY=&SYSUID
//*
//* COPY YOUR APPLICATION MEMBERS FROM BISG LIBRARIES TO YOUR OWN PDS
//*
//* 1. SET MYAPP PARM TO YOUR APPLICATION NAME
//* 2. VERIFY BISG LIBRARIES
//* 3. SELECT MEMBER NAMES - WILDCARDS ACCEPTED
//*
//SETPARMS SET MYAPP=APPNAME,
// BISGJCL='BISG.JCLLIB',
// BISGPRC='BISG.PROCLIB',
// BISGSRC='BISG.SRCLIB',
// BISGCPY='BISG.COPYLIB'
//*
//STEP1 EXEC PGM=IEBCOPY,REGION=8M
//SYSPRINT DD SYSOUT=*
//IN1 DD DISP=SHR,DSN=&BISGJCL
//OUT1 DD DISP=(,CATLG,DELETE),
// DSN=&SYSUID..&MYAPP..&BISGJCL,
// SPACE=(CYL,(2,2,100)),
// LIKE=&BISGJCL,UNIT=TEST
//IN2 DD DISP=SHR,DSN=&BISGPRC
//OUT2 DD DISP=(,CATLG,DELETE),
// DSN=&SYSUID..&MYAPP..&BISGPRC,
// SPACE=(CYL,(2,2,100)),
// LIKE=&BISGPRC,UNIT=TEST
//IN3 DD DISP=SHR,DSN=&BISGSRC
//OUT3 DD DISP=(,CATLG,DELETE),
// DSN=&SYSUID..&MYAPP..&BISGSRC,
// SPACE=(CYL,(2,2,100)),
// LIKE=&BISGSRC,UNIT=TEST
//IN4 DD DISP=SHR,DSN=&BISGCPY
//OUT4 DD DISP=(,CATLG,DELETE),
// DSN=&SYSUID..&MYAPP..&BISGCPY,
// SPACE=(CYL,(2,2,100)),
// LIKE=&BISGCPY,UNIT=TEST
//SYSUT3 DD SPACE=(CYL,(100,50),RLSE),UNIT=SYSDA
//SYSUT4 DD SPACE=(CYL,(100,50),RLSE),UNIT=SYSDA
//SYSIN DD *
COPY I=((IN1,R)),O=OUT1
SELECT MEMBER=DAO*
COPY I=((IN2,R)),O=OUT2
SELECT MEMBER=DAO*
COPY I=((IN3,R)),O=OUT3
SELECT MEMBER=DAO*
COPY I=((IN4,R)),O=OUT4
SELECT MEMBER=DAO*
/*

  • IEBPTPCH


The second job is IEBPTPCH.  This job takes each staging library and flattens it into a single file.  Update the MYAPP symbolic to the value you used in the IEBCOPY job.

//&SYSUID2 JOB (BT00,0000),'IEBPTPCH',CLASS=E,MSGCLASS=X,
// NOTIFY=&SYSUID
//*
//* CREATE FLAT FILES FROM PDS MEMBERS
//*
//* 1. SET MYAPP PARM TO YOUR APPLICATION NAME
//*
//MYAPP SET MYAPP=APPNAME
//*
//* BEGIN INSTREAM PROCEDURE
//*
//PUNCH PROC PDSIN=
//*
//STEP001 EXEC PGM=IEBPTPCH /* PUNCH PDS MEMBERS */
//SYSPRINT DD SYSOUT=*
//SYSUT1 DD DSN=&PDSIN,DISP=SHR
//SYSUT2 DD DSN=&&LIST1,SPACE=(CYL,(40,20),RLSE),UNIT=SYSDA,
// DISP=(NEW,PASS),
// DCB=(BLKSIZE=0,LRECL=81,RECFM=FB)
//SYSIN DD *
PUNCH TYPORG=PO,MAXFLDS=80,CNTRL=1
RECORD FIELD=(80)
/*
//*
//STEP002 EXEC PGM=SORT /* REMOVE BLANK RECORDS AND CRLF */
//SYSOUT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SORTIN DD DSN=&&LIST1,DISP=(OLD,DELETE)
//SORTOUT DD DSN=&PDSIN..PUNCH,
// SPACE=(CYL,(40,20),RLSE),
// DCB=(BLKSIZE=0,LRECL=80,RECFM=FB),
// DISP=(NEW,CATLG),UNIT=TEST
//SYSIN DD *
SORT FIELDS=COPY
OMIT COND=(1,20,CH,EQ,C' ')
OUTREC FIELDS=(2,80)
/*
//*
// PEND
//*
//* END INSTREAM PROCEDURE
//*
//JCL EXEC PUNCH,PDSIN=&SYSUID..&MYAPP..BISG.JCLLIB
//PRC EXEC PUNCH,PDSIN=&SYSUID..&MYAPP..BISG.PROCLIB
//SRC EXEC PUNCH,PDSIN=&SYSUID..&MYAPP..BISG.SRCLIB
//CPY EXEC PUNCH,PDSIN=&SYSUID..&MYAPP..BISG.COPYLIB
//*

  • FTP


The last job is FTP. This job transfers the files to the drop site.  Update the MYAPP symbolic, and update DROPDIR with the value provided in the email from your CAST front office specialist.
//&SYSUID3 JOB (BT00,0000),'FTP',CLASS=E,MSGCLASS=X,
// NOTIFY=&SYSUID
//*
//* FTP FILES TO CAST SERVER
//*
//* 1. change the MYAPP value to your application name
//* 2. change the DROPDIR value to your DROP SITE Directory
//*
//MYSYMEXP EXPORT SYMLIST=(MYAPP,DROPDIR,FSERV,FUSER,FPASS)
//SETPARMS SET MYAPP=APPNAME, * NAME OF YOUR APP
// DROPDIR=TEST, * DROP SITE DIRECTORY
// FSERV='10.xx.xx.xx', * FTP SERVER NAME
// FUSER='anonymous', * FTP USER NAME
// FPASS=&SYSUID * FTP PASSWORD
//*
//FTP EXEC PGM=FTP,PARM='(EXIT TI 240',REGION=32M
//DINPUT DD SYSOUT=*
//INPUT DD DATA,DLM='~~',SYMBOLS=(JCLONLY,DINPUT)
&FSERV
&FUSER
&FPASS.@mycompany.com
TYPE A
CD &DROPDIR
pwd
LOCSITE TRAILINGBLANKS
MPUT &MYAPP..BISG.JCLLIB.PUNCH
LOCSITE TRAILINGBLANKS
MPUT &MYAPP..BISG.PROCLIB.PUNCH
LOCSITE TRAILINGBLANKS
MPUT &MYAPP..BISG.SRCLIB.PUNCH
LOCSITE TRAILINGBLANKS
MPUT &MYAPP..BISG.COPYLIB.PUNCH
QUIT
~~
//OUTPUT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SYSUDUMP DD SYSOUT=*
//*

NOTE: The DROPDIR symbolic should only contain the actual drop site folder name, and not the full path.

Code Delivery
Now that the mainframe code has been collected, flattened, and transferred to the application drop site, it must be delivered to the CAST system for analysis.  This is accomplished using the Delivery Management Tool (DMT), via the AIC Portal.  The Front Office specialist should have provided a link to the AIC Portal.  Log into the portal using your LAN credentials.

After logging in you are presented with the Delivery Launch page.  To launch the DMT click on the cloud icon.

Depending on your company's security configuration you may be asked to enter your LAN credentials. 

After creating a new version, click on the plus icon to the right of "Packages" to create a mainframe package. In the "New Package" dialog, select the Mainframe package type and click the "Next" button.

Create a package for each of the PDS files previously uploaded to the application drop site:

  • Click on the icon to launch the PDS dump dialog box
  • Select the PDS file to include
  • Select the file type
  • Change "vMAINFRAME NAME" to "MAINFRAME NAME"
  • Click on the Close button



Once configured, DMT can be invoked using a scheduler such as Jenkins to facilitate automated delivery for subsequent code drops.