Version 2.2

Purpose

This document details CAST’s Agile Onboarding Process along with the checklist templates used.

Applicability 

This iterative Agile process is recommended for measurement programmes with management visibility. It helps onboard applications quickly, with minimal technical documentation needed from application teams, and helps avoid the tunnel effect. Agile onboarding has three iterations and does not include any custom solution implementation.

Prerequisites for onboarding

  • CAST Application Intelligence Portal (AIP) is installed
  • The Technologies/Frameworks used in the applications are supported.
  • The extensions to be used, if required, are identified and available
  • The Application Intelligence Administrator (AIA) has access to the CAST Virtual Machine (VM)

Out of scope

  • Rescan and rescan automation
  • Unsupported technologies
  • Clean-ups and retention

Target Audience

  • All AI Administrators who will be onboarding an application

Process Flow Without AIP Console

Agile Onboarding Process Overview

The Agile onboarding process consists of three iterations:

  1. 'Discover' is applicable to all applications
  2. 'Onboard' is applicable only to applications for which feedback has been received from the application team on the gaps identified during the Discover iteration
  3. 'Integrate' is applicable only if customized measurements are required by the application team based on the identified use case

Objective

The objectives of the Agile process are:

  • Minimizing initial questions to app teams
  • Avoiding the tunnel effect and maintaining exchanges with app teams on the progress of the analysis
  • Bringing visibility and speed to the early stages of CAST programs at customers

Approach detail

Key inputs required

Steps in Agile Onboarding Process

1. Discover

This iteration consists of three main activities:

  • Qualification of the application.
  • Acceptance of the software components provided by the customer and delivering initial measurement.
  • Sharing the CAST consultant/architect's understanding of the technologies and transactions with the customer.

As a precondition of a CAST AIP analysis, and to qualify the application from a technical and value perspective, CAST recommends gathering high-level technical and non-technical information to help qualify the target application for analysis. The technical qualification is used to establish what level of "out of the box" support CAST has for the application and to identify any showstoppers, such as the use of unsupported technologies or frameworks. This iteration also covers guidelines on delivering application source code, packaging source code with CAST DMT, validating alerts, analyzing the application for the first time, discovering the application's boundaries (entry points/endpoints), technologies, and missing software components, and delivering the relevant reports and initial measurement to the customer.

AIP Product Documentation

    1. Application Qualification process overview: https://doc.castsoftware.com/display/DOC83/Qualification
    2. Application Acceptance process overview: https://doc.castsoftware.com/display/DOC83/Delivery+acceptance
    3. Initial Analysis Configuration overview: https://doc.castsoftware.com/display/DOC83/Review+Technology+and+Dependency+settings

Entry Criteria

    • Approval to move forward, either internally or from the customer if CAST is onboarding the customer
    • The application identified by the customer should use only CAST-supported Technologies/Frameworks, as listed at https://doc.castsoftware.com/display/DOC83/Covered+Technologies
    • Technical Survey Essential / ADP Essential 

Process Steps:

  1. Kick-off Meeting: A formal meeting should (preferably) be conducted. The agenda should cover the application technologies, the availability of software components, and an introduction to the CAST AIP program
  2. App Qualified: 
    1. CAST Front Office (FO) has to verify whether the software components (source code, database and context diagram) are in line with the Technical Survey Essential
    2. A Service Engagement Sheet has to be prepared to define the scope of the application
  3. Setup Component Delivery:
    1. Register the application in the AIC Portal
    2. Set up the environment and deploy the AIC Portal if not done already
    3. Set up automation with source code extraction (e.g. Jenkins)
  4. DMT Packaging:
    1. Include Discoverer in DMT plugins
    2. If extensions need to be installed, ensure that only Product-supported extensions and extensions agreed at the account level are used – https://doc.castsoftware.com/display/DOC83/Covered+Technologies
    3. Package the source code using CAST DMT to prepare the code for analysis by CAST AIP.  Refer to https://doc.castsoftware.com/display/DOC83/How+do+I+package+the+Version for more information.
    4. Deliver Application Source Code:  https://doc.castsoftware.com/display/DOC83/Delivery

    5. Accept the Delivery in DMT
  5. DMT Delivery Report: An automated report is generated using tools (PRAM & PRAT)
  6. Setup CMS, Run Analysis and Snapshot:
    1. Create the triplet (CAST schemas)
    2. Deploy the war files (Engineering Dashboard (ED)/Health Dashboard (HD))
    3. Ensure that a valid license key is applied
    4. Software components accepted – deliver the source code and set it as the Current Version
    5. Ensure that deployment, delivery, log and LISA/LTSA paths are set accurately
    6. Check that all the framework settings are configured as per the ADP Essential or Technical Survey Essential
    7. Configure include/classpaths according to the technology
    8. Ensure that Global DLM rule files are used through the extension (FAED)
    9. Non-standard technology – not applicable in the Agile Onboarding approach; only Product-supported extensions and extensions agreed at the account level will be used
    10. Set up environment profile if needed
    11. Install extensions - If extensions need to be installed, ensure that only Product supported extensions & agreed extensions at account level are used
    12. Run Analysis & snapshot – Monitor and ensure that it is completed successfully
    13. Consolidate the data of the latest snapshot to the Health Dashboard
  7. Analysis Completeness Report: An automated report is generated (PRAL/ FAEN/ FAUC)
  8. Completeness Check: Based on the Analysis Completeness Report, check whether it is good to proceed further (if more than 10% of the software components are not analyzed, go back to step 4, 'DMT Packaging'); a minimal scripted version of this check is sketched after this list
  9. App Boundary Definition:
    1. Validate the technologies and frameworks
    2. Validate missing software components
    3. Identify and validate the Entry and Endpoints
    4. Identify and document the impact of missing software components
  10. Rapid Discovery Report:
    1. Share the Rapid Discovery Report (FAEM) with the FO/Customer
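
The completeness check in step 8 can also be scripted once the delivered and analyzed file inventories are available. The following Python sketch is a minimal illustration, assuming both inventories have been exported to flat text files (one path per line); the file names are hypothetical, and the real inventories come from the DMT delivery and the analysis completeness reports (PRAL/FAUC).

    # Minimal sketch of the "more than 10% not analyzed" check from step 8.
    # The input file names are hypothetical exports of the DMT and KB inventories.
    from pathlib import Path

    def load_file_list(path: str) -> set:
        """Load one file path per line, stripped of surrounding whitespace."""
        return {line.strip() for line in Path(path).read_text().splitlines() if line.strip()}

    delivered = load_file_list("dmt_delivered_files.txt")  # files packaged in DMT
    analyzed = load_file_list("kb_analyzed_files.txt")     # files present in the KB
    ratio = len(delivered & analyzed) / len(delivered) if delivered else 1.0
    print("Analyzed {:.0%} of the delivered files".format(ratio))
    if ratio < 0.90:
        print("More than 10% not analyzed: go back to step 4, 'DMT Packaging'")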

Exit Criteria

    1. Initial measurement delivered to the customer
    2. Rapid discovery report delivered to the customer

Deliverable

    1. Initial Measurement
    2. Rapid Discovery Report (Template Link - Rapid Discovery Report Template)

2. Onboard

This iteration is applicable only if the gaps identified in the Rapid Discovery Report have been filled, or if the complete source code is available and no gaps were identified during the Discover phase. The main activities in this iteration are fine-tuning the analysis configuration, configuring transactions, reviewing transaction completeness, and validating the dashboard results.

AIP Product Documentation

    1. Fine-tuning of Analysis Configuration - https://doc.castsoftware.com/display/DOC83/Run+and+validate+the+analysis
    2. Transaction Configuration - https://doc.castsoftware.com/display/DOC83/TCC+-+CAST+Transaction+Configuration+Center
    3. Review of transaction completeness - https://doc.castsoftware.com/display/DOC83/Transaction+management
    4. Validation of dashboard results - https://doc.castsoftware.com/display/DOC83/Engineering+Dashboard & https://doc.castsoftware.com/display/DOC83/Health+Dashboard

Entry Criteria

    • Completion of Discover iteration
    • Customer's inputs on the gaps identified as part of the rapid discovery report

Process Steps

  1. Gaps Addressed:
    1. Check whether the open questions identified in the transaction report have been addressed by the customer
    2. Ensure that the missing software components are delivered by the application team
    3. If any gaps are still open, contact the application team to receive the inputs or take an informed exception
    If yes, hand over to the AIA: the Expert has to hand over all the application-related information along with the relevant report (Rapid Discovery Report)
  2. Repackage DMT
    1. Reject the delivery in CMS if new software components are provided by the application team
    2. Include the additional software components and repackage as per the DMT packaging guidelines
    3. Best Practices for DMT:

      1. Library files for Java technology and assemblies for .NET technology (application-specific and 3rd-party frameworks) should be packaged separately. For further details, refer to http://doc.castsoftware.com/display/DOC83/DMT+-+Folders
      2. Exclude test files/folders/projects and generated code (see the illustrative patterns after these process steps) - http://doc.castsoftware.com/display/DOC83/DMT+-+Use+Regular+Expressions
      3. Check that the packaging is done as per DMT best practices
    4. Remediate DMT Packaging Alerts: https://doc.castsoftware.com/display/DOC83/How+do+I+handle+DMT+delivery+alerts
    5. Accept the delivery
  3. Modify CMS Setup, Run CMS Analysis:
    1. Follow all the steps identified as applicable in step 6 (Setup CMS, Run Analysis and Snapshot) of the Discover phase
    2. Setup Environment & Analysis configuration 

      1. Source code “File Adjustments” (Optional)

      2. “Preprocess” Source Code (Optional). In some situations, the source code cannot be parsed and must be preprocessed. A study should be done first to identify the preprocessing rules to be applied to the source code. Add the preprocessor batch file as a 'Tools Before Analysis' step under the Content Enrichment tab

    3. Verify Framework Settings

      1. Ensure that the correct versions of technologies and frameworks are selected

    4. Adding dependencies – add the dependencies between analysis units (if required)

    5. Fine-tune CMS settings

      1. Choose an Environment Profile if needed (Only Product supported profiles)

      2. Configure include/classpaths according to the technology

      3. Enable User Input Security if it is within the scope of the analysis

  4. KB Inventory Validation (FAEE / FAUC):
    1. Check if all the technologies/frameworks have been analyzed properly
    2. Check if all the files have been analyzed properly (files in DMT vs. files in the KB)
  5. Log Validation (PRAL / FAEM):
    1. Follow the troubleshooting process to remediate warnings and errors by referring to the product documentation available at https://doc.castsoftware.com/display/DOC83/Validate+analysis+based+on+log+messages
    2. Document the log validation results: Number of alerts, type of alerts, and tickets raised (if any)
  6. Review Dynamic Links:
    1. Create application-specific DLM rules to resolve the dynamic links
    2. Guidelines to validate the dynamic links are available in https://doc.castsoftware.com/display/DOC83/Validate+Dynamic+Links
  7. Module Configuration (FAEO / FAEC):
    1. Set up the module content according to the client’s requirement
    2. If there is no recommendation, retain the Full Content
    3. Guidelines to create user-defined modules are available in https://doc.castsoftware.com/pages/viewpage.action?pageId=200409226
  8. Health Measures Configured (FAEN):

    1. Check if the health factors have been qualified properly
  9. Architecture Checker Flow:
    1. If the app team has provided the information then configure the Architecture Flows
  10. Security Configuration:
    1. Validate the security settings and results for JEE and .NET applications if it is in scope
  11. Transaction Configuration (FAEA):

    1. Entry point configuration / Data Entity configuration / End of transaction configuration

      1. Trace the transactions based on the entry points and endpoints given in the Rapid Discovery Report

      2. Validate the empty transactions (FAET report will help in the identification of missing links). Document any broken transactions and missing links in Confluence

      3. Create specific entry/endpoints for transactions which have not been identified in the Rapid Discovery Report 

    2. Review Transactions

      1. Verify if there are artifacts with high Data Element Types (DET) / Record Element Types (RET) values. If there are, check the objects with issues and remediate them. Use the FAES report to identify large SCC groups. (If the issue still exists, raise a support ticket)

      2. Check the code coverage, i.e. validate that all artefacts are covered by the transactions. Use FAER to validate the artefact coverage

      3. If there are a few artefacts which do not belong to any transaction, investigate whether they are in the scope of the module definition

  12. Transaction Completeness Report: The following will help you decide whether the transaction configuration is complete
    1. Use FAES for identifying large SCC groups
    2. Use FAER for validating transaction completeness (it includes smell test)
    3. Use FAET for identifying missing links in empty transactions
  13. Transaction Complete: Verify FAES, FAER & FAET reports to ensure that the transactions are configured properly and complete
    1. If empty transactions exist, follow the steps below

      1. If this is a configuration issue, go back to analysis configuration

      2. If this is a standard framework supported by CAST, raise a Support ticket

      3. Add links between objects that are not recognized by AIP out of the box only if there is an approved workaround from support and R&D (through the Reference Pattern process, Knowledge Base (KB) update queries, and supported extensions to create missing objects/links)

      4. Once the link is completed in Enlighten, compute in the Transaction Configuration Center (TCC), validate the flow, and continue with activity in Step b.

  14. Run Snapshot and Validate results:
    1. Run Baseline snapshot
    2. UI validation
      1. Launch dashboard either via CMS or deployed Web Archive (WAR)
      2. Validate if Engineering Dashboard pages are displaying data correctly e.g. Transaction wide Risk Index (TwRI) page, Risk Indicator pages, Critical Violation pages, Compare Version pages
      3. If there is an issue with the above step, follow the Troubleshooting Process
    3. Inventory Validation
      1. Check the Lines of Code (LoC) per module and validate against the inventory
      2. Validate the number of files, classes, tables, and artefacts per technology. If the numbers are not as expected, follow the Troubleshooting Process
      3. If this check is for maintenance, validate the variation in data from the Quick Access view
      4. If there are a lot of unexpected added/deleted artefacts in the above step, follow the Troubleshooting Process to fix the issue. If the issue is due to folder structures, reject the delivery and start again from the DMT phase
    4. Engineering Dashboard Metrics validation:
      1. Check the number of critical violations and validate a few of them to see if there are any false positives. If there is any false positive, follow the Troubleshooting Process
      2. Check the other metrics such as Technical Debt, Cyclomatic Complexity
      3. If this process is for maintenance, check the evolution
      4. The Function Point (FP) count must match the FP count in TCC
    5. Analytics Dashboard Metrics validation:
      1. Consolidate the Engineering Dashboard with the Application Analytics Dashboard (Analytics Dashboard) if the Engineering Dashboard cannot be navigated to from the Analytics Dashboard
      2. Validate that the numbers displayed on the Analytics Dashboard match those on the Engineering Dashboard
  15. Verify Analysis Completion Report: Use FAEN and FAER reports to ensure that the quality of analysis is good
  16. All Checks Passed: A list of checks is executed at this step to ensure the quality of the results. If any check fails, it will lead to an exception review
  17. QA - Onboard phase:
    1. Review the analysis results and the entry points and endpoints defined in the transaction report; validate the smell test results justification, the dashboards, and the delivery report as per the checklist
    2. PQM has to review the process followed so far and give the go-ahead to deliver the refined dashboard to the customer
  18. Consumption Report:
    1. Upon the go recommendation of the PQM, deliver the refined measurement and the Consumption Report to the application team
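
The DMT packaging best practice of excluding test and generated code (step 2, 'Repackage DMT') relies on regular-expression filters. The Python sketch below is an illustration only, assuming a Java application with conventional folder names; the patterns and sample paths are hypothetical, and the actual filters are configured in DMT as described in the 'DMT - Use Regular Expressions' documentation page.

    # Illustrative exclusion patterns of the kind configured in DMT (hypothetical).
    import re

    EXCLUSION_PATTERNS = [
        r"(^|/)test/",                     # test source folders
        r"Test\.java$",                    # JUnit-style test classes
        r"(^|/)target/generated-sources/"  # generated Java code
    ]

    def is_excluded(path: str) -> bool:
        """Return True when a delivered file should be left out of the package."""
        return any(re.search(p, path) for p in EXCLUSION_PATTERNS)

    for p in ["src/main/java/com/acme/OrderService.java",
              "src/test/java/com/acme/OrderServiceTest.java",
              "target/generated-sources/annotations/OrderDto.java"]:
        print(p, "-> excluded" if is_excluded(p) else "-> packaged")

Testing the patterns against a handful of sample paths before repackaging helps confirm that application code is not excluded by accident.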

Exit criteria

    • Consumption Report delivered to the customer
    • Refined measurement delivered to the customer

Deliverable

    1. Consumption Report  (Sample Link - Consumption Report Sample)

    2. Refined Measurement (Engineering and Health dashboard)

3. Integrate

This phase is applicable only if customized measurements are required by the application team, based on the identified use case, after the Onboard phase. If no customizations are required by the application team, the results delivered as part of the Onboard phase are the baseline results.

AIP Product Documentation

Entry Criteria

    1. Refined measurement delivered to the application team
    2. An additional requirement from the application team

Process steps

  1. Additional Inputs
    1. Check if there are additional inputs provided by the application team to make any adjustment in the Assessment model, Dashboard and Action Plan 
    2. Inputs for Automation of rescans
  2. Calibration
    1. Rule-based Ignore / Rule-based Deletion / Rule-based Grouping / Rule-based Splitting – manually check the transactions in TCC for further calibration opportunities (a conceptual sketch of such naming-based filters follows these process steps)

      1. Review the Entry & End Points list to avoid duplicate and redundant transactions

      2. Make sure the Data and Transaction filters are not ignoring, deleting, grouping or splitting the wrong transactions. If there are such transactions, add them as exceptions by modifying the filter functions

    2. Rule-based value / Rule-based type

      1. Filter grouping rules based on naming, types, inheritance, and free definition

    3. Manual Grouping / Manual Splitting / Manual type Adjustment / Manual FP Adjustment (Optional)

      1. If the transaction configuration requires further calibration, manually make the adjustment

    4. Manual Count Alignment/Functional Reconciliation/Final Configuration on FP/Calibration based on customer’s feedback

  3. Fine-tune Assessment Model
    1. Update the default rule criticality

    2. Update the scoring weight

    3. Update parameterized rules

    4. Update rule descriptions

    5. Exclusion of rules
  4. Include Exclusions in Dashboard
    1. Exclusion of rules

    2. Exclusion of objects

    3. Exclusion of violations

  5. Customization of Action Plan

    1. Build the action plan based on the inputs received from the application team

    2. Validate the action plan after creating it

  6. Adjust Automation for Rescans
    1. Validate the Automation set up during the previous iterations
    2. Check the automation adjustments needed as per the customer's requirements and make the necessary changes
  7. Run Baseline Snapshot
    1. Ensure that the Baseline snapshot is executed if any of the steps (1 to 6) in the Integrate phase is executed
    2. Delete the previous snapshot and execute the consolidation
  8. QA - Integrate phase
    1. Review the dashboards and updated delivery report as per the checklist
    2. PQM has to review the process followed so far and give the go-ahead to deliver the refined dashboard to the customer
  9. Baseline Measurement
    1. Upon the go recommendation of the PQM, deliver the baseline measurement and the updated delivery report/preliminary findings report to the customer
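
The rule-based ignore and grouping filters used during calibration (step 2) are configured in TCC, but the underlying logic is naming-pattern matching over the transaction entry points. The Python sketch below is a conceptual illustration only; the patterns, group names, and entry points are hypothetical and do not represent CAST's actual filter syntax.

    # Conceptual sketch of rule-based transaction calibration:
    # bucket entry points into "ignored", a named group, or "kept as-is".
    import re
    from collections import defaultdict

    IGNORE_PATTERNS = [r"(^|/)ping$", r"(^|/)healthcheck$"]           # technical endpoints
    GROUP_PATTERNS = {r"^/api/v\d+/orders(/|$)": "Order management"}  # variants of one transaction

    def calibrate(entry_points):
        buckets = defaultdict(list)
        for ep in entry_points:
            if any(re.search(p, ep) for p in IGNORE_PATTERNS):
                buckets["ignored"].append(ep)
                continue
            for pattern, group in GROUP_PATTERNS.items():
                if re.search(pattern, ep):
                    buckets[group].append(ep)
                    break
            else:
                buckets["kept as-is"].append(ep)
        return dict(buckets)

    print(calibrate(["/api/v1/orders", "/api/v1/orders/{id}", "/ping", "/api/v1/customers"]))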

Exit criteria

    • Application baselined in AIP and automated for future rescans

Deliverable

    1. Final Consumption Report with known gaps

    2. Updated Engineering and Health dashboard

Extensions Needed in Agile Approach:

  1. Extension for DLM rule library - FAED

  2. Extension for the report to find App boundaries - FAEA

  3. Extension for the report to identify Code not part of module - FAEC

  4. Extension for the report to identify Overlap code in modules - FAEO

  5. Extension for the report to identify Missing code for java - FAEM

  6. Extension for the report to identify External code - FAEE

  7. Extension for the report to validate the analysis completeness - FAEN

  8. Extension for the report to get the missing links in empty Transactions - FAET

  9. Extension for the report to validate the transaction completeness - FAER

  10. Extension for the report to identify large SCC group - FAES

  11. Extension for the report to find the golden nuggets (Nugget Finder) - FANF

  12. Extension for the report to find the Unanalysed Code - FAUC

Product Extensions Needed in Agile Approach:       

  1. Automatic Links Validator: this extension automatically validates dynamic links

  2. Report Generation for Consumption - PRRC

Training Material: