V4.0

Process to Onboard an Application


Introduction

This document details CAST's Field Onboarding Process along with the checklist templates used.
Pre-requisites for on-boarding:

  • CAST Application Intelligence Platform (AIP) is installed
  • The application's technologies and frameworks are supported
  • Required extensions are identified (if any)
  • The Application Intelligence Administrator (AIA) has access to the CAST Virtual Machine (VM)

The following topics are out of scope for this document:

  • Rescan and rescan automation
  • Unsupported technologies
  • Clean-ups and retention

Target Audience
AIA
Application Manager

Process Flow


Related processes and checklist templates

  1. Qualification Phase:
    • Template – Service Engagement Sheet
  2. Acceptance Phase:
    • Template – Source Discovery
    • Architecture Discovery Process
    • Template – Architecture Discovery Review Checklist
  3. Analysis Phase:
    • Analysis Configuration Phase:
      • Template – Analysis Log Book
    • Transaction Configuration Phase
    • Calibration Phase
    • Reconciliation Phase:
      • Template – Automated Function Point (AFP) Configuration/Calibration
  4. Delivery Phase:
    • Template – Dashboard Validation Page
    • Template – Onboarding Release Audit Checklist

Qualification Phase

As a pre-condition of a CAST AIP analysis, CAST recommends gathering high-level technical and non-technical information to qualify the target application from both a technical and a value perspective. The technical qualification is used to establish the level of "out of the box" support CAST has for the application, to identify any show-stoppers such as the use of unsupported technologies or frameworks, and to flag any customization that may be required to accommodate exceptions.

Official Documentation

AIP Product Documentation for Application Qualification:

  1. Application Qualification process overview: http://doc.castsoftware.com/display/DOC82/1.3.+Application+Qualification

Entry Criteria:

  1. Application identified to be on-boarded onto CAST AIP
  2. Application Architecture Context Diagram (See appendix)
  3. Existing application detailed design documents, if any (e.g. Sequence Diagrams)
  4. Access to the Build Manager and/or Source Control Manager (SCM) for the source code, with database connection details available
  5. Application Delivery Portal (ADP) / Technical Survey to be completed (See appendix)


Process Steps:

  1. Kick-Off meeting
    1. Meet with Application Team to understand architecture of the Application
    2. Confirm application boundaries and validate source code completeness
    3. Module Definition
    4. Update ADP / Technical Survey
  2. Technology info in ADP
    1. Verify app team contacts
    2. Verify Technology Details and Questions
    3. Verify that the Architecture Context Diagram and Transaction Code Flow Document have been uploaded
  3. Front Office Qualified
    1. Front Office resources will validate that steps 1 and 2 above are complete.
    2. Front Office resources will confirm using the Delivery Manager Tool (DMT) that source code delivery is complete including all dependencies.
    3. Front Office resources will complete the qualification and deliver the application to Back Office for next steps
  4. Set up the Application Intelligence Center (AIC) Portal – register new domains and applications in the CAST AIC Portal. This step triggers the on-boarding and first-time analysis of a new application with CAST AIP
  5. Service Engagement Sheet – This sheet must be completed to define the scope of the Application
  6. DMT Package
    1. If extensions are required, ensure they are installed – http://doc.castsoftware.com/display/DOC82/Managing+extensions
    2. Package the source code using CAST DMT to prepare the code for analysis by CAST AIP. Refer to Deliver Application Source Code for more information.
    3. Deliver Application Source Code: http://doc.castsoftware.com/display/DOC82/1.4.+Deliver+the+application+source+code+version
    4. DMT Packaging Alerts:  
      http://doc.castsoftware.com/display/DOC82/DMT+-+Package+Content+tab+-+Packaging+alerts


Exit Criteria:

  1. All details and documents received and verified
  2. Complete application source code and database are delivered with dependencies


Deliverable Outcomes:

Application is configured in DMT.

Best Practices for DMT:

  1. Library files for Java technology and assemblies for .NET technology (application-specific and 3rd-party frameworks) should be packaged separately. For further details refer to http://doc.castsoftware.com/display/DOC82/DMT+-+Folders
  2. Include the Discoverer in the DMT plugins
  3. Exclude test files/folders/projects and generated code (see the illustrative sketch below) - http://doc.castsoftware.com/display/DOC82/DMT+-+Use+Regular+Expressions
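
For illustration only, the snippet below sketches the kind of regular expressions commonly used to exclude test code and generated folders. The patterns, sample paths, and helper function are hypothetical examples; the exact regular-expression syntax accepted by DMT should be taken from the documentation linked above.

import re

# Hypothetical exclusion patterns for test code and generated folders.
# The exact syntax accepted by DMT may differ; these patterns only
# illustrate the intent of best practice 3 above.
EXCLUSION_PATTERNS = [
    r".*[\\/](test|tests|mock|mocks)[\\/].*",    # test folders
    r".*Test\.java$",                            # JUnit-style test classes
    r".*[\\/](target|bin|obj|generated)[\\/].*", # build output / generated code
]

def is_excluded(path: str) -> bool:
    """Return True if a delivered file path matches an exclusion pattern."""
    return any(re.match(p, path, re.IGNORECASE) for p in EXCLUSION_PATTERNS)

# Quick check against sample (hypothetical) delivered paths.
for sample in [r"src\main\java\com\acme\OrderService.java",
               r"src\test\java\com\acme\OrderServiceTest.java",
               r"build\generated\Stub.java"]:
    print(sample, "->", "excluded" if is_excluded(sample) else "kept")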

Acceptance Phase

This phase marks the completion of source code delivery by the Application team. It provides guidelines on delivering application source code, packaging it with CAST DMT, validating alerts, and completing source code discovery before proceeding with the analysis.

Official Documentation

AIP Product Support Documentation for Application Acceptance:

  1. Validate and Accept the Delivery:

http://doc.castsoftware.com/display/DOC82/2.1.1.++Validate+and+Accept+the+Delivery

Entry Criteria:

Qualification is completed

Process Steps:

  1. DMT Alerts
    1. Resolve ALL DMT errors, alerts, and warnings per DMT Packaging Alerts. Any alerts related to missing code must be documented and reported back to the application team for code delivery improvement (http://doc.castsoftware.com/display/DOC82/Technology+Delivery+Instructions)
    2. Repackage the code
    3. Repeat sub-steps 1 and 2 until all errors and warnings in the DMT log are resolved
  2. Source Code Accepted – Deliver the source code and set it as the Current Version
    1. Field Extension Step: Set the application to the Source Code Accepted phase in ADP
  3. Application Discovery
    1. Architects review the source code and refer to the design documents provided by the application Subject Matter Expert (SME)
    2. An initial dry analysis is completed in this phase
    3. Complete the application discovery and produce the Application Discovery Document and Consultant Help Document. Include discovery questions, if any, in the Discovery Document to cover the functional view of the application
  4. Application Discovery Acceptance
    1. The Peer Reviewer reviews the documents for this phase and updates the checklists
    2. If the result is 'Good to go', the Application Discovery Document is submitted to the client. If not, return to Step 3.1
    3. Receive signoff of the Application Discovery Document from the client, completing the initial acceptance, and verify the following items:
      1. Entry and exit points
      2. All relevant flows
      3. Additional source code delivered (if any); any changes go back to Step 3.1
  5. Handover to AIA – Conduct a handover meeting with the Tech Lead/Consultant and Solution Design (SD) Team (if required, based on the engagement/project and SD's involvement) to share knowledge of the application and discuss CAST constraints

Exit Criteria:

  1. Receive signoff from the Application Team
  2. Application Discovery Document completed
  3. Consultant Help document completed

Deliverable Outcomes:

  1. Approved Application Discovery Document
  2. Handover to AIA

Analysis Configuration Phase

The process below provides an overview of the Application Source Code Analysis process with CAST AIP. It consists of two phases: analysis set-up and analysis execution. An optional, though strongly recommended, Analysis Automation phase may follow.

Official Documentation

AIP Product Support Documentation for Application Analysis:

  1. Application Analysis Process with CAST AIP: http://doc.castsoftware.com/display/DOC82/2.+Application+Analysis+Process+with+CAST+AIP

Entry Criteria:

  1. Availability of technical knowledge of configuration
  2. Application Discovery, Consultant Help Document, and Handover Document to understand application scope, technology, and extensions (if any)
  3. Current version should be set during the Acceptance Phase
  4. Source code acceptance – packages are error/alert-free, and all exclusions (test code, 3rd-party frameworks, etc.) have been completed during DMT packaging


Process Steps:

  1. Set-Up Environment & Analysis configuration with respect to ADP and source code discovery
    1. Source Code "File Adjustments" (Optional)
    2. "Pre-process" Source Code (Optional) - In some situations, the source code cannot be parsed and must be pre-processed. A study should be done first to identify the pre-processing rules which must be applied to the source code. Add the pre-processor batch file as a 'Tools Before Analysis' job under the Content Enrichment tab (an illustrative sketch follows this list)
    3. Repackage DMT (Optional) if above steps are applicable – Application code must be repackaged to reflect changes
  2. Non-standard technology – If a technology is in scope but not supported, perform a feasibility study to determine whether it can be supported through custom development, or, following the study, exclude it from the technology boundary
    1. If the application scope includes non-standard technology, install a supporting custom Extension as needed and import into the Assessment Model (optional)
  3. Verify Framework Settings
    1. Select correct versions of technologies and frameworks
  4. Set up CAST Management Studio (CMS)
    1. Choose an Environment Profile if needed
    2. Configure include/classpaths according to the technology
    3. Enable User Input Security if it is within the scope of the analysis
  5. Run CMS Analysis - http://doc.castsoftware.com/display/DOC82/CMS+-+Run+analysis+only
  6. Log validation – Perform analysis log validation
    1. Follow the troubleshooting process to remediate warnings and errors using the different platforms (if any)
    2. Document the log validation results: Number of alerts, type of alerts, and tickets raised (if any)
  7. Resolving warnings and errors
    1. If no warnings or errors exist at the end of Analysis, proceed to Step 8
    2. If warnings or errors are found, return to Step 4 to correct them
  8. Results validation – Perform inventory validation
    1. If the delivered code matches the analyzed code (unless excluded because it is out of scope), proceed with the following steps:
      1. Validate Dynamic Links / Dynamic Link Rules creation – document the number of Dynamic Links
      2. Parameterization – document the number of Parameterization steps
      3. Module Definition
        1. Set up the module content according to the client's requirements
        2. If there is no specific recommendation, retain the Full Content setting
    2. In case of any mismatch, check whether any paths have been missed in the analysis units. If so, in addition to the steps below, the analysis configuration and analysis units must be revisited:
      1. Add dependencies
      2. Use the KB Update Assistant
      3. Use Reference Finder (Ref Finder)
      4. Return to Step 4
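
As referenced in step 1.2 ("Pre-process" Source Code), the sketch below illustrates one possible 'Tools Before Analysis' pre-processing job, written here in Python. The substitution rule, file extension, and deployment path are hypothetical stand-ins; the actual pre-processing rules must come from the study of the source code mentioned above.

import pathlib

# Hypothetical pre-processing rule identified during the study: rewrite a
# proprietary include directive into a form the analyzer can parse.
# This script would run against the deployment folder before the analysis,
# e.g. registered as a 'Tools Before Analysis' job.
DEPLOY_FOLDER = pathlib.Path(r"D:\CASTMS\Deploy\MyApp")  # hypothetical path
OLD, NEW = "##INCLUDE ", "#include "                     # hypothetical rule

def preprocess(folder: pathlib.Path) -> int:
    """Apply the textual substitution to every source file; return count changed."""
    changed = 0
    for src in folder.rglob("*.c"):                      # hypothetical extension
        text = src.read_text(encoding="utf-8", errors="ignore")
        if OLD in text:
            src.write_text(text.replace(OLD, NEW), encoding="utf-8")
            changed += 1
    return changed

if __name__ == "__main__":
    print(f"Pre-processed {preprocess(DEPLOY_FOLDER)} file(s)")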

Exit Criteria:

  1. Log validation
  2. Inventory validation
  3. Post-analysis validation


Deliverable Outcomes:

  1. Error-free analysis results along with inventory and log validation


Best Practices:

  1. Add Class Paths, Include Paths, Working Folders, and Header Files
  2. Select the appropriate Integrated Development Environment (IDE) for each technology
  3. Retrieve Macro list from source code
  4. Use only supported versions of extensions (Alpha or Beta not supported)

Transaction Configuration

The CAST Application Intelligence Platform enables users to measure Function Points from the source code of an application. CAST AIP measures the functional size of an application using Object Management Group (OMG)-compliant Automated Function Points.
However, Function Points are a functional size measure that is based on the application's specifications and requires knowledge of the intent of the application's designers (the "primary intent" mentioned in the OMG-compliant Automated Function Points counting manual). To make the measure possible, Function Point (FP) Experts and Consultants therefore need to calibrate the initial automatic count made by CAST. Calibration includes removing technical and temporary objects from the list of counted Function Points, aggregating several Function Points into one or splitting one into several, and changing the type of Data or Transactional Functions.

Official Documentation

AIP Product Support Documentation for Transaction Configuration:

  1. How to configure TCC: 
    http://doc.castsoftware.com/display/DOC82/TCC+-+CAST+Transaction+Configuration+Center

Entry Criteria:

  1. Analysis results are validated, and a snapshot is available
  2. Transaction Configuration Kit is available

Process Steps:

  1. Deploy Transaction Kit – Standard Entry and End points will be available from the technology specific configuration kit. It is available for download from the following link (Link to download Calibration Kit)
  2. Entry point configuration / Data Entity configuration / End of transaction configuration
    1. Trace and save all the transactions (identified in Consultant Help Document) in Enlighten
      Note: There should be a one-to-one mapping between the Enlighten Diagrams and the flows identified in the consultant help document.
    2. Validate the empty transactions. Document any broken transactions and missing links in Confluence
    3. Create specific rules for transactions which have been identified in the Application Discovery Document but not configured in TCC through the Transaction Kit. These rules include configuration of Transaction Entry Points, End Points, and Data Entities
      1. Verify with the Technical Lead if the new transactions can be added
      2. If yes, communicate this to the Architect and get a sign off
      3. Document the transactions properly and provide feedback in the standard AFP Configuration/Calibration page
  3. Review Transactions
    1. Verify if there are artifacts with high Data Element Types (DET) / Record Element Types (RET) values. If there are then check the objects with issues and remediate them. If the issue still exists, raise a support ticket
    2. Check Data Production Tracker (DPT)/ADP to see if there are any recommendations for changing default FP values
    3. Check the code coverage i.e., validate that all artifacts are covered in the transactions
    4. If there are artifacts which do not belong to any transaction, check whether they are in the scope of the module definition
    5. Generate the list of artifacts which are part of the module definition but are not covered by any transaction, share it with the Tech Lead, and take the remediation steps suggested by the Tech Lead
    6. Share the list of artifacts obtained in step 5 with the Architect
    7. If the code coverage is within an acceptable range, document the actions taken in the above steps
  4. Transaction Completeness
    1. If empty transactions exist, follow the steps below:
      1. If this is a configuration issue, go back to analysis configuration
      2. If this is a standard framework supported by CAST, raise a Support ticket
      3. Add links between objects that are not recognized by AIP out of the box through the Reference Pattern process, Knowledge Base (KB) update queries, and supported extensions to create missing objects/links
      4. Once the link is completed in Enlighten, compute in the Transaction Configuration Center (TCC), validate the flow, and continue with activity in Step 2
    2. Code coverage must be in the acceptable range

Exit Criteria:

  1. Document all the configuration and code coverage details
  2. Launch snapshot

Deliverable Outcomes:

  1. Transactions configured for the Application

Calibration and Reconciliation

The main aim of the CAST Transaction Configuration Center is to allow you to calibrate objects with regards to their Function Point size - in other words, you can choose how an object is interpreted by the CAST algorithm and thus how its Function Point size is derived (or you can manually set a Function Point size for an object), instead of having to use the default CAST configuration, which may not always be appropriate. This section provides information about how you can do this using the CAST Transaction Configuration Center. 

Official Documentation

AIP Product Support Documentation for Transaction Calibration:

  1. Transaction Calibration:
    http://doc.castsoftware.com/display/DOC82/TCC+-+Calibrate

Entry Criteria:

  1. TCC configuration is done
  2. Analysis results are validated
  3. Standard calibration kit is used
  4. New snapshot has been computed

Process Steps:

  1. Deploy Calibration Kit – download the latest function point calibration kit for your version of AIP from the CAST Extensions link. Apply the kit as described in the documentation. This will calibrate function point counts by removing or ignoring redundant transactions, and unifying duplicates.
  2. Rule-based Ignore / Rule-based Deletion / Rule-based Grouping / Rule-based Splitting – manually check the transactions in TCC for further calibration opportunities.
    1. Review the Entry & End Points list to avoid duplicate and redundant transactions
    2. Make sure the Data and Transaction filters are not ignoring, deleting, grouping or splitting wrong transactions. If there are such transactions, add them as exceptions by modifying the filter functions
  3. Rule based value / Rule based type
    1. Filter grouping rules based on naming, types, inheritance, and free definition
  4. Run a Preliminary Snapshot
  5. Manual Grouping / Manual Splitting / Manual Type Adjustment / Manual FP Adjustment (Optional)
    1. If the transaction configuration requires further calibration, make the adjustments manually


Exit Criteria:

  1. Document the configuration
  2. Launch a snapshot

Deliverable Outcomes:

  1. Transactions calibrated for the Application

Delivery Phase

The Delivery phase provides guidelines on how to run a snapshot, deploy dashboards, and hand over the results to the customers for consumption.

Official Documentation

  1. AIP Product Support Documentation for End User Guides (installing dashboards):
    http://doc.castsoftware.com/display/DOC82/End+User+Guides

Entry Criteria: 

  1. Function point configuration is completed
  2. Module definitions are properly configured in Analysis phase
  3. Scope and result expectations are verified
  4. If there are any specific credentials for the Engineering Dashboard, they should be frozen during the project phase
  5. Custom reports (if any) should be decided during acceptance


Process Steps:

  1. Run Baseline snapshot
  2. UI validation
    1. Launch dashboard either via CMS or deployed Web Archive (WAR)
    2. Validate that the Engineering Dashboard pages display data correctly, e.g. the Transaction-wide Risk Index (TwRI) page, Risk Indicator pages, Critical Violation pages, and Compare Version pages
    3. If there is any issue with the above step, follow the Troubleshooting Process
  3. Inventory Validation
    1. Check the Lines of Code (LoC) per module and validate against the inventory
    2. Validate the number of files, classes, tables, and artifacts per technology. If the numbers are not as expected, follow the Troubleshooting Process
    3. If this check is for maintenance, validate the variation in data from the Quick Access view
    4. If there are a lot of unexpected added/deleted artifacts in the above step, follow the Troubleshooting Process to fix the issue. If the issue is due to folder structures, reject the delivery and start again from the DMT phase
  4. Engineering Dashboard Metrics validation:
    1. Check the number of critical violations and validate a few of them to see if there are any false positives. If there is a false positive, follow the Troubleshooting Process
    2. Check the other metrics, such as Technical Debt and Cyclomatic Complexity
    3. If this process is for maintenance, check the evolution
    4. The FP count must match the FP count in TCC
  5. Analytics Dashboard Metrics validation:
    1. Consolidate the Engineering Dashboard with the Application Analytics Dashboard (Analytics Dashboard) if the Engineering Dashboard cannot be navigated to from the Analytics Dashboard
    2. Validate that the numbers displayed on the Analytics Dashboard match those on the Engineering Dashboard
  6. Delivery Report Generation
    1. Delivery reports must be prepared once all the validations are completed
    2. Prepare and document custom reports in Confluence (if any)
  7. Maintenance documentation
    1. This document groups together all miscellaneous backup, optimization, and maintenance documentation related to the CAST Storage Service
  8. Automation
    1. This page describes how to install and use the CAST Analysis Automation Application Programming Interface (API) to automate the process of analysis with CAST AIP


Exit Criteria:

  1. All completed validations must be properly documented and the reports uploaded to Confluence


Deliverable Outcomes:

  1. Application onboarding completed.


Troubleshooting Process


Entry Criteria:

  1. Errors/warning/alerts are encountered in the CAST tools (DMT, CMS, Enlighten, TCC)
  2. Global Application Intelligence Center (GAIC) troubleshooting guide

Process Steps:

  1. When an error or warning is encountered:
    1. Refer to the GAIC troubleshooting guide for CAST tools, which gives information about which errors/warnings/alerts can be ignored
    2. For remediation, resolution, and workaround details:
      1. Refer to the Technical Knowledge Base (TKB) for known issues and limitations
      2. Refer to Zendesk for similar issues faced previously
      3. Raise a Support ticket
      4. Consult Product Management (PMT) or the Research and Development (R&D) team
      5. Consult an SME if recommended by Support or PMT


Exit Criteria:

  1. Verify if the above steps have solved the issue


Deliverable Outcomes:

  1. All errors, warnings and alerts are remediated

Best Practices:

  1. Refer to the Zendesk for Solution Delivery best practices


Consumption Tools

Engineering Dashboard

CAST Application Engineering Dashboard (AED) is the technical dashboard for reviewing application analysis results. The Engineering Dashboard lets you drill down from violations to the violating object list and view the non-compliant source code. More details about the Application Engineering Dashboard are available here:
http://doc.castsoftware.com/display/DOC82/CAST+Application+Engineering+Dashboard+-+CAST+AED

Analytics Dashboard

CAST Application Analytics Dashboard (AAD) is the core AIP tool for consuming portfolio-level analytical information. The target users of AAD are the portfolio management and application management groups. AAD provides analytical, trending, and consolidated information for all applications in the portfolio. The cookbook for the various functionalities AAD offers is available here:
http://doc.castsoftware.com/display/DOC82/CAST+Application+Analytics+Dashboard+-+CAST+AAD

Report Generator

The CAST Report Generator is a standalone solution for automatic generation of reports based on CAST AIP results. The solution provides the opportunity (for example) to prepare and automate assessments for each new version of application analyzed with CAST AIP. Documents are based on Microsoft Office templates and they can be modified to prepare a specific template to meet a particular use case or to comply with a company format. After generation, the resulting document can be further adapted if necessary.
The Report Generator is based on the CAST RestAPI meaning that CAST AI Administrators and project managers alike can use the tool. Here is the link for more information on CAST Report Generator.
http://doc.castsoftware.com/display/DOCCOM/CAST+Report+Generator

Advanced Analytics Report (AAR)


Overview

Target audience – Business Analysts and Key Users who are interested in using CAST results analytics as part of advanced data analysis reports that may blend CAST metrics with their own data.
Summary – This section provides installation and deployment instructions for the back-end solution, along with the configuration of the automated data feed and a core set of standardized views. Report templates based on this solution may be available as separate field extensions.

Key Benefits

Enabling a new way to consume CAST analytics – Organizations can use their existing talent base and tools to rapidly discover new business insights from CAST analytics, and build ad-hoc dashboards and reports that better cater to the specific needs and goals of IT management and business users. They can create blended reports that correlate and link metrics from internal ERP, financial, and production monitoring systems to investigate data, expose correlations and interdependencies, and inform fact-based decisions.
Efficiency with easy governance – there is no need for unnecessary ETL cycles and schema maintenance activities, while governance is still ensured through easy-to-deploy granular access controls. Even the most complex AAR views can be developed and deployed to users without complex or lengthy data preparation, regardless of the structural complexity and organization of the input data.

DATAPOND: built upon REST to create / update data tables in a report manner

Use Cases

  • Self-service raw data exploration: You can explore and analyze CAST raw data sets of any complexity as they are produced, using familiar Business Intelligence (BI) tools. You can now use SQL to natively query and manipulate complex/semi-structured CAST Rest API JSON data. Blend Rest API queries with traditional CAST storage and dashboard service SQL queries to gain new insights and more effectively determine what is useful and what is not, without spending many IT cycles.
  • Business intelligence on CAST analytics: Specialized ANSI SQL extensions allow for instant flattening and native querying of even the most complex nested data. You can perform secure, concurrent, low-latency SQL analytics on your IT data. You can do instant joins across CAST data and your own data to create blended reports and dashboards that highlight correlations and interdependencies between CAST analytics and your own business and operational metrics.

Description

The AAR solution is built upon Apache Drill and is fundamentally a query engine that supports a variety of NoSQL databases and file systems, including HBase, MongoDB, MapR-DB, HDFS, MapR-FS, Amazon S3, Azure Blob Storage, Google Cloud Storage, Swift, NAS, and local files. A single query can join data from multiple datastores. For example, you can join a Rest API JSON response to text file output, to a PSQL query against a Central Schema, and to a user profile collection in MongoDB.
Leveraging Drill's JSON data model, AAR supports queries on complex/nested data as well as rapidly evolving structures. AAR/Drill provides intuitive extensions to SQL so that you can easily query complex data.
AAR query results are exposed as virtual datasets (views) that can be mapped into BI-friendly structures which users can explore and visualize using their tool of choice. Business users, analysts, and data scientists can use standard BI/analytics tools such as Tableau, Qlik, MicroStrategy, Spotfire, SAS, and Excel to interact with CAST Rest API JSON responses and/or SQL responses (and any non-CAST non-relational datastores) by leveraging Drill's JDBC and ODBC drivers.
A standard set of core views that combine selected CAST outputs is deployed and configured as part of the initial installation and is ready to use in any of the supported data visualization solutions.
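
The AAR views themselves are built with Drill SQL; purely as an illustration of the underlying idea (flattening a nested CAST Rest API JSON response into a BI-friendly tabular view), here is a small Python sketch. The endpoint URL, resource path, credentials, and field layout are hypothetical and depend on the Dashboard Service deployment being queried.

import requests
import pandas as pd

# Hypothetical CAST RestAPI base URL; adjust to the actual deployment.
BASE_URL = "http://castserver:8080/CAST-RESTAPI/rest"

def fetch_and_flatten(path: str) -> pd.DataFrame:
    """Fetch a nested JSON response and flatten it into a BI-friendly table."""
    resp = requests.get(f"{BASE_URL}/{path}", auth=("operator", "password"))  # hypothetical credentials
    resp.raise_for_status()
    # json_normalize turns nested JSON records into flat columns - the same
    # kind of flattening that AAR exposes to BI tools through Drill SQL views.
    return pd.json_normalize(resp.json(), sep=".")

if __name__ == "__main__":
    # Hypothetical resource path for a snapshot result set.
    table = fetch_and_flatten("AAD/applications/3/snapshots/latest/results")
    print(table.head())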

Cast2Jira

The CAST AIP Action Plan to Jira Jenkins plugin is designed to allow the user to export the contents of the CAST Engineering Portal (CEP) Action Plan to Jira as Jira tickets. The plugin connects to the CAST database, extracts action items based on their priority, and uses the Jira REST API to create the tickets.
Below is the documentation to configure and use the plugin
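
Separately from the plugin documentation referenced above, and as a rough illustration of the export concept only, the sketch below pushes action-plan items to Jira through the standard Jira REST API (POST /rest/api/2/issue). The Jira URL, project key, credentials, and the CSV export of the action plan are hypothetical stand-ins for the plugin's own extraction from the CAST database.

import csv
import requests

JIRA_URL = "https://jira.example.com"   # hypothetical Jira instance
JIRA_AUTH = ("cast.bot", "api-token")   # hypothetical credentials
PROJECT_KEY = "APP"                     # hypothetical Jira project key

def create_issue(summary: str, description: str, priority: str) -> str:
    """Create one Jira ticket for an action-plan item and return its key."""
    payload = {
        "fields": {
            "project": {"key": PROJECT_KEY},
            "summary": summary,
            "description": description,
            "issuetype": {"name": "Task"},
            "priority": {"name": priority},
        }
    }
    resp = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=JIRA_AUTH)
    resp.raise_for_status()
    return resp.json()["key"]

# Hypothetical CSV export of the CEP Action Plan (columns: rule, object, priority).
with open("action_plan_export.csv", newline="") as fh:
    for row in csv.DictReader(fh):
        key = create_issue(
            summary=f"[CAST] {row['rule']} - {row['object']}",
            description=f"Violation of '{row['rule']}' on {row['object']}.",
            priority=row.get("priority", "Medium"),
        )
        print("Created", key)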

Appendix

Template – Qualification Phase

  1. Service Engagement Onboarding Template

    Service engagement Sheet - V1.3



    #

    Service engagement question


    1

    Documents



    ·         Technical Survey



    ·         Others if available


    2

    Offshore deadline (specify the date)


    3

    Client presentation deadline (specify the date)


    4

    Split by Version – specify versions and sequence


    5

    Client constraints and background - support for estimation



    ·         Explain constraints & background



    ·         PS project for booking


    6

    Primary regional contact for front office validation and source code extraction support


    7

    Do you want us to publish the results or send the dumps via File Transfer Protocol (FTP)?


    8

    Source code



    ·         Location



    ·         Current account details


    9

    Analysis type



    ·         Pilot



    ·         Initial analysis service



    ·         Recurring analysis service


    11

    Cast AIP version


    12

    Aggregation type


    13

    Do we need to enable Security Flow Analyzer (Y/N)



    ·         Specify sanitization methods if available


    14

    Do we need to enable escalated links (Y/N)


    15

    User Designed Module (UDM) design (mark your choice and provide the basis)



    ·         Technology split



    ·         Functional Domain split



    ·         Other, please specify


    16

    Transaction configuration



    ·         Required Y/N


    17

    Snapshot labeling 



    ·         Default will be the execution date


    18

    Extra Extra Large (XXL) information



    ·         Required Y/N


    19

    CAST Architecture Implementation



    ·         Required Y/N


    20

    Diagnostics update required



    ·         Specify what needs to be changed


    21

    Specific FO validation points that are important to know


    22

    Comments/other inputs


  2. Service Engagement Onboarding template

3. Technical Survey

Latest version is available at  Technical Survey#Technical_Survey_LIGHT

Template – Acceptance Phase

Source discovery

  1. Source code comparison checklist with drop site – The below checks must be performed on all delivered source files provided by the Application Team before configuring DMT packages

    #

    Check

    OK?

    Justification for Exceptions

    1

    Match the provided source code at drop site with the Technical survey and architecture document (look for the associated technology file ext.)



    2

    Map the high-level module mentioned in the architecture document with the provided source (one way is to look for matching folder names)



    3

    Cross check the framework related files (e.g. Struts) mentioned in the Technical Survey with the provided source



    4

    Folders with release names or branch names (like Trunk, dev, head, svn) should not be part of the source delivery. If they exist, cross-check the actual structure of the source code with the Client (if required) or remove/rename them before packaging



    5

    Folder name must not contain the version number as this will cause added/deleted objects in rescans. Folders should be renamed without version numbers and both folder structure and naming conventions must be maintained in all future rescans



    6

    In case of duplicate code (e.g. if project name is different but the code is same, or if there are any backup folders), verify findings with the Client.  Duplicate code should be removed after client confirmation



    7

    Take backup of drop site source code at:

    Source: R:\Sources\[App Name]\[Version]Source_Provided\Source

    DB: R:\Sources\[App Name]\[Version]Source_Provided\DB



    8

    Verify that project files are delivered (e.g. csproj are delivered in .NET, project files or .pom files in java)



  2.  Source code comparison checklist with DMT Package – Below checks must be performed during DMT packaging

#

Check

OK?

Justification for Exceptions

1

Cross check if the package for Drop Site root folder (containing the source) is selected. (DMT should be able to discover all technologies)



2

If SVN/TFS/Git: make sure only the source folder is selected and no release names or branch names (like Trunk, dev, head, svn). Also, cross check credential for SVN/TFS/GIT



3

Database check: If it is an offline extraction, check that all the schemas mentioned in the Technical Survey are present. If it is a live connection, select all schemas in DMT as mentioned in the Technical Survey



Onboarding Checklist

Ensure that the DMT source packaging has been configured correctly as per the technologies in the Application. For further details on how source code is delivered in DMT for each technology, refer to http://doc.castsoftware.com/display/DOC82/Technology+Delivery+Instructions

#

Check

Tools/ References

Result

1

List of technologies as compared to technical survey or ADP


Technologies seen in ADP / Technologies discovered in DMT




2

Ensure all code not declared but delivered is reported in the App Discovery document



3

Ensure all code declared but not delivered is reported in the App Discovery document



4

Are any technologies unsupported by the version of CAST being used? If found, list them in the Results column; reject the application if the technology breaks transactions, or inform the client that that portion of the code will not be covered

Refer to supported technology section in Documentation


5

Are all the DMT alerts re-mediated?



6

Are DMT filters added to exclude test code?

Configuration Best Practices Documentation


7

Are DMT filters added to exclude not relevant file extensions?

Refer to the best practices for the relevant technology (e.g. remove .exe, .so, .pch in C/C++), and remove irrelevant extensions such as .docx, .xls, etc.
Also, to exclude irrelevant extensions, refer to Delivery Manager Tool - Information - File Types and folders which can be ignored


8

Are DMT filters added to exclude application team provided exclusions?

Refer ADP


9

Are the files without extensions justified?

Refer to package review tab in DMT for the count of such files


10

Are archive files(war/ear/tar/zip) delivered through DMT?

Refer to Best practices for relevant technology(unzip, look for source/config xml files, check with client, redeliver)


11

For the XXL diagnostics use case: check that *.sqltablesize files are delivered

Refer to the kick-off slide/service engagement sheet


12

Check with the Application team how Materialized Views (if any) are used in the application




Additional Checks for DMT  


13

Identification of middleware and reporting frameworks

Mention the usage of these specific 3rd-party modules to the AIA so that such entities are configured correctly. An extension can be leveraged to cover 3rd-party elements such as BRMS, SOA, JasperReport, BIRT, etc.


14

Identification of 3rd party JavaScript libraries

List the standard files of third-party JavaScript libraries. Mention the presence of 3rd-party JavaScript libraries to the AIA; the AIA should ensure that these files are analyzed but excluded from the modules.


15

Search the delivered code to discover some expected technology, and then confirm if those technologies have been identified during the App Discovery phase.

Execute the smell tests. Example: for any Mainframe application, if the database has not been delivered, check the delivered source code for DCLGEN statements. If DCLGEN statements are present, the application uses a database which has not been delivered, and the next step is to reach out to the Application team for the database.
Additional checks should be performed after source code deployment to ensure the completeness of the delivered code. If the source code contains any EXEC CICS syntax, then CSD files should be delivered. Similarly, the presence of the CBLTDLI syntax indicates that IMS files should be delivered. (A minimal search sketch is shown below.)
Leverage the SCCount tool, for example.
For any Java application, make sure that the jar files delivered belong to that application. Also, select the Java reference in the class path and not the corresponding jar file.


16

Has initial analysis been completed with the packaged source code?

Initial Analysis (Out of the box analysis) - After setting the current version, install the required Technology/Framework extension, Configuration/Calibration Kits and launch Analysis +snapshot.
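
A minimal sketch of the kind of keyword search described in check 15, assuming the delivered Mainframe sources sit in a local folder; the folder path and file extension are hypothetical.

import pathlib

# Keywords whose presence implies that additional artifacts must be delivered
# (per check 15): DCLGEN -> database DDL, EXEC CICS -> CSD files, CBLTDLI -> IMS files.
INDICATORS = {"DCLGEN": "database expected but not delivered?",
              "EXEC CICS": "CSD files expected",
              "CBLTDLI": "IMS files expected"}

SOURCE_ROOT = pathlib.Path(r"R:\Sources\MyApp\V1\Source")   # hypothetical drop-site copy

hits = {key: [] for key in INDICATORS}
for src in SOURCE_ROOT.rglob("*.cbl"):                      # hypothetical extension
    text = src.read_text(errors="ignore").upper()
    for key in INDICATORS:
        if key in text:
            hits[key].append(src.name)

for key, files in hits.items():
    if files:
        print(f"{key} found in {len(files)} file(s) -> {INDICATORS[key]}")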



3. DMT report

Attach the report generated by DMT which lists file counts and total size per extension. The information from DMT can be used to fill the below table

Extension

File Count

Size

Technology

Should be analyzed? (Yes/No)

Justification if No








4.  Technology & Framework Analysis Approach

Following approaches can be implemented to configure an application for analysis:

Supported Technologies can be found at http://doc.castsoftware.com/display/DOC82/Supported+Technologies

5. Source Code Validation Summary 

#

Check Point

Trigram (Role)

Validation Result (Accepted / Rejected)

Date Updated

1

Are all the above checks completed and is the source code ready to be accepted for analysis?


<In case of rejection, email needs to be sent to FO>



6. Architecture Discovery Process

This section is detailed in the presentation below (Architects_User_Document.pdf).

7. Architecture Discovery Review checklist

Quality check to ensure the completeness and correctness of the Architecture Context Document

Transaction Flows 


Yes/No/NA

Comments

1

Check if all the technologies mentioned in Technical Survey are referred in the Application Architecture Overview



<Do not leave this blank>

2

Check if all the file types delivered as part of code base are covered in Application Discovery



<Do not leave this blank>

3

Check if all the un-analyzed file types are explicitly mentioned in the application discovery



<Do not leave this blank>

4

Check if the Application Context Diagram/Transaction flows given by the client corresponds to the source delivered through DMT



<Do not leave this blank>

5

Check if the frameworks identified in the application are referred in the Application Architecture Overview



<Do not leave this blank>

6

Check for the validity of any new transaction flow discovered by the Architect



<Do not leave this blank>

7

Check if the diagrams adhere to the standards described for the project



<Do not leave this blank>

8

All unique flows should have corresponding sample in the Consultant Help Document



<Do not leave this blank>

Entry Point/Exit Points 

9

Check for standard Entry/Exit Points for the framework identified are included in the App Discovery.
In case of any deviation from the standard Entry/Exit Point, is there a valid justification?



<Do not leave this blank>

10

Check if all the Entry/Exit Points identified in the Transaction Flows are included in the App Discovery



<Do not leave this blank>

11

Check for the validity of new Entry/Exit Points discovered by the Architect



<Do not leave this blank>

12

The Entry/Exit points should be well defined and clear to consultant/Lead



<Do not leave this blank>

Assumptions and Open Questions 

13

Check if the Assumptions and Open Questions are valid and can be understood by the client



<Do not leave this blank>

14

Check if all the open questions are assigned an assumption



<Do not leave this blank>

Reviewed By 



Approved (Yes/No)


Date

< Trigram >






Template – Analysis Configuration Phase

Analysis Log Book

  1. Open Issues/Questions
    This section contains questions/issues from the Source Discovery phase until the analysis is completed. The AIA working on the analysis is responsible for logging the questions and updating their status whenever there is a resolution/response.
    The Risk in the table below can be one of the following:
    • Blocking - We cannot proceed
    • Critical - We can proceed, but it will impact the analysis results
    • Low - Can be ignored; little impact on results, but confirmation is needed


#

Issue /Question

Reporting Date

Risk

Impact

Action On

Resolution

Resolution Date

Status

1









2









2. Application Discovery Handover

For application architecture details, refer to the Application Discovery document, Consultant Help Document, and Client approval email in the AIC tracking page for the application


Date

Checklist

1


Template - Architecture Discovery Hand Over Meeting Checklist


3. Pre-analysis Checklist

The mandatory checks below should be performed before and after an analysis for all versions. Once the checks are done, state the status and add comments where needed.

#

Checkpoints

Checked(Yes/No/NA)

Comment

1

License from the customer to be applied.



2

Correct Application name and version to be added. DMT version date should be source code delivery date.



3

Check the log file, LTSA, and LISA location settings in the CAST-MS preferences / platform settings.



4

Automation Framework (AOP-Jenkins): Ensure the application is configured in AOP.


<Check with Lead/DM for details>

6

Make sure that all the discovered files and the discovered technologies/frameworks are configured in CMS.



7

All frameworks should be identified, and versions should match the code.



8

All the required extensions as per app discovery are installed and configured.



9

Check that technology-specific settings are configured, e.g. the classpath/working folder for J2EE, include paths for C++, and the working folder for Mainframe jobs



10

Add dependencies for frameworks/technologies



11

Add default dynamic link rules


production tab

12

Add required content enrichment jobs as per discovery document



13

User input security to be activated and configured [black boxing, sanitization method, entry point methods] only when there is a requirement from Client



14

If XXL table info has been delivered, make sure the server name as well as the schema name match the schema analyzed.



15

Architecture rules should be defined as per the inputs from architects



16

Default module configuration to be applied unless there is a specific requirement from the client.




4. CMS setup validation (QC2)


S.No

 

Comments

Assessment

1

CMS preferences(log/LISA/LTSA/Delivery/Deploy) are aligned with the best practices and project specific environment settings



2

All the files and extensions retrieved by DMT are referred in CMS





(Proceed or rework)






3

All the frameworks with right versions and technologies defined in the App discovery document are configured in CMS


4

Any CAST extension required to analyze an application is installed


5

Apply technology content enrichment jobs, dependencies, DLM rules, Architecture rules


6

Confirm if the server configurations (Hard disk space, RAM, Free CPU) are sufficient for the volume of the code being analyzed
Refer to deployment sizing guidelines


7

Verify the System Name, application name, version, label in the analysis settings in CMS.


*  Analysis can be started only when QC2 is passed.


5. Assessment Model Settings (based on customer requirement)

#

Check

Answer

1

What is the consolidation type of assessment model?


2

Any requirement to disable metrics? If yes, please specify which metrics are disabled.


3

Any requirement to change diagnostic parameter/weight/threshold? If Yes, please specify the changes.


4

Any requirement to create technical criteria / health factor metrics? If yes, please specify


5

Any requirement to change in the critical flag? If Yes, please specify the changes


6

If any of the Extensions used for the app contains quality rules, check that the quality rules are present in the application assessment model



 6. Analysis Details

 

Source Code Preparation

Details / Screen Shots

Comments / Details

1

Database Code, any custom change in the Instance name or schema name

(Provide the details of Instance and schema names configured and how it is taken care of in the automated re-scans)



2

Code - What organizational steps are taken to convert delivered source code into analyzable source code?

Is it compatible with automated re-scans?


Pre-process the code to an analyzable format

3

Code - Test code removal actions for all layers including database

Should be done in DMT to handle automated re-scans



4

Code - Any Pre-processing done on the source code should be documented and added as a pre-analysis job in CMS



 

Current version (Analysis Units)

 

Custom/User defined analysis units

5

Create relevant analysis units if it is not created by DMT



 

Analysis Tab

 

 

6

Verify/Add extensions for the files which are not covered by the default setup



7

Ensure the following are configured: Working Folder/Classpath/Includepath as appropriate


  • If pointing outside of Deployment folder, must be on a location which is accessible from all analysis servers which will be used
  • If inside the Delivery, must be a version independent path

8

Create/import required custom environment profiles.



 

Dependencies

 

 

9

Verify/Add dependencies for analysis units and Reference patterns/ Refine targets



 

Production Tab

 

 

10

Configure additional DLM Rules to validate/invalidate dynamic links as required (Provide details in Section 6 below)



 

User Input Security (only if required)

 

 

11

Setup the user input security parameters [user input methods, Blackboxing, target methods]



 

Content Enrichment

 

Explain the relevance of below jobs with Enlighten Diagrams

12

"Tools before Analysis" jobs


Add explanation for all the jobs

13

"Tools after analysis" jobs


Add explanation for all SQL jobs/UI here

14

Import all relevant enrichment jobs


Any Sql queries for marking artifacts as external?

15

UDM Configurations


Recommended to use default module configuration unless there is a specific requirement

16

Import all relevant SQL jobs for post module generation



 

Analysis Execution

 

 

17

Were any crashes/errors/critical warnings observed during the analysis? If yes, what action was taken to resolve them?






7.  Log Validation 

In case of multiple runs, please make sure to update the last run details
Rationale: the CAST Analyzer reports warnings and errors through the generated .castlog files. However, log files can be tedious to read. The Log Manager inserts all entries into a table, so instead of reading the logs we can count the most important messages in that table. The difference in counts between V2 and V1 (the previous version) is an indication of the validity of the analysis. If the source code size (LoC) increased only a little but the count of error messages increased a lot, this is a gap which must be investigated and is worth investing time on. If the increase in source code size and the number of error messages are in line, this is a waiver to bypass precise checks, and hence save time during the reanalysis.
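
Purely as an illustration of the rationale above, the sketch below compares message counts between two versions relative to the growth in LoC; the counts, LoC figures, and the flagging threshold are hypothetical and would normally come from the Log Manager table.

# Hypothetical message counts taken from the Log Manager table for the
# previous (V1) and current (V2) analyses, plus the LoC of each version.
v1 = {"errors": 120, "warnings": 950, "loc": 410_000}
v2 = {"errors": 310, "warnings": 980, "loc": 425_000}

def growth(new: int, old: int) -> float:
    """Relative growth between two counts (0.0 means unchanged)."""
    return (new - old) / old if old else float("inf")

loc_growth = growth(v2["loc"], v1["loc"])
for kind in ("errors", "warnings"):
    msg_growth = growth(v2[kind], v1[kind])
    # Hypothetical threshold: flag when messages grow much faster than the code base.
    flag = "INVESTIGATE" if msg_growth > loc_growth * 2 else "in line"
    print(f"{kind}: {v1[kind]} -> {v2[kind]} ({msg_growth:+.0%}) vs LoC {loc_growth:+.0%} -> {flag}")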
Summary of Warnings/Errors 

 

Name of Error/Warning

Root Cause

Action taken

Reported to error documentation library/TGU (Yes / No)

1





2






8.  DLM Validation  

In case of multiple runs, please make sure to update the last run details. Use Analysis Operation Portal (AOP) to obtain the DLM report to verify DLMs


Topic

Answer (Yes/No)

Justification Comments

1

Do you see a lot of links in unexpected caller and callee combinations? For example, calls from eFile to Database?



2

If there is a database, is there a case of no DLM from the application to the database? If so, this should be investigated.



3

DLM parameterization and DLM rules details.




9.  Boundary Definition

In case of multiple runs, please make sure to update the last run details


Boundary Actions

Action Details

1

Was any exclusion implemented? If yes, add the list of exclusions made.
Exclusions should be handled for automated re-scans


2

Provide the list of schema/DB analyzed in the application


3

Are all the files discovered by DMT analyzed and are all covered in module(s)? If not, please provide the justification.


4

Provide the details of the module definition and list them here.
If the module definition is based on an explicit query, provide the query and confirm that the performance of the query has been checked



10.  Analysis results validation (QC3)



Check Point

Checked (Yes/No)

Validation Result (OK/Actions Assigned)

Date updated

1

All files discovered by DMT are analyzed and they are part of the module(s). If not, check if proper justification is provided. Follow Consistency Review checks. 




2

All technologies and frameworks mentioned by the architects in discovery document are properly setup in CMS and corresponding results are validated




3

Review the links created by customization. Make sure no irrelevant links are created between technologies/framework




4

DLMs have been reviewed and parameterization or rules have been added




5

Analysis Configuration has been validated by Lead to make sure it is healthy and all the checks in the page are confirmed and documented




6

Module content validation: All artifacts are a part of module content. If some artifacts/files are missing, provide justification




7

All database subsets have been removed from the analysis




8

Analysis execution validation and Automation Validation completed. This includes check of log files and validation of automation to ensure execution has gone without issues.
Confluence must be updated before lead validation



 
 

9

The consolidation level is set as per the service engagement sheet. If not specified, the product default should be used.




10

Logs validation has been completed.




11.  Backup


Action

status

Comments/Details

1

Take Backup of local, central and mngt schemas and update the details in section "Backup Details (Manual)"


Use below naming convention for the backup filename/comments

 [Schema name]_[ver]_aft_QC3_ddmmyyhhmm(24hrformat)

Note: Once onboarding is completed, make sure this backup is cleaned up

For Pilots projects: don't delete the backup

Backup Location:

           R:\Sources\[Appname]\Schema_Backup

Template – AFP Configuration/Calibration Phase

  1. Entry Points

    #

    Type of entry point

    object Type

    Description*

    1

    Ex: jsp


    Why this choice?

    *For standard Entry points, mention as "Standard"

  2. End Points / Data Function

#

Type of endpoint

Object Type

DET (8.0 and above)

RET (Only applicable for Data Function)

Contribution

Description

1

Ex: CFT file

DataFunction/Exit point

what is its role in the application?




Why this choice?


3. Specific Links added / Content Enrichment

#

type

Name

Details here

(Explain if the pattern is a standard one and can be used as an extension)

Link to standard Script/Query used (if any)

1

Update Cast Knowledge Base




2

Reference Finder




3

Universal Importer




4. Transaction sample

#

Name

Enlighten Print

Flow Pattern ID 

Enlighten file name 

1

Ex: JSP to Database

<Enlighten screen shot>

Flow_1_JSP_TO_DB

jsp_to_db

2

Ex: JSP to Database via procedure


Flow_2_JSP_TO_DB

jsp_to_proc_to_db

*Order, Flow Pattern ID, and Objects in Enlighten Print should match with Architect document (Consultant help document)


5. Analysis Execution

S.No.

 Check Point

Details / Screen Shots

Comments / Details

1

Were there any large SCC groups identified in the analysis results? Have they been handled?

Check the DssEngine*log.txt file which is generated during snapshot for the application and look for the following section 

Also, high DET & FTR values for transactions are an indication of existence of SCC groups

Refer to the troubleshooting guide in official documentation for steps to resolve this condition


Provide the steps that were taken to remove the condition



6. Smell tests

Refer to the documentation to get details of the smell tests (https://doc.castsoftware.com/display/FBP/Offline+Smell+Test+-+OST):

Smell Test

Test Result (Red/Amber/Green)

Details and justification

Artifact Coverage Ratio



Empty Transaction Ratio



TFP: DFP Ratio



Artifact transaction count to Program count ratio




7. Transaction Completeness check 

Run the queries at Transactional Function Completeness page and attach the collected data.

Sl. No

Data points to collect

File Attachments

Comments

1

Excluded Data Entities



2

Deleted or ignored Transactional Functions



3

Deleted or ignored Data Functions



4

List of Data Functions



5

List of Transactional Functions



6

List of Transaction Entry Points that have been merged



7

List of Transaction Functions that have been adjusted



8

List of empty transaction Functions not ignored or deleted


Report containing list of empty transactions and their justifications

9

List of Data Entity candidates



10

List of Data Functions that have been merged



11

List of Data Functions that have been adjusted



12

List of ignored or deleted Data Functions




8. TCC configuration/results validation (QC4)

#

Checkpoint

Checked
(Yes/No)

Lead Validation  

Date

Checked By

Architect Validation 

Date

Checked By

1

Entry points, End points and Data functions mentioned in App Discovery have been configured in TCC




<Lead Trigram>



<Architect Trigram>

2

Are the additional entry/end point configurations valid?
For example: configurations imported from a standard tcc config file




<Lead Trigram>



<Architect Trigram>

3

Are the transactions with 0 FP count justified?




<Lead Trigram>



<Architect Trigram>

4

Are transaction samples of all unique flows identified in the App Discovery documented in the Transaction Sample section above?
Verify that exactly the same flow names and object names mentioned in the Consultant Help Document are used in the transaction samples section above.
Any deviations should be mentioned and justified.




<Lead Trigram>



<Architect Trigram>

5

Smell tests are run and the results are justified.




<Lead Trigram>



<Architect Trigram>

6

Big potatoes have been identified and addressed.




<Lead Trigram>




9. Backup


Action

Status

Comments/Details

1

Take backup of local, central and mngt schemas and update the details in section "Backup Details (Manual)"


Use below naming convention for the backup filename/comments

 [Schema name]_[ver]_aft_QC4_ddmmyyhhmm(24hrformat)

Note: Once onboarding is done, make sure this backup is cleaned up

          For Pilots projects: don't delete the backup

Backup Location:

           R:\Sources\[Appname]\Schema_Backup

10. Calibration Summary

If a standard calibration kit (i.e. data and transaction filter SQL query) is used, note the version used or the location from where it is picked. Please inform respective owner (PMT/SME) when there is a customization done on the kits


Standard Kit used

Details / Version of the kit


Customization applied

1

Calibration Kit




2

Configuration Kit




3

Analysis configuration kit (environment profile + advance JCL)





Config Check


Details/Explanations


1

List of manually Suppressed Transactions




2

List of manually Suppressed Data Functions




3

List of manually grouped Transactions




4

List of manually grouped Data Functions




5

Details of changes done in Standard calibration/configuration Kit
(Update the comments section on changes done: link to calibration kit page)





11. Calibration Validation (QC5)

#

Check Point

Checked

(Yes/No)

Assessment

Lead Trigram

1

Are the calibration parameters acceptable?

  1. Deleted/Ignored data/transaction functions
  2. Grouped data/transaction functions
  3. Split data/transaction functions





Template – Dashboard Validation Page

  1. Inventory Validation

If App Compare is not used, retrieve the number of files received from the DMT Package Content tab; otherwise use a query to retrieve both details

File extensions

No. of files received 

No. of files analyzed by CAST 

Justify difference

Extension 1




Extension 2





2. Dashboard Validation -  Validation Summary

#

Validation Criteria

How

Validation Status

CAST Comments

1

Check and inspect critical diagnostics

Please use the dashboard validation Excel to identify whether all critical violation results are correct and there are no false positives. It is a detailed check; you should validate 10-15% of artifacts and add your remarks here.

Not Started


2

Check scope

Check the files provided with analyzed files visible in the dashboard

Not Started


3

Check Module content

Check whether all User Defined Modules and specific inclusions / exclusions by customer have been implemented in the dashboard

Not Started


4

Check advanced features (If requested by customer)

Check presence of XXL rules or CWE rules (Security Data Flow) as part of Quality Model if requested by the customer.

Not Started


5

Check availability of all dashboards

Check below pages to confirm Engineering Dashboard has been configured correctly:

    • Size Baseline
    • Application assessment level
    • Quality Model Drill down (violating source code should be visible)
    • TRI

 Not Started


6

Check for dashboard login credentials

Ensure LDAP logins are working if LDAP is configured

 Not Started


7

Are the number and label of snapshots in line with the snapshot retention policy (removal of old snapshots and consolidation)?

Use the Quick Access page to validate the version number and name of the snapshot. Make sure all User Defined Modules appear on the Quick Access page for that snapshot

Not Started


8

Function Point in TCC and Dashboard

Match the count of Data and Transaction functions in TCC against the count in the Dashboard. If the numbers are not the same, fix the discrepancy and generate another snapshot

Not Started


9

Customization results

If any customization (Assessment Model changes, Architecture Rules, new quality results) has been done for the application, the results in the dashboard should be validated. Refer to "Analysis Execution" under section 3 of the Analysis Log Book for the list of customized settings

Not Started
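
For checkpoint 1 above, a minimal sketch of how a 10-15% sample of violations could be drawn for manual inspection, assuming the critical violations have been exported to an Excel file with one row per violating artifact; the file names are hypothetical and the export itself comes from the usual dashboard validation Excel:

# Draw a ~12% random sample of exported violations for manual review.
# File names and sheet layout are illustrative placeholders.
import pandas as pd

violations = pd.read_excel(r"R:\Sources\MyApp\dashboard_validation_export.xlsx")
sample = violations.sample(frac=0.12, random_state=42)
sample.to_excel(r"R:\Sources\MyApp\violations_to_review.xlsx", index=False)
print(f"Review {len(sample)} of {len(violations)} exported violations")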



3. Delivery Reports

Attach all reports to be sent to the customer

Date Generated

Delivery report name

Attachment


 Preliminary Findings





4. Dashboard Validation Checklist (QC6)


#

Checkpoint

Checked (Yes/No)

Validation Result (OK/Actions Assigned)

Date updated

1

Source code discovery report & code validation

Compare the inventory between the source code discovery report and the Dashboard

Check the initial email from the application team/FO and make sure the scope is covered.



2

Diagnostic validation sheet

Ensure the Diag Validation is completed by the AIA who worked on the application.



3

Screenshots of TCC and Dashboard for FP

      1. The values for DFP and TFP should be the same between TCC and the Dashboard.
      2. Module LOC values show up as expected




4

Engineering Dashboard and Analytics Dashboard validation

      • Check the TOP 5 TECH to ensure there are no 0 values.
      • Check if the System name, Application name and snapshot name are as per the service engagement sheet
      • Check if the Analytics Dashboard is linked with the Engineering Dashboard
      • Check all the critical violations on the dashboard.



5

Confluence Updates

Check if all the Confluence pages are updated



6

CMS Settings 

Check if all the jobs are active



7

Delivery Report

Verify that all the delivery reports are attached in the previous section




5. Housekeeping Actions


#

Action

Status

1

Please complete the housekeeping actions below and update the status (an archiving sketch follows this table)

      1. Backup of KB/CB/MNGT and update the details in the section "Backup Details (Manual)"
      2. Source code is copied to the Archive using automation
      3. Logs are backed up using automation

Use the naming convention below for the schema backup filename/comments

 [Schema name]_[ver]_aft_QC6_ddmmyyhhmm(24hrformat)

Schema Backup Location:

           R:\Sources\[Appname]\Schema_Backup

Logs and Source code Archive location:

           I:\Implementation\Archive\[Appname]\[VER]

                      Source Analyzed

                      DB Analyzed

                     Logs

2

Were any new UAs or plugins developed or enhanced? If so, publish them to the Field Extension library
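
A minimal illustration of the archiving step that the automation performs, copying analyzed source, analyzed DB extraction and logs into the archive layout above; all source paths and the application/version names are placeholders, and the standard automation should be used in practice:

# Copy analyzed source, analyzed DB extraction and logs to the archive location.
# Source locations below are illustrative; adjust to the actual layout.
import shutil
from pathlib import Path

APP, VER = "MyApp", "v1"
archive_root = Path(rf"I:\Implementation\Archive\{APP}\{VER}")

targets = {
    "Source Analyzed": Path(r"R:\Sources\MyApp\Deployment"),
    "DB Analyzed": Path(r"R:\Sources\MyApp\DB_Extraction"),
    "Logs": Path(r"C:\CAST\Logs\MyApp"),
}

for folder, src in targets.items():
    dest = archive_root / folder
    shutil.copytree(src, dest, dirs_exist_ok=True)
    print(f"Archived {src} -> {dest}")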



Template – Onboarding release audit checklist


    1. Housekeeping Actions

      #

      Auditor

      Date

      References

      Time taken

      Auditee

      Overall Feedback

      1

      <Auditor Name>

      <Date of Audit>

      <What references you used to conduct this Audit>

      <Time taken>

      <Auditee Name>

      <Go / No Go>

2. GAIC Quality Checklist - Onboarding

#

Quality Assurance Checks

Reference

Assessment

(Y/N/NA)

Comment

1

Is the Confluence tracking page up to date for this delivery and have the latest templates been used?

Tracking page



2

Is the Service Engagement Sheet / ADP available and are all the details filled in? Check the version number, label, date and module split.

Check Service engagement sheet in DPT / ADP



3

Was the technical survey form / ADP provided by FO/ Customer?

Check Service engagement sheet in DPT



4

Has the source code been accepted by the architect using DMT?

Source code discovery



5

Was the App discovery reviewed by a peer and did the review pass? (QC1)

Architecture review checklist



6

Was the App discovery shared with the customer/FO and accepted by the customer? Was acceptance deemed in case the customer did not respond within the stated time frame?

Mail to customer



7

Was a handover meeting conducted between the Consultant, TL and Architect? In case of CIs/Pilots, was Solution Design involved?

Meeting invite



8

Was the CMS setup validation done and passed? (QC2)

 Analysis log book



9

Did the analysis results validation pass before proceeding to configuration? (QC3)

Analysis log book



10

Were there issues encountered in the analysis? Were all the open questions and issues closed before the analysis?

Issues / Analysis log book



11

Were the TCC configuration/results validation checks done by the lead and architect? (QC4)

AFP Configuration page



12

Were Enlighten diagrams created to display key transactions and mapped to the flow pattern?

AFP Configuration page



13

Was the calibration validation done by the lead before generating a snapshot? (QC5)

AFP Configuration page



14

Did the AI Admin complete the dashboard validation using the standard checklists and applicable tools, and was it validated by the Lead? (QC6)

Dashboard Validation page



15

Are all the Zendesk tickets closed? If any Zendesk tickets are still open, have they been recorded under limitations?

Sec 5 of AIC tracking page




Ensure that the backup is available and update the details in the section “Backup Details (Manual)”. In case calibration is applicable, ensure that the backup is available before calibration is done.

Section “Backup Details (Manual)”



16

Has the Analytics Dashboard been updated? Is the link between the Analytics Dashboard and the Engineering Dashboard (using the microscope icon) working? Check the Engineering Dashboard for versions 8.0 and above.

Analytics Dashboard, Engineering Dashboard



17

Is the FP count in sync between the delivery report, the dashboard and TCC?

TCC, Engineering Dashboard, Delivery report



18

Is the source code visible on the dashboard?

Engineering Dashboard



19

Do you see any unexpected technologies while navigating the dashboard and looking at "module LOC by technology"?

Engineering Dashboard



20

Have you checked that the TwRI figures show up on the transaction view on the dashboard?

Engineering Dashboard



21

Do the User Input Security Diagnostics appear on the dashboard (if applicable)? Check for CWE-related rules.

Engineering Dashboard



22

Was the standard Delivery Report template sent to the onshore team?

Refer Delivery Report Onboarding



23

Have all measurement limitations been described on the Confluence page and in the delivery report? Look for open Zendesk tickets / assumptions in the App discovery document.

Delivery report



24

Is the delivery report complete and does it contain all the information, with no missing sections? In case of AT&T, are the Empowerment reports available?

a) AFP baseline

b) Health check report

c) List of rules

Empowerment reports



25

Have DLM filters been used?

CMS



26

Are all big potatoes addressed?

TCC



27

Reuse done?

Process / Automation / Knowledge





Backup Details (Manual)

#

Backup Stages

Backup name

(Mandatory)

Naming convention

Reason (Mandatory in case of intermediate backups)

Backup Location

(R:\Sources\[Appname]\Schema_Backup)

(Mandatory)

Comments

1

Onboarding - After QC3


 [Schema name]_[ver]_aft_QC3_ddmmyyhhmm(24hrformat)




2

Onboarding - After QC4


[Schema name]_[ver]_aft_QC4_ddmmyyhhmm(24hrformat)




3

Onboarding - After QC5(if applicable)


[Schema name]_[ver]_aft_QC5_ddmmyyhhmm(24hrformat)



This will not be applicable if calibration is out of scope

4

Onboarding - After QC6


[Schema name]_[ver]_aft_QC6_ddmmyyhhmm(24hrformat)




5

Rescan[version] - Before Rescan kick start


[Schema name]_[ver]_bfr_rescan_ddmmyyhhmm(24hrformat)




6

Rescan[version] - Before calibration(if applicable)


[Schema name]_[ver]_bfr_calib_rescan_ddmmyyhhmm(24hrformat)




7

Rescan[version] - After final snapshot


[Schema name]_[ver]_aft_rescan_ddmmyyhhmm(24hrformat)