V4.0
Process to Onboard an Application
- Introduction
- Process Flow
- Qualification Phase
- Acceptance Phase
- Analysis Configuration Phase
- Transaction Configuration
- Calibration and Reconciliation
- Delivery Phase
- Troubleshooting Process
- Consumption Tools
- Appendix
Introduction
This document details CAST's Field Onboarding Process along with the checklist templates used.
Pre-requisites for on-boarding:
- CAST Application Intelligence Portal (AIP) is installed
- Supported Technologies/Frameworks
- Extension to be used if required
- Application Intelligence Administrator (AIA) has access to the CAST Virtual Machine (VM)
The following topics are out of scope for this document:
- Rescan and rescan automation
- Unsupported technologies
- Clean-ups and retention
Target Audience
AIA
Application Manager
Process Flow
Related processes and checklist templates
1. Qualification Phase:
- Template - Service Engagement Sheet
2. Acceptance Phase:
- Template - Source Discovery
- Architecture Discovery Process
- Template - Architecture Discovery Review checklist
3. Analysis Phase:
- Analysis Configuration Phase:
- Template - Analysis Log Book
- Transaction Configuration Phase
- Calibration Phase:
- Reconciliation Phase:
- Template - Automated Function Point (AFP) Configuration/Calibration
4. Delivery Phase:
- Template - Dashboard Validation Page
- Template - Onboarding Release Audit checklist
Qualification Phase
As a pre-condition of a CAST AIP analysis, and to qualify the application from a technical and value perspective, CAST recommends gathering high-level technical and non-technical information about the target application. The technical qualification establishes what level of "out of the box" support CAST has for the application, identifies any show-stoppers such as the use of unsupported technologies or frameworks, and flags any potential customization that may be required to accommodate exceptions.
Official Documentation
AIP Product Documentation for Application Qualification:
- Application Qualification process overview: http://doc.castsoftware.com/display/DOC82/1.3.+Application+Qualification
Entry Criteria:
- Application identified to be on-boarded onto CAST AIP
- Application Architecture Context Diagram (See appendix)
- Existing Application Detail Design documents if any e.g. Sequence Diagrams
- Access to the Build Manager and/or Source Control Manager (SCM) details for source code, and database connection details, are available
- Application Delivery Portal (ADP) / Technical Survey to be completed (See appendix)
Process Steps:
- Kick-Off meeting
- Meet with Application Team to understand architecture of the Application
- Confirm application boundaries and validate source code completeness
- Module Definition
- Update ADP / Technical Survey
- Technology info in ADP
- Verify app team contacts
- Verify Technology Details and Questions
- Verify if the Architecture Context Diagram and Transaction Code Flow Document have been uploaded
- Front Office Qualified
- Front Office resources will validate that steps 1 and 2 above are complete.
- Front Office resources will confirm using the Delivery Manager Tool (DMT) that source code delivery is complete including all dependencies.
- Front Office resources will complete the qualification and deliver the application to Back Office for next steps
- Setup Application Intelligence Center (AIC) Portal – Registering new domains and applications in the CAST AIC Portal. This step is performed to trigger the on-boarding and first-time analysis of a new application with CAST AIP
- Service Engagement Sheet – This sheet must be completed to define the scope of the Application
- DMT Package
- If extensions are required, ensure they are installed – http://doc.castsoftware.com/display/DOC82/Managing+extensions
- Package the source code using CAST DMT to prepare the code for analysis by CAST AIP. Refer to Deliver Application Source Code for more information.
- Deliver Application Source Code: http://doc.castsoftware.com/display/DOC82/1.4.+Deliver+the+application+source+code+version
- DMT Packaging Alerts:
http://doc.castsoftware.com/display/DOC82/DMT+-+Package+Content+tab+-+Packaging+alerts
Exit Criteria:
- All details and documents received and verified
- Complete application source code and database are delivered with dependencies
Deliverable Outcomes:
Application is configured in DMT.
Best Practices for DMT:
- Library files for Java technology, and assemblies for .NET technology (application specific and 3rd party framework) should be packaged separately. For further details refer to http://doc.castsoftware.com/display/DOC82/DMT+-+Folders
- Include Discoverer in DMT plugins
- Exclude test files/folders/projects and generated code - http://doc.castsoftware.com/display/DOC82/DMT+-+Use+Regular+Expressions
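The exclusion filters above are regular expressions configured in DMT's packaging UI. Before committing to a pattern set, it can help to preview what a candidate set would drop from the delivered tree. A minimal sketch; the patterns and the drop-site path are illustrative, not CAST-mandated defaults.

```python
import os
import re

# Candidate DMT-style exclusion patterns (illustrative only; tune per project).
EXCLUSION_PATTERNS = [
    r".*[\\/]test[\\/].*",         # test folders
    r".*[\\/]target[\\/].*",       # build output (generated code)
    r".*\.(exe|so|pch|docx|xls)$"  # binaries and non-relevant extensions
]

def preview_exclusions(root: str) -> None:
    """Walk the delivered source tree and report what each pattern would exclude."""
    compiled = [re.compile(p, re.IGNORECASE) for p in EXCLUSION_PATTERNS]
    excluded, kept = [], 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if any(rx.match(path) for rx in compiled):
                excluded.append(path)
            else:
                kept += 1
    print(f"{kept} files kept, {len(excluded)} files excluded")
    for path in excluded[:20]:  # sample of what would be filtered out
        print("  excluded:", path)

if __name__ == "__main__":
    preview_exclusions(r"R:\Sources\MyApp\Source_Provided\Source")  # hypothetical drop-site path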
Acceptance Phase
This phase marks the completeness of the source code delivery from the Application team. It describes guidelines on delivering application source code, packaging source code with CAST DMT, validating alerts, and completing the source code discovery before proceeding with analysis.
Official Documentation
AIP Product Support Documentation for Application Acceptance:
- Validate and Accept the Delivery:
http://doc.castsoftware.com/display/DOC82/2.1.1.++Validate+and+Accept+the+Delivery
Entry Criteria:
Qualification is completed
Process Steps:
- DMT Alerts
- Resolve ALL DMT errors, alerts and warnings per DMT Packaging Alerts
- Any alerts related to missing code need to be documented, and reported back to application teams for code delivery improvement (http://doc.castsoftware.com/display/DOC82/Technology+Delivery+Instructions)
- Repackage the code.
- Repeat Steps a and b until all errors and warnings in the DMT log are resolved
- Source Code Accepted – Deliver the source code and set as Current Version
- Field Extension Step: Set the application Source Code Accepted Phase in ADP
- Application Discovery
- Architects review the source code and refer to the design documents provided by the application Subject Matter Expert (SME).
- Initial dry analysis is completed in this phase.
- Complete the App Discovery and produce the Application Discovery Document & Consultant Help Document. Include discovery questions, if any, in the Discovery Document to cover the Functional view of the application
- App discovery Acceptance
- Peer Reviewer reviews documents for this phase and updates the checklists
- In case of 'Good to go', the App Discovery Document is submitted to the client. If not, it will go back to Step 3a
- Receive signoff of Application Discovery Document from the client completing the initial acceptance and verify the following items:
- Entry & Exit Points
- All the relevant flows
- Additional source code delivered (if any); any changes will go to Step 3a
- Handover to AIA – Conduct a handover meeting with Tech Lead/Consultant and Solution Design (SD) Team (if required based on the engagement/project and SD's involvement) to share knowledge on the application and discuss CAST constraints
Exit Criteria:
- Receive signoff from the Application Team
- Application Discovery Document completed
- Consultant Help document completed
Deliverable Outcomes:
- Approved Application Discovery Document
- Handover to AIA
Analysis Configuration Phase
The process below provides an overview of the Application Source Code Analysis process with CAST AIP. It consists of two phases: Analysis Set-up and Analysis Execution. An optional, though strongly recommended, Analysis Automation phase may follow.
Official Documentation
AIP Product Support Documentation for Application Analysis:
- Application Analysis Process with CAST AIP: http://doc.castsoftware.com/display/DOC82/2.+Application+Analysis+Process+with+CAST+AIP
Entry Criteria:
- Availability of technical knowledge of configuration
- Application Discovery, Consultant Help Document, and Handover Document to understand application scope, technology, and extensions (if any)
- Current version should be set during the Acceptance Phase
- Source code acceptance - error/alert-free packages, with all exclusions (test code, 3rd party frameworks, etc.) completed during DMT packaging
Process Steps:
- Set-Up Environment & Analysis configuration with respect to ADP and source code discovery
- Source Code "File Adjustments" (Optional)
- "Pre-process" Source Code (Optional) - In some situations the source code cannot be parsed as delivered and must be pre-processed. A study should be done first to identify the pre-processing rules which must be applied to the source code. Add the pre-processor batch file as a 'Tools Before Analysis' job under the Content Enrichment tab (a sketch of such a job follows this list)
- Repackage DMT (Optional) if above steps are applicable – Application code must be repackaged to reflect changes
- Non-standard technology – If a technology is not supported but in scope, conduct a feasibility study to determine whether it can be supported through custom development, or exclude it from the technology boundary following the study
- If the application scope includes non-standard technology, install a supporting custom Extension as needed and import into the Assessment Model (optional)
- Verify Framework Settings
- Select correct versions of technologies and frameworks
- Setup Cast Management Studio (CMS)
- Choose an Environment Profile if needed
- Configure include/classpaths according to the technology
- Enable User Input Security if it is within the scope of the analysis
- Run CMS Analysis - http://doc.castsoftware.com/display/DOC82/CMS+-+Run+analysis+only
- Log validation – Perform analysis log validation
- Follow the troubleshooting process to remediate warnings and errors using different platforms (if any)
- Document the log validation results: Number of alerts, type of alerts, and tickets raised (if any)
- Resolving warnings and errors
- If no warnings or errors exist at the end of Analysis, proceed to Step 8
- If warnings or errors are found, return to Step 4 to correct them
- Results validation – Perform inventory validation
- If the delivered code matches the analyzed code (unless excluded because it is out of scope), proceed with the following steps a, b, and c
- Validate Dynamic Links / Dynamic Link Rules creation – Document the number of Dynamic Links.
- Parameterization – Document the number of Parameterization Steps
- Module Definition
- Set up the module content according to the client's requirement
- If no recommendation, retain the Full Content
- In case of any mismatch, check whether any paths have been missed in the analysis units. If so, the analysis configuration and analysis units must be revisited in addition to the four steps below
- Adding dependencies
- KB Update Assistant
- Ref Finder
- Return to Step 4
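For the "Pre-process" Source Code step above, the batch file registered under 'Tools Before Analysis' rewrites constructs the analyzer cannot parse. A minimal sketch of such a job, assuming a hypothetical case where a vendor directive breaks parsing; the directive, file extension, and deploy path are invented for illustration.

```python
import pathlib
import re

# Hypothetical pre-processing rule: the analyzer cannot parse a vendor
# directive such as  @@AUDIT(...)  in COBOL-like sources, so we comment it
# out before analysis. Directive and extension are illustrative only.
DIRECTIVE = re.compile(r"^\s*@@AUDIT\([^)]*\)\s*$")

def preprocess_tree(deploy_root: str, extension: str = ".cbl") -> int:
    """Rewrite files under the CMS deployment folder; return the count of changed files."""
    changed = 0
    for path in pathlib.Path(deploy_root).rglob(f"*{extension}"):
        lines = path.read_text(encoding="utf-8", errors="replace").splitlines()
        out = ["* removed-by-preprocessor" if DIRECTIVE.match(l) else l for l in lines]
        if out != lines:
            path.write_text("\n".join(out) + "\n", encoding="utf-8")
            changed += 1
    return changed

if __name__ == "__main__":
    n = preprocess_tree(r"D:\CAST\Deploy\MyApp")  # hypothetical deploy path
    print(f"pre-processed {n} file(s)")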
Exit Criteria:
- Log validation
- Inventory validation
- Post-analysis validation
Deliverable Outcomes:
- Error-free analysis results along with inventory and log validation
Best Practices:
- Add Class Paths, Include Paths, Working Folders, and Header Files
- Select the appropriate Integrated Development Environment (IDE) for each technology
- Retrieve Macro list from source code
- Use only supported versions of extensions (Alpha or Beta not supported)
Transaction Configuration
The CAST Application Intelligence Platform enables users to measure Function Points from the source code of an application. CAST AIP measures the functional size of an application using Object Management Group (OMG)-compliant Automated Function Points.
However, Function Points are a functional size measure based on the application's specifications, and they require knowledge of the intent of the application's designers (the "primary intent" mentioned in the OMG-compliant Automated Function Points Counting manual). To make the measure possible, Function Point (FP) Experts and Consultants therefore need to calibrate the initial automatic count made by CAST. Calibration includes removing technical and temporary objects from the list of counted Function Points, aggregating or splitting Function Points, and changing the type of Data or Transactional Functions.
Official Documentation
AIP Product Support Documentation for Transaction Configuration:
- How to configure TCC:
http://doc.castsoftware.com/display/DOC82/TCC+-+CAST+Transaction+Configuration+Center
Entry Criteria:
- Analysis results are validated, and a snapshot is available
- Transaction Configuration Kit is available
Process Steps:
- Deploy Transaction Kit – Standard Entry and End points will be available from the technology specific configuration kit. It is available for download from the following link (Link to download Calibration Kit)
- Entry point configuration / Data Entity configuration / End of transaction configuration
- Trace and save all the transactions (identified in Consultant Help Document) in Enlighten
Note: There should be a one-to-one mapping between the Enlighten Diagrams and the flows identified in the Consultant Help Document.
- Validate the empty transactions. Document any broken transactions and missing links in Confluence
- Create specific rules for transactions which have been identified in the Application Discovery Document but not configured in TCC through the Transaction Kit. These rules include configuration of Transaction Entry Points, End Points, and Data Entities
- Verify with the Technical Lead if the new transactions can be added
- If yes, communicate this to the Architect and get a sign off
- Document the transactions properly and provide a feedback in the standard AFP Configuration/Calibration page
- Review Transactions
- Verify if there are artifacts with high Data Element Types (DET) / Record Element Types (RET) values. If there are then check the objects with issues and remediate them. If the issue still exists, raise a support ticket
- Check Data Production Tracker (DPT)/ADP to see if there are any recommendations for changing default FP values
- Check the code coverage i.e., validate that all artifacts are covered in the transactions
- If there are a few artifacts which do not belong to any transaction, check if they are in scope of module definition
- Generate and share the list of artifacts which are part of module definition but are not covered in transaction with the Tech Lead. Take the necessary remediation steps suggested by Tech Lead
- Share the list of artifacts obtained in step e with the Architect
- If the code coverage is in an acceptable range, document the action details taken in the above steps
- Transaction Completeness
- If empty transactions exist, follow the steps below:
- If this is a configuration issue, go back to analysis configuration
- If this is a standard framework supported by CAST, raise a Support ticket
- Add links between objects that are not recognized by AIP out of the box through the Reference Pattern process, Knowledge Base (KB) update queries, and supported extensions to create missing objects/links (an illustrative sketch follows this list)
- Once the link is completed in Enlighten, compute in the Transaction Configuration Center (TCC), validate the flow, and continue with activity in Step 2
- Code coverage must be in the acceptable range
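For the link-addition step above, the supported routes are Reference Patterns, the KB Update Assistant, and extensions. Purely to convey the idea of a KB update query, here is a sketch against a PostgreSQL-hosted local schema; the table and column names are invented placeholders, not the real CAST knowledge-base schema, and the supported tooling should be preferred in practice.

```python
import psycopg2  # the CAST Storage Service is PostgreSQL-compatible

# WARNING: table and column names below are purely illustrative placeholders,
# not the real CAST knowledge-base schema. They only convey the concept of
# inserting a caller->callee link that the analyzer did not resolve.
INSERT_LINK_SQL = """
    INSERT INTO kb_links (caller_id, callee_id, link_type)   -- hypothetical table
    SELECT c.object_id, s.object_id, 'callLink'
    FROM kb_objects c, kb_objects s                           -- hypothetical table
    WHERE c.object_name = %(caller)s
      AND s.object_name = %(callee)s
"""

def add_missing_link(dsn: str, caller: str, callee: str) -> None:
    """Insert one unresolved link into the (hypothetical) KB link table."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(INSERT_LINK_SQL, {"caller": caller, "callee": callee})

if __name__ == "__main__":
    add_missing_link("dbname=myapp_local user=operator", "OrderService.submit", "SP_INSERT_ORDER")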
Exit Criteria:
- Document all the configuration and code coverage details
- Launch snapshot
Deliverable Outcomes:
- Transactions configured for the Application
Calibration and Reconciliation
The main aim of the CAST Transaction Configuration Center is to let you calibrate objects with regard to their Function Point size. In other words, you can choose how an object is interpreted by the CAST algorithm, and thus how its Function Point size is derived (or set a Function Point size manually), instead of relying on the default CAST configuration, which may not always be appropriate. This section describes how to do this using the CAST Transaction Configuration Center.
Official Documentation
AIP Product Support Documentation for Transaction Calibration:
- Transaction Calibration:
http://doc.castsoftware.com/display/DOC82/TCC+-+Calibrate
Entry Criteria:
- TCC configuration is done
- Analysis results are validated
- Standard calibration kit is used
- New snapshot has been computed
Process Steps:
- Deploy Calibration Kit – download the latest function point calibration kit for your version of AIP from the CAST Extensions link. Apply the kit as described in the documentation. This will calibrate function point counts by removing or ignoring redundant transactions, and unifying duplicates.
- Rule based Ignore / Rule based Deletion / Rule based Grouping / Rule based Splitting – manually check the transactions in TCC for further calibration opportunities (the filtering logic is sketched after this list).
- Review the Entry & End Points list to avoid duplicate and redundant transactions
- Make sure the Data and Transaction filters are not ignoring, deleting, grouping or splitting the wrong transactions. If there are such transactions, add them as exceptions by modifying the filter functions
- Rule based value / Rule based type
- Filter grouping rules based on naming, types, inheritance, and free definition
- Run a Preliminary Snapshot
- Manual Grouping / Manual Splitting / Manual type Adjustment / Manual FP Adjustment (Optional)
- If the transaction configuration requires further calibration, make the adjustments manually
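The rule-based filters in step 2 are defined inside TCC; their underlying logic is plain set selection over transaction names and types. A Python illustration of the "rule based ignore" idea with invented transaction names; the actual filters are configured in TCC itself, not in code.

```python
import re

# Illustrative transaction inventory (name -> entry-point object type).
transactions = {
    "LoginController.doLogin": "jsp",
    "TempFixController.run": "jsp",     # temporary object, should not count
    "OrderReport_BACKUP.print": "jsp",  # backup copy, duplicate of the real one
    "OrderReport.print": "jsp",
}

# Rule-based ignore: name patterns marking technical/temporary transactions.
IGNORE_RULES = [re.compile(p, re.IGNORECASE) for p in (r".*temp.*", r".*_backup.*")]

ignored = {name for name in transactions
           if any(rule.match(name) for rule in IGNORE_RULES)}
counted = set(transactions) - ignored

print("ignored:", sorted(ignored))  # excluded from the FP count
print("counted:", sorted(counted))  # remain in the calibrated count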
Exit Criteria:
- Document the configuration
- Launch a snapshot
Deliverable Outcomes:
- Transactions calibrated for the Application
Delivery Phase
The Delivery phase provides guidelines on how to run a snapshot, deploy dashboards, and hand over the results to the customers for consumption.
Official Documentation
- AIP Product Support Documentation for End User Guides (installing dashboards):
http://doc.castsoftware.com/display/DOC82/End+User+Guides
Entry Criteria:
- Function point configuration is completed
- Module definitions are properly configured in Analysis phase
- Scope and result expectations are verified
- If there are any specific credentials for the Engineering Dashboard, they should be frozen during the project phase
- Custom reports (if any) should be decided during acceptance
Process Steps:
- Run Baseline snapshot
- UI validation
- Launch dashboard either via CMS or deployed Web Archive (WAR)
- Validate if Engineering Dashboard pages are displaying data correctly e.g., Transaction-wide Risk Index (TwRI) page, Risk Indicator pages, Critical Violation pages, Compare Version pages
- If there is any issue with the above step, follow the Troubleshooting Process
- Inventory Validation
- Check the Lines of Code (LoC) per module and validate against the inventory
- Validate the number of files, classes, tables, and artifacts per technology. If the numbers are not as expected, follow the Troubleshooting Process
- If this check is for maintenance, validate the variation in data from Quick Access view
- If there are a lot of unexpected added/deleted artifacts in the above step, follow the Troubleshooting Process to fix the issue. If the issue is due to folder structures, reject the delivery and start again from the DMT phase
- Engineering Dashboard Metrics validation:
- Check the number of critical violations and validate a few of them to check for false positives. If there are any false positives, follow the Troubleshooting Process
- Check the other metrics such as Technical Debt, Cyclomatic Complexity
- If this process is for maintenance, check the evolution
- The FP count must match the FP count in TCC
- Analytics Dashboard Metrics validation:
- Consolidate the Engineering Dashboard with the Application Analytics Dashboard (Analytics Dashboard) if the Engineering Dashboard cannot be navigated to from the Analytics Dashboard
- Validate that the numbers displayed on the Analytics Dashboard match those on the Engineering Dashboard
- Delivery Report Generation
- Delivery reports must be prepared once all the validations are completed
- Prepare and document custom reports in Confluence (if any)
- Maintenance documentation
- This document groups together all miscellaneous backup, optimization, and maintenance documentation related to the CAST Storage Service
- Automation
- This page describes how to install and use the CAST Analysis Automation Application Programming Interface (API) to automate the process of analysis with CAST AIP
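As an illustration of what such automation enables, an orchestration wrapper might look like the sketch below. The endpoint, payload, and job states are hypothetical placeholders, not the documented API; refer to the Analysis Automation API documentation for the actual contract.

```python
import time
import requests

BASE = "http://aic-server:8080/automation"  # hypothetical endpoint; see the official API docs

def run_snapshot(app: str, version: str) -> None:
    """Trigger an analysis+snapshot job and poll until it finishes (illustrative flow only)."""
    # Payload shape is a placeholder; the real Automation API defines its own contract.
    job = requests.post(f"{BASE}/jobs", json={"application": app, "version": version},
                        timeout=30).json()
    while True:
        state = requests.get(f"{BASE}/jobs/{job['id']}", timeout=30).json()["state"]
        if state in ("completed", "failed"):
            break
        time.sleep(60)  # analyses are long-running; poll once a minute
    if state == "failed":
        raise RuntimeError(f"snapshot for {app} {version} failed; check CMS logs")

if __name__ == "__main__":
    run_snapshot("MyApp", "V1 - 2017-06-01")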
Exit Criteria:
- All completed validations must be properly documented and the reports must be uploaded to Confluence
Deliverable Outcomes:
- Application onboarding completed.
Troubleshooting Process
Entry Criteria:
- Errors/warning/alerts are encountered in the CAST tools (DMT, CMS, Enlighten, TCC)
- Global Application Intelligence Center (GAIC) troubleshooting guide
Process Steps:
- When an error or warning is encountered:
- Refer to the GAIC troubleshooting guide for CAST tools which gives information about:
- Which errors/warnings/alerts can be ignored?
- Remediation, resolution, and workaround details
- Refer to Technical Knowledge Base (TKB) for known issues and limitations
- Refer to Zendesk for similar issues faced
- Raise a Support ticket
- Consult Product Management (PMT) or the Research and Development (R&D) team
- Consult SME if recommended by Support or PMT
Exit Criteria:
- Verify if the above steps have solved the issue
Deliverable Outcomes:
- All errors, warnings and alerts are remediated
Best Practices:
- Refer to the Zendesk for Solution Delivery best practices
Consumption Tools
Engineering Dashboard
The CAST Application Engineering Dashboard (AED) is the technical dashboard used to review application analysis results. The Engineering Dashboard lets you drill down from violations to the violating object list and view the non-compliant source code. Here are more details about the Application Engineering Dashboard:
http://doc.castsoftware.com/display/DOC82/CAST+Application+Engineering+Dashboard+-+CAST+AED
Analytics Dashboard
The CAST Application Analytics Dashboard (AAD) is the core AIP tool for consuming portfolio-level analytical information. Target users of AAD are portfolio management and application management groups. AAD provides analytical, trending, and consolidated information for all applications in the portfolio. Here is the cookbook for the various functionalities AAD offers:
http://doc.castsoftware.com/display/DOC82/CAST+Application+Analytics+Dashboard+-+CAST+AAD
Report Generator
The CAST Report Generator is a standalone solution for automatic generation of reports based on CAST AIP results. The solution provides the opportunity (for example) to prepare and automate assessments for each new version of application analyzed with CAST AIP. Documents are based on Microsoft Office templates and they can be modified to prepare a specific template to meet a particular use case or to comply with a company format. After generation, the resulting document can be further adapted if necessary.
The Report Generator is based on the CAST RestAPI, meaning that CAST AI Administrators and project managers alike can use the tool. Here is the link for more information on the CAST Report Generator:
http://doc.castsoftware.com/display/DOCCOM/CAST+Report+Generator
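Since the Report Generator sits on the CAST RestAPI, the same data can also be pulled programmatically. A minimal sketch; the deployment URL and resource path are assumptions for illustration, so check the RestAPI documentation for the actual routes.

```python
import requests

DASHBOARD = "http://dashboard-server:8080/CAST-AAD"  # hypothetical deployment URL

def list_applications(user: str, password: str):
    """Query the dashboard RestAPI for the applications it serves (illustrative path)."""
    resp = requests.get(f"{DASHBOARD}/rest/AAD/applications",  # path is an assumption
                        auth=(user, password),
                        headers={"Accept": "application/json"},
                        timeout=30)
    resp.raise_for_status()
    return [app.get("name") for app in resp.json()]

if __name__ == "__main__":
    print(list_applications("operator", "secret"))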
Advanced Analytics Report (AAR)
Overview
Target audience - Business Analysts and Key Users who are interested in using CAST analytics as part of advanced data analysis reports that may blend CAST metrics with their own data.
Summary - This document provides installation and deployment instructions for the back-end solution, along with the configuration of the automated data feed and a core set of standardized views. Report templates based on this solution may be available as separate field extensions.
Key Benefits
Enabling a new way to consume CAST analytics - Organizations can use their existing talent base and tools to rapidly discover new business insights from CAST analytics, and build ad-hoc dashboards and reports that better cater to the specific needs and goals of IT management and business users. Create blended reports that correlate and link metrics from internal ERP, financial, and production monitoring systems to investigate data, expose correlations and interdependencies, and inform fact-based decisions.
Efficiency with easy governance - No need for unnecessary ETL cycles and schema maintenance activities, while still ensuring governance through easy-to-deploy granular access controls. Even the most complex AAR views can be developed and deployed to users without complex or lengthy data preparation, regardless of input data structural complexity and organization.
DATAPOND - built upon REST to create/update data tables in a repeatable manner
Use Cases
- Self-service raw data exploration: You can explore and analyze CAST raw data sets of any complexity as they are produced, using familiar Business Intelligence (BI) tools. You can now use SQL to natively query and manipulate complex/semi-structured CAST Rest API JSON data. Blend Rest API queries with traditional CAST storage and dashboard services SQL queries to gain new insights and more effectively determine what is useful and what is not, without spending lots of IT cycles.
- Business intelligence on CAST analytics: Specialized ANSI SQL extensions allow for instant flattening and native querying of even the most complex nested data. You can perform secure, concurrent, low-latency SQL analytics on your IT data. You can do instant joins across CAST and your own data to create blended data reports and dashboards that highlight correlations and interdependencies between CAST analytics and your own business and operational metrics.
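As a concrete illustration of these use cases, Apache Drill (on which AAR is built, per the Description below) exposes a REST endpoint that accepts ANSI SQL and returns JSON rows, which is how such blended queries can be issued without ETL. A small sketch; the storage plugin and file path in the query are illustrative.

```python
import requests

DRILL = "http://drill-server:8047"  # Drill's default web/REST port

def drill_query(sql: str):
    """Submit SQL to Apache Drill's REST API and return the result rows."""
    resp = requests.post(f"{DRILL}/query.json",
                         json={"queryType": "SQL", "query": sql},
                         timeout=120)
    resp.raise_for_status()
    return resp.json()["rows"]

if __name__ == "__main__":
    # Flatten a CAST Rest API JSON extract stored as a file; plugin/path are illustrative.
    rows = drill_query("""
        SELECT t.application.`name` AS app, t.result.`value` AS score
        FROM dfs.`/data/cast/rest_extract.json` AS t
        LIMIT 10
    """)
    for row in rows:
        print(row)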
Description
The AAR solution is built upon Apache Drill and is fundamentally a query engine that supports a variety of NoSQL databases and file systems, including HBase, MongoDB, MapR-DB, HDFS, MapR-FS, Amazon S3, Azure Blob Storage, Google Cloud Storage, Swift, NAS and local files. A single query can join data from multiple datastores. For example, you can join a Rest API JSON response to text file output, to a PSQL query against a Central Schema, and to a user profile collection in MongoDB.
Leveraging Drill's JSON data model, AAR supports queries on complex/nested data as well as on rapidly evolving structures. AAR/Drill provides intuitive extensions to SQL so that you can easily query complex data.
AAR query results are virtual datasets (views) that can be mapped into BI-friendly structures, which users can explore and visualize using their tool of choice. Business users, analysts and data scientists can use standard BI/analytics tools such as Tableau, Qlik, MicroStrategy, Spotfire, SAS and Excel to interact with CAST Rest API JSON responses and/or SQL responses (and any non-CAST non-relational datastores) by leveraging Drill's JDBC and ODBC drivers.
A standard set of core views that combine selected CAST outputs is deployed and configured as part of the initial installation, ready to use in any of the supported data visualization solutions.
Cast2Jira
The CAST AIP Action Plan to Jira Jenkins plugin is designed to allow the user to export the contents of the CAST Engineering Portal (CEP) Action Plan to Jira as Jira tickets. The plugin connects to the CAST database, extracts action items based on their priority, and uses the Jira REST API to create the tickets.
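The plugin's flow, reading prioritized action-plan items from the CAST database and creating one Jira issue per item, can be pictured as in the sketch below. The Jira create-issue call is the standard REST API v2 endpoint; the action-plan SQL and schema names are invented placeholders, not the plugin's actual implementation.

```python
import psycopg2
import requests

JIRA = "https://jira.example.com"
# Placeholder query: the real plugin reads the CEP action plan from the CAST
# central database; table and column names here are illustrative only.
ACTION_ITEMS_SQL = "SELECT rule_name, object_name, priority FROM action_plan_items ORDER BY priority"

def export_action_plan(dsn: str, project_key: str, auth) -> None:
    """Create one Jira ticket per action-plan item (illustrative schema)."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(ACTION_ITEMS_SQL)
        for rule, obj, priority in cur.fetchall():
            issue = {"fields": {
                "project": {"key": project_key},
                "summary": f"[CAST] {rule} on {obj}",
                "description": f"Action plan item, priority {priority}",
                "issuetype": {"name": "Task"},
            }}
            # Standard Jira REST API v2 create-issue endpoint.
            requests.post(f"{JIRA}/rest/api/2/issue", json=issue,
                          auth=auth, timeout=30).raise_for_status()

if __name__ == "__main__":
    export_action_plan("dbname=cast_central user=operator", "APP", ("jira-user", "token"))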
Below is the documentation to configure and use the plugin
Appendix
Template – Qualification Phase
Service Engagement Onboarding Template
Service engagement Sheet - V1.3
# | Service engagement question |
---|---|
1 | Documents: Technical Survey; Others if available |
2 | Offshore deadline (specify the date) |
3 | Client presentation deadline (specify the date) |
4 | Split by Version – specify versions and sequence |
5 | Client constraints and background - support for estimation: Explain constraints & background; PS project for booking |
6 | Primary regional contact for front office validation and source code extraction support |
7 | Do you want us to publish the results or to send the dumps via File Transfer Protocol (FTP)? |
8 | Source code: Location; Current account details |
9 | Analysis type: Pilot / Initial analysis service / Recurring analysis service |
11 | Cast AIP version |
12 | Aggregation type |
13 | Do we need to enable Security Flow Analyzer (Y/N)? Specify sanitation methods if available |
14 | Do we need to enable escalated links (Y/N) |
15 | User Designed Module (UDM) design (mark your choice and provide the basis): Technology split / Functional Domain split / Other, please specify |
16 | Transaction configuration: Required Y/N |
17 | Snapshot labeling: Default will be the execution date |
18 | Extra Extra Large (XXL) information: Required Y/N |
19 | CAST Architecture Implementation: Required Y/N |
20 | Diagnostics update required: Specify what needs to be changed |
21 | Specific FO validation points that are important to know |
22 | Comments/other inputs |
2. Service Engagement Onboarding template
3. Technical Survey
Latest version is available at Technical Survey#Technical_Survey_LIGHT
Template – Acceptance Phase
Source discovery
Source code comparison checklist with drop site – The below checks must be performed on all delivered source files provided by the Application Team before configuring DMT packages
# | Check | OK? | Justification for Exceptions |
---|---|---|---|
1 | Match the provided source code at the drop site with the Technical Survey and architecture document (look for the associated technology file extensions) | | |
2 | Map the high-level modules mentioned in the architecture document with the provided source (one way is to look for matching folder names) | | |
3 | Cross check the framework-related files (e.g. Struts) mentioned in the Technical Survey with the provided source | | |
4 | Folders with release names or branch names (like Trunk, dev, head, svn) should not be part of the source delivery. If they exist, cross check the actual structure of the source code with the Client (if required) or remove/rename before packaging | | |
5 | Folder names must not contain the version number, as this will cause added/deleted objects in rescans. Folders should be renamed without version numbers, and both folder structure and naming conventions must be maintained in all future rescans | | |
6 | In case of duplicate code (e.g. if the project name is different but the code is the same, or if there are any backup folders), verify findings with the Client. Duplicate code should be removed after client confirmation | | |
7 | Take a backup of the drop site source code at: Source: R:\Sources\[App Name]\[Version]Source_Provided\Source; DB: R:\Sources\[App Name]\[Version]Source_Provided\DB | | |
8 | Verify that project files are delivered (e.g. csproj in .NET, project files or .pom files in Java) | | |
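Checks 4 and 5 above call for renaming folders so that branch and version decorations do not cause spurious added/deleted objects on rescans. A small sketch of such a rename pass; the version-suffix pattern is an assumption to adapt to the actual delivery.

```python
import pathlib
import re

# Assumed decoration pattern: trailing version numbers such as "billing-v2.3"
# or "core_1.0.4". Adjust to the actual delivery's naming before use.
VERSION_SUFFIX = re.compile(r"[-_]v?\d+(\.\d+)*$", re.IGNORECASE)

def strip_version_suffixes(root: str) -> None:
    """Rename folders bottom-up so 'billing-v2.3' becomes 'billing' (stable across rescans)."""
    folders = sorted(pathlib.Path(root).rglob("*"), key=lambda p: -len(p.parts))
    for folder in folders:
        if folder.is_dir() and VERSION_SUFFIX.search(folder.name):
            target = folder.with_name(VERSION_SUFFIX.sub("", folder.name))
            if not target.exists():  # never clobber an existing sibling
                folder.rename(target)

if __name__ == "__main__":
    strip_version_suffixes(r"R:\Sources\MyApp\Source_Provided\Source")  # drop-site backup path from check 7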
2. Source code comparison checklist with DMT Package – Below checks must be performed during DMT packaging
# | Check | OK? | Justification for Exceptions |
1 | Cross check if the package for Drop Site root folder (containing the source) is selected. (DMT should be able to discover all technologies) | ||
2 | If SVN/TFS/Git: make sure only the source folder is selected and no release names or branch names (like Trunk, dev, head, svn). Also, cross check credentials for SVN/TFS/GIT | | |
3 | Database check: If it is an offline extraction, check that all the schemas are present as mentioned in the Technical Survey. If it is a live connection, select all schemas in DMT as mentioned in the Technical Survey | | |
Onboarding Checklist
Ensure that the DMT source packaging has been configured correctly as per the technologies in the Application. For further details on how source code is delivered in DMT for each technology, refer to http://doc.castsoftware.com/display/DOC82/Technology+Delivery+Instructions
# | Check | Tools/ References | Result |
---|---|---|---|
1 | List of technologies as compared to technical survey or ADP | | |
2 | Ensure all code not declared but delivered is reported in the App discovery document | | |
3 | Ensure all code declared but not delivered is reported in the App discovery document | | |
4 | Are any technologies unsupported in the version of CAST being used? List them in Results if found; reject the application if the technology breaks transactions, or inform the client that that portion of the code will not be covered | Refer to the supported technology section in Documentation | |
5 | Are all the DMT alerts remediated? | | |
6 | Are DMT filters added to exclude test code? | | |
7 | Are DMT filters added to exclude non-relevant file extensions? | Refer to Best practices for the relevant technology (e.g. remove .exe, .so, .pch in C/C++) / Remove non-relevant extensions such as .docx, .xls, ... | |
8 | Are DMT filters added to exclude application-team-provided exclusions? | Refer ADP | |
9 | Are the files without extensions justified? | Refer to the package review tab in DMT for the count of such files | |
10 | Are archive files (war/ear/tar/zip) delivered through DMT? | Refer to Best practices for the relevant technology (unzip, look for source/config xml files, check with client, redeliver) | |
11 | For the XXL diagnostics use case: check if *.sqltablesize files are delivered | Refer to the kick-off slide/service engagement sheet | |
12 | Check with the Application team how Materialized views are used in the application (if any) | |
Additional Checks for DMT | |||
---|---|---|---|
13 | Identification of middleware and reporting frameworks | Mention the usage of these specific 3rd party modules to the AIA to configure such entities correctly. An extension can be leveraged to cover 3rd party elements such as BRMS, SOA, JasperReport, BIRT, etc. | |
14 | Identification of 3rd party JavaScript libraries | List the standard files of third party JavaScript libraries. Mention the presence of 3rd party JavaScript libraries to the AIA and AIA should ensure that these files are analyzed but excluded from the modules. | |
15 | Search the delivered code to discover some expected technology, and then confirm if those technologies have been identified during the App Discovery phase. | Execute the smell tests. Example: For any Mainframe application, if database has not been delivered, make sure to check the delivered source code for DCLGEN statements. If DCLGEN statement is present, it implies that the application uses a database which has not been delivered. And the next step would be to reach out to the Application team for the database. | |
16 | Has initial analysis been completed with the packaged source code? | Initial Analysis (Out of the box analysis) - After setting the current version, install the required Technology/Framework extension, Configuration/Calibration Kits and launch Analysis +snapshot. |
3. DMT report
Attach the report generated by DMT which lists file counts and total size per extension. The information from DMT can be used to fill the below table
Extension | File Count | Size | Technology | Should be analyzed? (Yes/No) | Justification if No |
4. Technology & Framework Analysis Approach
Following approaches can be implemented to configure an application for analysis:
- CAST AIP - Analysis supported by out of the box CAST analyzers
- Product Extension - Analysis using extensions listed at http://doc.castsoftware.com/display/DOCEXT/CAST+AIP+Extensions+Documentation
- Field Extension - Analysis using non-standard technologies. Refer to extensions listed at CAST Extend
- Custom configuration - Custom extension, Environment Profile built depending on specific application
- Technology not supported for analysis
Supported Technologies can be found at http://doc.castsoftware.com/display/DOC82/Supported+Technologies
5. Source Code Validation Summary
# | Check Point | Trigram (Role) | Validation Result (Accepted / Rejected) | Date Updated |
1 | Are all the above checks completed and is the source code ready to be accepted for analysis? | <In case of rejection, email needs to be sent to FO> |
6. Architecture Discovery Process
This section is detailed in the presentation below (Architects_User_Document.pdf).
7. Architecture Discovery Review checklist
Quality check to ensure the completeness and correctness of the Architecture Context Document
Transaction Flows | Yes/No/NA | Comments | ||
1 | Check if all the technologies mentioned in Technical Survey are referred in the Application Architecture Overview | <Do not leave this blank> | ||
2 | Check if all the file types delivered as part of code base are covered in Application Discovery | <Do not leave this blank> | ||
3 | Check if all the un-analyzed file types are explicitly mentioned in the application discovery | <Do not leave this blank> | ||
4 | Check if the Application Context Diagram/Transaction flows given by the client corresponds to the source delivered through DMT | <Do not leave this blank> | ||
5 | Check if the frameworks identified in the application are referred in the Application Architecture Overview | <Do not leave this blank> | ||
6 | Check for the validity of any new transaction flow discovered by the Architect | <Do not leave this blank> | ||
7 | Check if the diagrams adhere to the standards described for the project | <Do not leave this blank> | ||
8 | All unique flows should have corresponding sample in the Consultant Help Document | <Do not leave this blank> | ||
Entry Point/Exit Points | ||||
9 | Check that standard Entry/Exit Points for the identified frameworks are included in the App Discovery. | <Do not leave this blank> | |
10 | Check if all the Entry/Exit Points identified in the Transaction Flows are included in the App Discovery | <Do not leave this blank> | ||
11 | Check for the validity of new Entry/Exit Points discovered by the Architect | <Do not leave this blank> | ||
12 | The Entry/Exit points should be well defined and clear to consultant/Lead | <Do not leave this blank> | ||
Assumptions and Open Questions | ||||
13 | Check if the Assumptions and Open Questions are valid and can be understood by the client | <Do not leave this blank> | ||
14 | Check if all the open questions are assigned an assumption | <Do not leave this blank> | ||
Reviewed By | ||||
Approved (Yes/No) | Date | |||
< Trigram > |
Template – Analysis Configuration Phase
Analysis Log Book
1. Open Issues/Questions
This section contains questions/issues from the Source discovery phase until the analysis is completed. The AIA working on the analysis is responsible for logging the questions and updating their status whenever there is a resolution/response.
Risk in below table can be one of the following:
- Blocking - Something we cannot proceed with
- Critical - We can move but will impact the analysis results
- Low - Can be ignored - little impact on results but needs confirmation
# | Issue /Question | Reporting Date | Risk | Impact | Action On | Resolution | Resolution Date | Status |
---|---|---|---|---|---|---|---|---|
1 | ||||||||
2 |
2. Application Discovery Handover
For application architecture details, refer to the Application Discovery document, Consultant Help Document, and Client approval email in the AIC tracking page for the application
# | Checklist | Date |
---|---|---|
1 | Template - Architecture Discovery Hand Over Meeting Checklist | |
3. Pre-analysis Checklist
Below mandatory checks should be performed before and after an analysis for all versions. Once checks are done, state status and add comments when needed.
# | Checkpoints | Checked(Yes/No/NA) | Comment |
---|---|---|---|
1 | License from the customer to be applied. | ||
2 | Correct Application name and version to be added. DMT version date should be source code delivery date. | ||
3 | Check the log file, LTSA and LISA location settings inside CAST MS preferences / platform settings. | |
4 | Automation Framework (AOP-Jenkins): Ensure the application is configured in AOP. | <Check with Lead/DM for details> | |
6 | Make sure all the discovered files and discovered Technologies/frameworks are configured in CMS. | |
7 | All frameworks should be identified, and versions should match the code. | ||
8 | All the required extensions as per app discovery are installed and configured. | ||
9 | Check technology specific settings like classpath/working folder for J2EE, include path for C++ and working folder for Mainframe jobs are configured | ||
10 | Add dependencies for frameworks/technologies | ||
11 | Add default dynamic link rules | production tab | |
12 | Add required content enrichment jobs as per discovery document | ||
13 | User input security to be activated and configured [black boxing, sanitization method, entry point methods] only when there is a requirement from Client | ||
14 | If XXL table info has been delivered, make sure the server name as well as the schema name match the schema analyzed. | |
15 | Architecture rules should be defined as per the inputs from architects | ||
16 | Default module configuration to be applied unless there is a specific requirement from the client. |
4. CMS setup validation (QC2)
S.No | Check | Comments | Assessment |
---|---|---|---|
1 | CMS preferences (log/LISA/LTSA/Delivery/Deploy) are aligned with the best practices and project-specific environment settings | | |
2 | All the files and extensions retrieved by DMT are referenced in CMS | (Proceed or rework) | |
3 | All the frameworks with right versions and technologies defined in the App discovery document are configured in CMS | ||
4 | Any CAST extension required to analyze an application is installed | ||
5 | Apply technology content enrichment jobs, dependencies, DLM rules, Architecture rules | ||
6 | Confirm if the server configurations (Hard disk space, RAM, Free CPU) are sufficient for the volume of the code being analyzed | ||
7 | Verify the System Name, application name, version, label in the analysis settings in CMS. |
* Analysis can be started only when QC2 is passed.
5. Assessment Model Settings (based on customer requirement)
# | Check | Answer |
1 | What is the consolidation type of assessment model? | |
2 | Any requirement to disable the metrics? If Yes, please specify what all metrics are disabled. | |
3 | Any requirement to change diagnostic parameter/weight/threshold? If Yes, please specify the changes. | |
4 | Any requirement of creating any tech criteria/ health factor metrics? If Yes, please specify | |
5 | Any requirement to change in the critical flag? If Yes, please specify the changes | |
6 | If any of the Extensions used for the app contains quality rules, check that the quality rules are present in the application assessment model |
6. Analysis Details
# | Source Code Preparation | Details / Screen Shots | Comments / Details |
---|---|---|---|
1 | Database Code - any custom change in the Instance name or schema name (provide the details of the Instance and schema names configured and how it is taken care of in the automated re-scans) | | |
2 | Code - What organizational steps are taken to convert delivered source code into analyzable source code? Is it compatible with automated re-scans? | Pre-process the code to an analyzable format | |
3 | Code - Test code removal actions for all layers including database. Should be done in DMT to handle automated re-scans | | |
4 | Code - Any pre-processing done on the source code should be documented and added as a pre-analysis job in CMS | | |
 | Current version (Analysis Units) | Custom/User defined analysis units | |
5 | Create relevant analysis units if they are not created by DMT | | |
 | Analysis Tab | | |
6 | Verify/Add extensions for the files which are not covered by the default setup | | |
7 | Ensure the following are configured: Working Folder/Classpath/Include path as appropriate | | |
8 | Create/import required custom environment profiles | | |
 | Dependencies | | |
9 | Verify/Add dependencies for analysis units and Reference patterns/Refine targets | | |
 | Production Tab | | |
10 | Configure additional DLM Rules to validate/invalidate dynamic links as required (provide details in Section 8 below) | | |
 | User Input Security (only if required) | | |
11 | Set up the user input security parameters [user input methods, Blackboxing, target methods] | | |
 | Content Enrichment | Explain the relevance of the below jobs with Enlighten Diagrams | |
12 | "Tools before Analysis" jobs | Add explanation for all the jobs | |
13 | "Tools after analysis" jobs | Add explanation for all SQL jobs/UI here | |
14 | Import all relevant enrichment jobs | Any SQL queries for marking artifacts as external? | |
15 | UDM Configurations | Recommended to use the default module configuration unless there is a specific requirement | |
16 | Import all relevant SQL jobs for post module generation | | |
 | Analysis Execution | | |
17 | Were any crashes/errors/critical warnings observed during analysis? If yes, what action was taken to resolve them? | | |
7. Log Validation
In case of multiple runs, please make sure to update the last run details
Rationale: The CAST Analyzer reports warnings and errors through the generated .castlog files. However, log files can be tedious to read. Log manager inserts all entries into a table, so instead of reading the logs we can count the most important messages in that table. A difference in the counts between V2 and V1 (previous version) is an indication of the validity of the analysis. If the source code size (LoC) increased a little but the count of error messages increased a lot, there is a gap that must be investigated and is worth investing time on. If the increases in source code size and in the number of error messages are in line, that is a waiver to bypass precise checks, saving time during the reanalysis.
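The V1/V2 comparison described above is easy to script. A sketch that tallies message counts straight from the .castlog files, assuming severity keywords appear on each line; in practice the Log manager table can be queried instead.

```python
import pathlib
from collections import Counter

SEVERITIES = ("ERROR", "WARNING")  # assumed keywords on .castlog lines

def count_messages(log_dir: str) -> Counter:
    """Tally error/warning lines across all .castlog files under log_dir."""
    counts = Counter()
    for log in pathlib.Path(log_dir).rglob("*.castlog"):
        for line in log.read_text(errors="replace").splitlines():
            for sev in SEVERITIES:
                if sev in line:
                    counts[sev] += 1
                    break
    return counts

if __name__ == "__main__":
    v1, v2 = count_messages(r"D:\CAST\Logs\V1"), count_messages(r"D:\CAST\Logs\V2")
    for sev in SEVERITIES:
        print(f"{sev}: V1={v1[sev]} V2={v2[sev]} delta={v2[sev] - v1[sev]}")
    # A large jump in errors against a small LoC increase is the gap to investigate.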
Summary of Warnings/Errors
# | Name of Error/Warning | Root Cause | Action taken | Reported to error documentation library/TGU (Yes / No) |
---|---|---|---|---|
1 | ||||
2 |
8. DLM Validation
In case of multiple runs, please make sure to update the last run details. Use Analysis Operation Portal (AOP) to obtain the DLM report to verify DLMs
Topic | Answer (Yes/No) | Justification Comments | |
---|---|---|---|
1 | Do you see a lot of links in unexpected caller and callee combinations? Example: Calls from eFile to Database? | |
2 | If there is a database, is there a case of no DLM from the application to the database? You should investigate if this is the case. | |
3 | DLM parameterization and DLM rules details. |
9. Boundary Definition
In case of multiple runs, please make sure to update the last run details
Boundary Actions | Action Details | |
---|---|---|
1 | Was any exclusion implemented? If yes, add the list of exclusions made. | |
2 | Provide the list of schema/DB analyzed in the application | |
3 | Are all the files discovered by DMT analyzed and are all covered in module(s)? If not, please provide the justification. | |
4 | Provide the details of module definition and list here. |
10. Analysis results validation (QC3)
Check Point | Checked (Yes/No) | Validation Result (OK/Actions Assigned) | Date updated | |
---|---|---|---|---|
1 | All files discovered by DMT are analyzed and they are part of the module(s). If not, check if proper justification is provided. Follow Consistency Review checks. | |||
2 | All technologies and frameworks mentioned by the architects in discovery document are properly setup in CMS and corresponding results are validated | |||
3 | Review the links created by customization. Make sure no irrelevant links are created between technologies/framework | |||
4 | DLMs have been reviewed and parameterization or rules have been added | |||
5 | Analysis Configuration has been validated by Lead to make sure it is healthy and all the checks in the page are confirmed and documented | |||
6 | Module content validation: All artifacts are a part of module content. If some artifacts/files are missing, provide justification | |||
7 | All database subsets have been removed from the analysis | |||
8 | Analysis execution validation and Automation Validation completed. This includes checking log files and validating automation to ensure execution has gone through without issues. | |
9 | Consolidation level is set as per the service engagement sheet. If not specified, it should be the product default. | |
10 | Logs validation has been completed. |
11. Backup
Action | status | Comments/Details | |
---|---|---|---|
1 | Take a backup of the local, central and mngt schemas and update the details in section "Backup Details (Manual)" | Use the below naming convention for the backup filename/comments: [Schema name]_[ver]_aft_QC3_ddmmyyhhmm (24hr format). Note: once onboarding is completed, make sure this backup is cleaned up. For Pilot projects: don't delete the backup. Backup Location: R:\Sources\[Appname]\Schema_Backup |
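The backup naming convention in the table is mechanical, so it can be generated rather than typed by hand. A small helper following the convention stated above; the schema name and version are examples.

```python
from datetime import datetime

def backup_name(schema: str, version: str, stage: str = "aft_QC3") -> str:
    """Build a backup filename per the convention [Schema name]_[ver]_<stage>_ddmmyyhhmm (24h)."""
    stamp = datetime.now().strftime("%d%m%y%H%M")  # ddmmyyhhmm, 24-hour clock
    return f"{schema}_{version}_{stage}_{stamp}"

# Example output: myapp_local_V1_aft_QC3_0106171430
print(backup_name("myapp_local", "V1"))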
Template – AFP Configuration/Calibration Phase
1. Entry Points

# | Type of entry point | Object Type | Description* |
---|---|---|---|
1 | Ex: jsp | | Why this choice? |

*For standard Entry points, mention as "Standard"
2. End Points / Data Function

# | Type of endpoint | Object Type | DET (8.0 and above) | RET (Only applicable for Data Function) | Contribution | Description |
---|---|---|---|---|---|---|
1 | Ex: CFT file | DataFunction/Exit point – what is its role in the application? | | | | Why this choice? |
3. Specific Links added / Content Enrichment
# | type | Name | Details here (Explain if the pattern is a standard one and can be used as an extension) | Link to standard Script/Query used (if any) |
1 | Update Cast Knowledge Base | |||
2 | Reference Finder | |||
3 | Universal Importer |
4. Transaction sample
# | Name | Enlighten Print | Flow Pattern ID | Enlighten file name |
1 | Ex: JSP to Database | <Enlighten screen shot> | Flow_1_JSP_TO_DB | jsp_to_db |
2 | Ex: JSP to Database via procedure | | Flow_2_JSP_TO_DB | jsp_to_proc_to_db |
*Order, Flow Pattern ID, and Objects in Enlighten Print should match with Architect document (Consultant help document)
5. Analysis Execution
S.No. | Check Point | Details / Screen Shots | Comments / Details |
1 | Were there any large SCC groups identified in the analysis results? Have they been handled? Check the DssEngine*log.txt file which is generated during the snapshot for the application and look for the relevant section. Also, high DET & FTR values for transactions are an indication of the existence of SCC groups. Refer to the troubleshooting guide in the official documentation for steps to resolve this condition | Provide the steps that were taken to remove the condition |
6. Smell tests
Refer to the documentation to get details of the smell tests (https://doc.castsoftware.com/display/FBP/Offline+Smell+Test+-+OST):
Smell Test | Test Result (Red/Amber/Green) | Details and justification |
---|---|---|
Artifact Coverage Ratio | ||
Empty Transaction Ratio | ||
TFP: DFP Ratio | ||
Artifact transaction count to Program count ratio |
7. Transaction Completeness check
Run the queries at Transactional Function Completeness page and attach the collected data.
Sl. No | Data points to collect | File Attachments | Comments |
---|---|---|---|
1 | Excluded Data Entities | ||
2 | Deleted or ignored Transactional Functions | ||
3 | Deleted or ignored Data Functions | ||
4 | List of Data Functions | ||
5 | List of Transactional Functions | ||
6 | List of Transaction Entry Points that have been merged | ||
7 | List of Transaction Functions that have been adjusted | ||
8 | List of empty transaction Functions not ignored or deleted | Report containing list of empty transactions and their justifications | |
9 | List of Data Entity candidates | ||
10 | List of Data Functions that have been merged | ||
11 | List of Data Functions that have been adjusted | ||
12 | List of ignored or deleted Data Functions |
8. TCC configuration/results validation (QC4)
# | Checkpoint | Checked | Lead Validation | Date | Checked By | Architect Validation | Date | Checked By |
1 | Entry points, End points and Data functions mentioned in App Discovery have been configured in TCC | <Lead Trigram> | <Architect Trigram> | |||||
2 | Are the additional entry/end point configurations valid? | <Lead Trigram> | <Architect Trigram> | |||||
3 | Are the transactions with 0 FP count justified? | <Lead Trigram> | <Architect Trigram> | |||||
4 | Are transaction samples of all unique flows identified in the App discovery documented in section 6 above | <Lead Trigram> | <Architect Trigram> | |||||
5 | Smell tests are run and the results are justified. | <Lead Trigram> | <Architect Trigram> | |||||
6 | Big potatoes have been identified and addressed. | <Lead Trigram> |
9. Backup
Action | Status | Comments/Details | |
---|---|---|---|
1 | Take a backup of the local, central and mngt schemas and update the details in section "Backup Details (Manual)" | Use the below naming convention for the backup filename/comments: [Schema name]_[ver]_aft_QC4_ddmmyyhhmm (24hr format). Note: once onboarding is done, make sure this backup is cleaned up. For Pilot projects: don't delete the backup. Backup Location: R:\Sources\[Appname]\Schema_Backup |
10. Calibration Summary
If a standard calibration kit (i.e. data and transaction filter SQL query) is used, note the version used or the location from which it was picked. Please inform the respective owner (PMT/SME) when a customization is done on the kits
Standard Kit used | Details / Version of the kit | Customization applied | ||
---|---|---|---|---|
1 | Calibration Kit | |||
2 | Configuration Kit | |||
3 | Analysis configuration kit (environment profile + advanced JCL) | | | |
Config Check | Details/Explanations | |||
1 | List of manually Suppressed Transactions | |||
2 | List of manually Suppressed Data Functions | |||
3 | List of manually grouped Transactions | |||
4 | List of manually grouped Data Functions | |||
5 | Details of changes done in Standard calibration/configuration Kit |
11. Calibration Validation (QC5)
# | Check Point | Checked (Yes/No) | Assessment | Lead Trigram |
1 | Are the calibration parameters acceptable? | | | |
Template – Dashboard Validation Page
1. Inventory Validation
If App Compare is not used, retrieve the number of files received from the DMT Package Content tab; otherwise use a query to retrieve both details
File extensions | No. of files received | No. of files analyzed by CAST | Justify difference |
---|---|---|---|
Extension 1 | |||
Extension 2 |
2. Dashboard Validation - Validation Summary
# | Validation Criteria | How | Validation Status | CAST Comments |
1 | Check and inspect critical diagnostics | Please use the dashboard validation Excel to identify whether all critical violation results are correct and there are no false positives. It's a detailed check; you should validate 10-15% of artifacts and add your remarks here. | Not Started |
2 | Check scope | Check the files provided with analyzed files visible in the dashboard | Not Started | |
3 | Check Module content | Check whether all User Defined Modules and specific inclusions / exclusions by customer have been implemented in the dashboard | Not Started | |
4 | Check advanced features (If requested by customer) | Check presence of XXL rules or CWE rules (Security Data Flow) as part of Quality Model if requested by the customer. | Not Started | |
5 | Check availability of all dashboards | Check the below pages to confirm the Engineering Dashboard has been configured correctly: | Not Started |
6 | Check for dashboard login credentials | Ensure LDAP logins are working in case LDAP is configured | Not Started | |
7 | Are the number and labels of snapshots in line with the snapshot retention policy (removal of old snapshots and consolidation)? | Use the Quick Access page to validate the version number and name of the snapshot. Make sure all User Defined Modules appear on the quick access page for that snapshot | Not Started |
8 | Function Point in TCC and Dashboard | Match the count of Data and Transaction functions in TCC against the count in Dashboard. If numbers are not the same, please fix and generate another snapshot | Not Started | |
9 | Customization results | If any customization (Assessment Model changes, Architecture Rules, new quality results) has been done for the application, the results in the dashboard should be validated. Refer "Analysis Execution" under section 3 of analysis log book for list of customized settings | Not Started |
3. Delivery Reports
Attach all reports to be sent to the customer
Date Generated | Delivery report name | Attachment |
Preliminary Findings | ||
4. Dashboard Validation Checklist (QC6)
# | Checkpoint | Checked (Yes/No) | Validation Result (OK/Actions Assigned) | Date updated |
1 | Source code discovery report & code validation | Compare the inventory between the source code discovery report and the Dashboard. Check the initial email from the application team/FO and make sure the scope is covered. | |
2 | Diagnostic validation sheet | Ensure the Diag Validation is completed by the AIA who worked on the application. | ||
3 | Screenshots of TCC and Dashboard for FP | | |
4 | Engineering Dashboard and Analytics Dashboard validation | | |
5 | Confluence Updates | Check if all the Confluence pages are updated | ||
6 | CMS Settings | Check if all the jobs are active | ||
7 | Delivery Report | Verify all the delivery reports attached in the previous section |
5. House Keeping Actions
# | Action | Status |
---|---|---|
1 | Please complete the below housekeeping actions and update the status: Use the below naming convention for the schema backup filename/comments: [Schema name]_[ver]_aft_QC6_ddmmyyhhmm (24hr format). Schema Backup Location: R:\Sources\[Appname]\Schema_Backup. Logs and Source code Archive location: I:\Implementation\Archive\[Appname]\[VER] (Source Analyzed, DB Analyzed, Logs) | |
2 | Any new UAs or plugins developed or enhanced? If so, publish to the Field Extension library | |
Template – Onboarding release audit checklist
1. Audit Details

# | Auditor | Date | Referenced | Time taken | Auditee | Overall Feedback |
---|---|---|---|---|---|---|
1 | <Auditor Name> | <Date of Audit> | <What references you used to conduct this Audit> | <Time taken> | <Auditee Name> | <Go / No Go> |
2. GAIC Quality Checklist - Onboarding
# | Quality Assurance Checks | Reference | Assessment (Y/N/NA) | Comment |
1 | Is the Confluence tracking page up to date for this delivery and latest templates used? | Tracking page | ||
2 | Is Service Engagement Sheet / ADP available and all the details filled? Check Version number, Label, Date and Module split. | Check Service engagement sheet in DPT / ADP | ||
3 | Was the technical survey form / ADP provided by FO/ Customer? | Check Service engagement sheet in DPT | ||
4 | Has the source code been accepted by the architect using DMT? | Source code discovery | |
5 | Was the App discovery reviewed by a peer and did the review pass? (QC1) | Architecture review checklist | |
6 | Was the App discovery shared with the customer/FO and accepted by the customer? Deemed accepted in case the customer has not responded within the mentioned time frame? | Mail to customer | |
7 | Was a handover meeting conducted between the Consultant, TL and Architect? In case of CIs/Pilots, was solution design involved? | Meeting invite | |
8 | CMS setup validation passed? (QC2) | Analysis log book | |
9 | Analysis results Validation passed before proceeding to configuration? (QC3) | Analysis log book | ||
10 | Were any issues encountered in the analysis? Were all the open questions and issues closed before analysis? | Issues / Analysis log book | |
11 | TCC configuration/results validation checks are done by lead and architect? (QC4) | AFP Configuration page | ||
12 | Enlighten diagrams created to display key transactions and mapped to flow pattern? | AFP Configuration page | ||
13 | Calibration Validation done by lead before generating a snapshot? | AFP Configuration page | ||
14 | Did the AI Admin complete the dashboard validation using the standard checklists and applicable tools, and was the same validated by the Lead? (QC6) | Dashboard Validation page | |
15 | Are all Zendesk tickets closed? If any are open, have they been recorded in the limitations? | Sec 5 of AIC tracking page | |
 | Ensure that a backup is available and update the details in section "Backup Details (Manual)". In case calibration is applicable, ensure that a backup is taken before calibration is done. | Section "Backup Details (Manual)" | |
16 | Has the Analytics Dashboard been updated? Is the link between the Analytics Dashboard and the Engineering Dashboard (using the microscope icon) working? Check for the Engineering Dashboard in versions 8.0 and above | Analytics Dashboard, Engineering Dashboard | |
17 | Is the FP count in sync between the delivery report, the dashboard and TCC? | TCC, Engineering Dashboard, Delivery report | |
18 | Is the source code visible on the dashboard? | Engineering Dashboard | ||
19 | Are you seeing any unexpected technologies while navigating on the dashboard and looking at "module LOC by technology"? | Engineering Dashboard | ||
20 | Have you checked that the TwRI figures show up on the transaction view on the dashboard? | Engineering Dashboard | ||
21 | Do the User Input Security Diagnostics appear on dashboard (if applicable)? Check for CWE related rules? | Engineering Dashboard | ||
22 | Was the standard template for the Delivery Report sent to the onshore team? | Refer Delivery Report Onboarding | |
23 | Have all measurement limitations been described on the confluence page and on the delivery report? Look for open Zendesk / assumptions in App discovery document? | Delivery report | ||
24 | Is the delivery report complete and does it contain all the information, not missing any sections? In case of AT&T, are Empowerment reports available? a) AFP baseline b) Health check report c) List of rules | Empowerment reports | |
25 | DLM Filters have been used? | CMS | ||
26 | Are all big potatoes addressed? | TCC | ||
27 | Reuse done? | Process / Automation / Knowledge |
Backup Details (Manual)
# | Backup Stages | Backup name (Mandatory) | Naming convention | Reason(Mandatory in case of intermediate backups) | Backup Location (R:\Sources\[Appname]\Schema_Backup) (Mandatory) | Comments |
1 | Onboarding - After QC3 | [Schema name]_[ver]_aft_QC3_ddmmyyhhmm(24hrformat) | ||||
2 | Onboarding - After QC4 | [Schema name]_[ver]_aft_QC4_ddmmyyhhmm(24hrformat) | ||||
3 | Onboarding - After QC5(if applicable) | [Schema name]_[ver]_aft_QC5_ddmmyyhhmm(24hrformat) | this will not be applicable if Calibration is out of scope | |||
4 | Onboarding - After QC6 | [Schema name]_[ver]_aft_QC6_ddmmyyhhmm(24hrformat) | ||||
5 | Rescan[version] - Before Rescan kick start | [Schema name]_[ver]_bfr_rescan_ddmmyyhhmm(24hrformat) | ||||
6 | Rescan[version] - Before calibration(if applicable) | [Schema name]_[ver]_bfr_calib_rescan_ddmmyyhhmm(24hrformat) | ||||
7 | Rescan[version] - After final snapshot | [Schema name]_[ver]_aft_rescan_ddmmyyhhmm(24hrformat) |