...

  1. Gaps Addressed:
    1. Check whether the open questions identified in the transaction report have been addressed by the customer
    2. Ensure that the missing software components are delivered by the application team
    3. If any gaps are still open, contact the application team for the inputs, or take an informed exception
    If yes, handover to AIA: the Expert must hand over all application-related information along with the relevant report (Rapid Discovery Report)
  2. Repackage DMT
    1. Reject the delivery in CMS if new software components are provided by the application team
    2. Include the additional software components and repackage as per the DMT packaging guidelines
    3. Best Practices for DMT:

      1. Library files for Java technology, and assemblies for .NET technology (application-specific and third-party framework), should be packaged separately. For further details, refer to http://doc.castsoftware.com/display/DOC83/DMT+-+Folders
      2. Exclude test files/folders/projects and generated code - http://doc.castsoftware.com/display/DOC83/DMT+-+Use+Regular+Expressions
      3. Check that the packaging is done as per DMT best practices
    4. Remediate DMT Packaging Alerts: https://doc.castsoftware.com/display/DOC83/How+do+I+handle+DMT+delivery+alerts
    5. Accept the delivery
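
The exclusion step above (test files/folders and generated code) can be illustrated with a small sketch. The patterns below are assumptions chosen for illustration only; the exact expression syntax DMT accepts is described in the linked "Use Regular Expressions" page.

```python
import re

# Sample exclusion patterns (hypothetical; adjust to the delivery's layout):
EXCLUDE_PATTERNS = [
    r".*/test/.*",          # test folders
    r".*/target/.*",        # build output, often generated code
    r".*Tests?\.java$",     # Java test classes
    r".*\.Designer\.cs$",   # generated .NET designer files
]

def is_excluded(path: str) -> bool:
    """Return True if the path matches any exclusion pattern."""
    return any(re.fullmatch(p, path) for p in EXCLUDE_PATTERNS)

print(is_excluded("src/test/java/FooTest.java"))  # True
print(is_excluded("src/main/java/Foo.java"))      # False
```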
  3. Modify CMS Setup, Run CMS Analysis:
    1. Follow all the steps identified as applicable in step 6 (Setup CMS, Run analysis and snapshot) of the Discover phase
    2. Set up the Environment & Analysis configuration

      1. Source code “File Adjustments” (Optional)

      2. “Preprocess” Source Code (Optional). In some situations, the source code cannot be parsed and must be preprocessed. A study should be done first to identify the preprocessing rules that must be applied to the source code. Add the preprocessor batch file as a ‘Tools Before Analysis’ entry under the Content Enrichment tab
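
The preprocessing step can be sketched as the kind of script a ‘Tools Before Analysis’ entry might invoke. The "EXEC MACRO" directive and its rewrite rule are hypothetical; the real rules come out of the preliminary study mentioned above.

```python
import re
from pathlib import Path

# Hypothetical rule: the analyzer cannot parse a custom "EXEC MACRO(...)"
# directive, so it is neutralized as a comment before analysis.
RULE = re.compile(r"^(\s*)EXEC MACRO\((.*)\)\s*$")

def preprocess_line(line: str) -> str:
    m = RULE.match(line)
    if m:
        # Preserve indentation; keep the original text recoverable.
        return f"{m.group(1)}// EXEC MACRO({m.group(2)})  /* preprocessed */\n"
    return line

def preprocess_file(path: Path) -> None:
    """Rewrite one source file in place, line by line."""
    lines = path.read_text().splitlines(keepends=True)
    path.write_text("".join(preprocess_line(l) for l in lines))
```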

    3. Verify Framework Settings

      1. Ensure that the correct versions of technologies and frameworks are selected

    4. Adding dependencies – add the dependencies between analysis units (if required)

    5. Fine-tune CMS settings

      1. Choose an Environment Profile if needed (only product-supported profiles)

      2. Configure include/classpaths according to the technology

      3. Enable User Input Security if it is within the scope of the analysis

  4. KB Inventory Validation (FAEE / FAUC):
    1. Check if all the technologies/frameworks have been analyzed properly
    2. Check if all the files have been analyzed properly (files in DMT vs. files in KB)
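
The DMT-vs-KB check reduces to a set comparison once the two file inventories have been extracted (from the delivery on one side, and from a KB query or the FAUC report on the other; the extraction itself is assumed here and the paths are invented):

```python
# Hypothetical inventory check: both inventories are plain sets of file paths.
def inventory_gaps(dmt_files: set, kb_files: set) -> dict:
    return {
        "missing_in_kb": sorted(dmt_files - kb_files),    # delivered but not analyzed
        "unexpected_in_kb": sorted(kb_files - dmt_files), # analyzed but not delivered
    }

dmt = {"src/A.java", "src/B.java", "src/C.java"}
kb = {"src/A.java", "src/B.java"}
print(inventory_gaps(dmt, kb))
# {'missing_in_kb': ['src/C.java'], 'unexpected_in_kb': []}
```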
  5. Log Validation (PRAL / FAEM):
    1. Follow the troubleshooting process to remediate warnings and errors by referring to the product documentation available at https://doc.castsoftware.com/display/DOC83/Validate+analysis+based+on+log+messages
    2. Document the log validation results: Number of alerts, type of alerts, and tickets raised (if any)
  6. Review Dynamic Links:
    1. Create application-specific DLM rules to resolve the dynamic links
    2. Guidelines to validate the dynamic links are available at https://doc.castsoftware.com/display/DOC83/Validate+Dynamic+Links
  7. Module Configuration (FAEO / FAEC):
    1. Set up the module content according to the client’s requirement
    2. If there is no recommendation, retain the Full Content
    3. Guidelines to create user-defined modules are available at https://doc.castsoftware.com/pages/viewpage.action?pageId=200409226
  8. Health Measures Configured (FAEN):

    1. Check if the health factors have been qualified properly
  9. Architecture Checker Flow:
    1. If the application team has provided the information, configure the Architecture Flows
  10. Security Configuration:
    1. Validate the security settings and results for JEE and .NET applications if it is in scope
  11. Transaction Configuration (FAEA):

    1. Entry point configuration / Data Entity configuration / End of transaction configuration

      1. Trace the transactions based on the entry points and endpoints given in the Rapid Discovery Report

      2. Validate the empty transactions (FAET report will help in the identification of missing links). Document any broken transactions and missing links in Confluence

      3. Create specific entry/endpoints for transactions that have not been identified in the Rapid Discovery Report

    2. Review Transactions

      1. Verify whether there are artifacts with high Data Element Types (DET) / Record Element Types (RET) values. If there are, check the objects with issues and remediate them. Use the FAES report to identify large SCC groups. (If the issue still exists, raise a support ticket)

      2. Check the code coverage, i.e., validate that all artefacts are covered in the transactions. Use the FAER report to validate the artefact coverage

      3. If some artefacts do not belong to any transaction, investigate whether they are in the scope of the module definition
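
The artefact-coverage check amounts to comparing the set of artefacts in scope with the set reached by at least one transaction; the FAER report provides both at scale. A minimal sketch, with all names invented for illustration:

```python
def transaction_coverage(all_artefacts: set, covered: set):
    """Return the coverage ratio and the artefacts left uncovered."""
    uncovered = sorted(all_artefacts - covered)
    ratio = len(covered & all_artefacts) / len(all_artefacts) if all_artefacts else 1.0
    return ratio, uncovered

# Hypothetical artefact names for a toy application.
artefacts = {"OrderService.place", "OrderDao.save", "AuditLogger.log"}
in_transactions = {"OrderService.place", "OrderDao.save"}
ratio, leftovers = transaction_coverage(artefacts, in_transactions)
print(round(ratio, 2), leftovers)  # 0.67 ['AuditLogger.log']
```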

  12. Transaction Completeness Report: The following reports will help you decide whether the transaction configuration is complete
    1. Use FAES to identify large SCC groups
    2. Use FAER to validate transaction completeness (it includes the smell test)
    3. Use FAET to identify missing links in empty transactions
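
A large strongly connected component (SCC) in the call graph means many artefacts call each other in a cycle, which is what inflates the transactions that FAES flags. The sketch below finds SCCs in a toy call graph with Tarjan's algorithm; the graph is invented for illustration.

```python
def tarjan_scc(graph):
    """Return the strongly connected components of a directed graph
    given as {node: [successors]} (Tarjan's algorithm, recursive)."""
    index, low, on_stack = {}, {}, set()
    stack, sccs, counter = [], [], [0]

    def visit(v):
        index[v] = low[v] = counter[0]; counter[0] += 1
        stack.append(v); on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index:
                visit(w); low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:       # v is the root of an SCC
            comp = []
            while True:
                w = stack.pop(); on_stack.discard(w); comp.append(w)
                if w == v: break
            sccs.append(sorted(comp))

    for v in graph:
        if v not in index:
            visit(v)
    return sccs

# Toy call graph: A, B, C call each other in a cycle; D only calls A.
calls = {"A": ["B"], "B": ["C"], "C": ["A"], "D": ["A"]}
print(max(tarjan_scc(calls), key=len))  # ['A', 'B', 'C']
```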
  13. Transaction Complete: Verify the FAES, FAER & FAET reports to ensure that the transactions are configured properly and are complete
    1. If empty transactions exist, follow the steps below

      1. If this is a configuration issue, go back to analysis configuration

      2. If this is a standard framework supported by CAST, raise a Support ticket

      3. Add links between objects that are not recognized by AIP out of the box only if there is an approved workaround from support and R&D (through the Reference Pattern process, Knowledge Base (KB) update queries, and supported extensions to create missing objects/links)

      4. Once the link is completed in Enlighten, compute in the Transaction Configuration Center (TCC), validate the flow, and continue with activity in Step b.

  14. Run Snapshot and Validate results:
    1. Run Baseline snapshot
    2. UI validation
      1. Launch dashboard either via CMS or deployed Web Archive (WAR)
      2. Validate that the Engineering Dashboard pages display data correctly, e.g., the Transaction-wide Risk Index (TwRI) page, Risk Indicator pages, Critical Violation pages, and Compare Version pages
      3. If there is an issue with the above step, follow the Troubleshooting Process
    3. Inventory Validation
      1. Check the Lines of Code (LoC) per module and validate against the inventory
      2. Validate the number of files, classes, tables, and artefacts per technology. If the numbers are not as expected, follow the Troubleshooting Process
      3. If this check is for maintenance, validate the variation in data from the Quick Access view
      4. If there are many unexpected added/deleted artefacts in the above step, follow the Troubleshooting Process to fix the issue. If the issue is due to folder structures, reject the delivery and restart from the DMT phase
    4. Engineering Dashboard Metrics validation:
      1. Check the number of critical violations and validate a few of them for false positives. If any false positives are found, follow the Troubleshooting Process
      2. Check the other metrics such as Technical Debt, Cyclomatic Complexity
      3. If this process is for maintenance, check the evolution
      4. Verify that the Function Point (FP) count matches the FP count in TCC
    5. Analytics Dashboard Metrics validation:
      1. Consolidate the Engineering Dashboard with the Application Analytics Dashboard (Analytics Dashboard) if the Engineering Dashboard cannot be reached from the Analytics Dashboard
      2. Validate that the numbers displayed on the Analytics Dashboard match those on the Engineering Dashboard
  15. Verify Analysis Completion Report: Use FAEN and FAER reports to ensure that the quality of analysis is good
  16. All Checks Passed: A set of checks is executed at this step to ensure the quality of the results. If any check fails, it leads to an exception review
  17. QA - Onboard phase:
    1. Review the analysis results and the entry and endpoints as defined in the transaction report, and validate the smell-test result justifications, the dashboards, and the delivery report as per the checklist
    2. The PQM must review the process followed so far and give the go-ahead to deliver the refined dashboard to the customer
  18. Consumption Report:
    1. Upon the PQM's go recommendation, deliver the refined measurement and the Consumption Report to the application team

...