Using the Module Assistant
Overview
The Module Assistant generates an automated functional breakdown of your application(s) based on the wording, sentences, and topics identified in the application code, correlated with the links between the corresponding elements. Several factors can affect the outcome; keep in mind that for some applications the results may not be perfect, especially if the source code lacks understandable wording.
How does it work?
Technically, the Module Assistant will create a custom aggregation (called Automated Functional Modules) for each targeted application. This custom aggregation is generated based on keywords found in your target application: related keywords will be grouped together as modules, and links will be created between the modules based on the underlying objects within the modules. This custom aggregation can be selected by any user in the Perspective > Aggregated by section of the left-panel for each targeted application.
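To give a rough feel for what this grouping produces, here is a minimal Python sketch. The object names, keywords, and grouping rule are hypothetical illustrations, not the product's internal logic: objects sharing related keywords end up in the same module, and module-to-module links are derived from the links between their underlying objects.

```python
from collections import defaultdict

# Hypothetical keywords extracted per application object, plus
# object-to-object links coming from the analysis results.
object_keywords = {
    "InvoiceService":  {"invoice", "billing"},
    "BillingDAO":      {"billing", "payment"},
    "CustomerService": {"customer", "account"},
    "AccountRepo":     {"account"},
}
object_links = [
    ("InvoiceService", "BillingDAO"),
    ("BillingDAO", "CustomerService"),
]

# Group objects into modules by one representative keyword (here, simply
# the alphabetically first one; the real feature uses NLP and community
# detection rather than this naive rule).
modules = defaultdict(set)
for obj, keywords in object_keywords.items():
    modules[sorted(keywords)[0]].add(obj)

# Derive module-to-module links from the links between underlying objects.
module_of = {obj: name for name, objs in modules.items() for obj in objs}
module_links = {
    (module_of[a], module_of[b])
    for a, b in object_links
    if module_of[a] != module_of[b]
}

print(dict(modules))   # keyword -> objects grouped as a module
print(module_links)    # links between modules
```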
Technical details
This feature automatically analyzes and organizes your application’s functional elements (such as objects, subObjects, and relationships) into cohesive functional modules. It uses Natural Language Processing (NLP), graph-based community detection, and Neo4j relationship analysis to:
- Identify related components across your application,
- Group them into meaningful communities (modules),
- Publish them back into your CAST repository as reusable functional modules.
- Analyze only names, full names, and source code comments; the actual source code is not read.
When the feature is run, the following occurs:
- Extracts and prepares application texts
    - Fetches textual elements (descriptions, names, labels, etc.) from the application in Neo4j.
    - Cleans and normalizes text by removing noise, stopwords, and irrelevant content.
    - Applies lemmatization, translation, and linguistic filtering for consistent text analysis.
- Builds a semantic network
    - Creates a graph of relationships between sentences and key terms.
    - Identifies word co-occurrence and dependency patterns using NLP.
    - Constructs both sequence-based (word order) and composition-based (sentence structure) relationships.
- Detects functional communities (a simplified sketch of this stage follows the list)
    - Uses multiple community detection algorithms (Label Propagation, Asynchronous LPA, Louvain) to find clusters in the semantic graph.
    - Each cluster represents a functional community: a set of related concepts or objects sharing a similar context or purpose.
    - The best model is automatically selected based on community stability and size distribution.
- Builds object-level modules
    - Creates a graph of application objects linked by their logical and structural relationships.
    - Detects functional groupings (modules) among objects and calculates community strength for each.
- Publishes functional modules (a sketch of the load step also follows the list)
    - Generates CSV files representing detected modules, sentences, and communities.
    - Loads and injects these modules back into Neo4j.
    - Publishes modules as Functional modules, making them visible for further analysis and reporting.
    - Optionally creates unpublished aggregations for future refinement.
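The following sketch illustrates the semantic-graph and community-detection stages with networkx, under simplifying assumptions: the sample sentences stand in for already cleaned and lemmatized application texts, the co-occurrence graph is a crude substitute for the real semantic network, and modularity is used as a stand-in for the stability and size-distribution criteria mentioned above.

```python
import itertools

import networkx as nx
from networkx.algorithms import community

# Hypothetical "sentences" (names, comments) after stopword removal
# and lemmatization.
sentences = [
    ["create", "invoice", "customer"],
    ["send", "invoice", "payment"],
    ["update", "customer", "account"],
    ["close", "account", "payment"],
]

# Build a simple term co-occurrence graph: terms appearing in the same
# sentence are linked, with edge weights counting co-occurrences.
graph = nx.Graph()
for terms in sentences:
    for a, b in itertools.combinations(sorted(set(terms)), 2):
        weight = graph.get_edge_data(a, b, {"weight": 0})["weight"] + 1
        graph.add_edge(a, b, weight=weight)

# Try several community detection algorithms and keep the partition with
# the best modularity (a stand-in for the real stability/size criteria).
candidates = {
    "label_propagation": list(community.label_propagation_communities(graph)),
    "asyn_lpa": list(community.asyn_lpa_communities(graph, weight="weight")),
    "louvain": list(community.louvain_communities(graph, weight="weight")),
}
best_name, best_partition = max(
    candidates.items(),
    key=lambda item: community.modularity(graph, item[1]),
)

print(f"selected algorithm: {best_name}")
for i, terms in enumerate(best_partition):
    print(f"module {i}: {sorted(terms)}")
```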
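This second sketch shows the general shape of the publishing step: writing a module/object assignment to CSV and loading it back into Neo4j with the official Python driver. The node labels, property names, file layout, URI, and credentials are placeholder assumptions, not the actual CAST schema.

```python
import csv

from neo4j import GraphDatabase

# Hypothetical CSV produced by the detection step: one row per object,
# with the module (community) it was assigned to.
rows = [
    {"module": "Billing", "object_id": "1001"},
    {"module": "Billing", "object_id": "1002"},
    {"module": "Accounts", "object_id": "2001"},
]
with open("modules.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["module", "object_id"])
    writer.writeheader()
    writer.writerows(rows)

# Load the CSV and inject the modules into Neo4j. Labels, property names,
# and credentials below are placeholders, not the actual CAST schema.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
with driver.session() as session, open("modules.csv", newline="") as f:
    for row in csv.DictReader(f):
        session.run(
            "MERGE (m:FunctionalModule {name: $module}) "
            "WITH m MATCH (o:Object {id: $object_id}) "
            "MERGE (o)-[:BELONGS_TO]->(m)",
            module=row["module"],
            object_id=row["object_id"],
        )
driver.close()
```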
How do I access the feature?
Use the Customize the results option on the Application Landing page:

Then choose the Modules Assistant tab:

Are there any prerequisites?
Microsoft Windows
You must enable the Beta: Use Unicode UTF-8 for worldwide language support option as follows on the machine hosting your Viewer services:
Settings > Time & Language > Language & Region (or Language) > Administrative Language Settings > Change system locale

Linux via Docker
No prerequisites.
How do I generate the results?
Use the Generate Module option:

Use the queue icon to visualize the progress:

How do I view the results?
Use the dedicated Automated Functional Modules custom aggregation available in the left-panel:

Optional: Improve Module Names with AI (LLM)
When you provide an API key for an AI provider as described in AI Settings, CAST Imaging will automatically use a Large Language Model (LLM) to generate improved, human-friendly module names based on the detected community content (top terms and representative sentences). This is an opt-in feature that refines module names for clarity and business readability.
How it works (high level)
- For each detected community, the system builds a short summary containing:
    - Top lemmas/terms from the community
    - Representative sentences extracted from the community
- If an AI API key is configured and the LLM feature is enabled, the system sends a compact prompt containing the summary to the configured LLM endpoint.
- The LLM returns one or more candidate names for the module.
- The best candidate is attached to the community record and used as the module name when publishing into Neo4j (a sketch of this exchange follows the list).
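A minimal sketch of that exchange is shown below. The endpoint URL, payload format, model name, and response shape are placeholders for whichever provider is configured in AI Settings; only the overall flow (compact summary in, candidate name out) mirrors the description above.

```python
import requests

# Hypothetical summary built for one detected community.
summary = {
    "top_terms": ["invoice", "billing", "payment", "tax"],
    "sentences": [
        "Create invoice for customer order",
        "Compute tax and total payment amount",
    ],
}

# Compact prompt asking the model for a short, business-readable name.
prompt = (
    "Suggest a short, human-friendly functional module name for components "
    f"described by the terms {summary['top_terms']} "
    f"and the sentences {summary['sentences']}. Return only the name."
)

# Placeholder endpoint, model, and payload; the real call targets the LLM
# provider configured in AI Settings.
response = requests.post(
    "https://llm.example.com/v1/completions",
    headers={"Authorization": "Bearer <API_KEY>"},
    json={"model": "example-model", "prompt": prompt, "max_tokens": 20},
    timeout=30,
)
# Response parsing assumes an OpenAI-style completions payload.
candidate_name = response.json()["choices"][0]["text"].strip()
print(candidate_name)  # e.g. "Billing & Invoicing"
```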
Where it runs
The LLM call is made server-side from the component running Automated Functional Aggregation.