This documentation is not maintained. Please refer to doc.castsoftware.com/technologies to find the latest updates.

Summary: This document provides basic information about the extension providing Node.js and Express support for Web applications.

Extension ID

com.castsoftware.nodejs

What's new?

Please see Node.js - 2.8 - Release Notes for more information.

Description

This extension provides support for Node.js. Node.js is a JavaScript runtime built on Chrome's V8 JavaScript engine. Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient.

CAST recommends using this extension with HTML5 and JavaScript ≥ 2.0.0 for the best results.

In what situation should you install this extension?

Regarding Front-End to Back-End connections, we support the following cross-technology stacks:

  • iOS Front-End connected to Node.js/PostgreSQL Back-End
  • iOS Front-End connected to Node.js/MSSQL Back-End
  • AngularJS Front-End connected to Node.js/MongoDB Back-End

If your Web application contains Node.js source code and you want to view these object types and their links with other objects, then you should install this extension:

  • creates a Node.js application object when an instance has been found
  • creates Node.js operations which represent entry-points of web services

Express framework


The following declarations will create a Node.js Get Operation:

app.get('/login', function (req, res) {
    "use strict";
    console.log('login ' + req.url);
    console.log('login ' + req.query.pseudo);
    var currentSession = getSessionId(req, res);
    datab.userExists(currentSession, req.query.pseudo, res, cbLogin);
});

and this one will create a Node.js Service Operation:

var admin = express();

app.use('/admin', admin);

Hapi.js framework


Create a server - index.js:

const Hapi = require('hapi');

// Create Server
const server = new Hapi.Server();

Routes: create a route for the server:

server.route([     
	{
        method: 'GET',
        path: '/api/directors/{id}',
        handler: api.directors.get,
        config: {
            tags: ['api'],
            description: 'Get one director by id',
            notes: 'Get one director by id',
            validate: {
                params: {
                    id: Joi.number().required()
                }
            },
            cors: {
                origin: ['*']
            }
        }
    }
]);

Sails.js framework


Create a server: app.js.

...
  // Start server
  sails.lift(rc('sails'));
...


Routes control at config/routes.js:

...
'GET /site/:idSite' : {controller: "Site", action: "getSite", rel: RelServices.REL_ENUM.GET_VIEWED_SITE},
...
'PUT /alert' : {controller: "Alert", action: "putAlert", rel: RelServices.REL_ENUM.PUT_ALERT, profile: ProfileServices.PROFILE_ENUM.OPERER},
...

Controller actions:

SiteController.js
...
self.getSite = function (req, res) {
  ...
  var promise = Site.findOne({
    idSite: idSite
  });
  ...
};


AlertController.js
...
self.putAlert = function (req, res) {
  ...
  var promise = Alert.findOne({
    alertId: alertId
  });
  ...
};

Model definition:

Site.js:
...
self.connection = 'postgresqlServer';

self.tableName = 'T_SITE';


self.attributes = {
...
}
...
Alert.js
...
self.connection = 'postgresqlServer';

self.tableName = 'T_ALERT';


self.attributes = {
...
}
...

Transaction from get operation method to database when using SQL analyzer:

Loopback framework


Create a webservice from the Express API

The LoopBack app extends and supports Express middleware, so a webservice can be declared through the Express API:

var loopback = require('loopback');
var app = loopback();

// Create get method
app.get('/', function(req,res){
  res.send('hello world');
});

app.listen(3000);

Create a webservice from a model

Model todo.js:

module.exports = function(Todo) {
  Todo.stats = function(filter, cb) {
  ...
  };
  ...
  Todo.remoteMethod('stats', {
    accepts: {arg: 'filter', type: 'object'},
    returns: {arg: 'stats', type: 'object'},
    http: { path: '/stats' }
  }, Todo.stats);
  ...
}

Exposing models over REST: https://loopback.io/doc/en/lb3/Exposing-models-over-REST.html. LoopBack models automatically have a standard set of HTTP endpoints that provide REST APIs.
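For instance, for the Todo model below, LoopBack would expose standard REST endpoints along these lines (illustrative, not exhaustive):

GET    /api/Todos        find all instances
POST   /api/Todos        create a new instance
GET    /api/Todos/{id}   find an instance by id
PUT    /api/Todos/{id}   update an instance's attributes
DELETE /api/Todos/{id}   delete an instance by id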

Example: todo.json:

{
  "name": "Todo",
  "base": "PersistedModel",
  "strict": "throw",
  "persisteUndefinedAsNull": true,
  "trackChanges": true,
  "properties": {
    "id": {
      "id": true,
      "type": "string",
      "defaultFn": "guid"
    },
    "title": "string",
    "completed": {
      "type": "boolean",
      "default": false
    },
    "created": {
      "type": "number"
    }
  }
}

 

Koa.js framework


Webservice application from Koa:

var koa = require('koa'),
  router = require('koa-router'),
  cors = require('koa-cors'),
  json = require('koa-json'),
  errorHandler = require('koa-onerror'),
  bodyParser = require('koa-body')(),
  app = koa(),
  routes = new router();


function render(controller, action) {
	...
}

/* routes start */

routes.get(  '/todos',                     render('todos',     'all'));
routes.post( '/todos',        bodyParser,  render('todos',     'create'));
routes.get(  '/todos/:id',                 render('todos',     'show'));
routes.del(  '/todos/:id',                 render('todos',     'delete'));
routes.patch('/todos/:id',    bodyParser,  render('todos',     'update'));
routes.del(  '/todos',                     render('todos',     'deleteAll'));


app.use(require('./app/middlewares/request_logger')());
app.use(json());
app.use(cors({methods: ['GET', 'PUT', 'POST', 'PATCH', 'DELETE']}));
app.use(routes.middleware());

errorHandler(app);

app.listen(Number(process.env.PORT || 9000));

Knex.js framework


Knex.js is a "batteries included" SQL query builder for Postgres, MSSQL, MySQL, MariaDB, SQLite3, Oracle, and Amazon Redshift designed to be flexible, portable, and fun to use. We do not support the creation of tables for this framework. Example:

Define database config:

const Config = require('../config');

module.exports = {
  client: 'postgresql',
  connection: Config.DATABASE_URL || {
    database: Config.DB_NAME,
    host: Config.DB_HOST,
    username: Config.DB_USER,
    password: Config.DB_PASSWORD
  }
};

Create bookshelf from Knex and Bookshelf:

const DatabaseConfig = require('../../db');

const Bookshelf = require('bookshelf');
const Knex      = require('knex')(DatabaseConfig);

module.exports = Bookshelf(Knex);


Add model for bookshelf:

const Bookshelf = require('../util/bookshelf');

const Config = require('../../config');

module.exports = Bookshelf.Model.extend({
  tableName: 'todos',
  url: function () {
    return `${Config.DOMAIN}/${this.get('id')}`;
  },
  serialize: function () {
    return {
      id: this.get('id'),
      title: this.get('title'),
      url: this.url(),
      completed: this.get('completed'),
      order: this.get('order'),
      object: 'todo'
    };
  }
});

Define method for model:

const Todo = require('../../models/todo');

exports.deleteAll = () => {
  // hack to get around Bookshelf's lacking destroyAll
  return new Todo().where('id', '!=', 0).destroy()
  .then(() => []);
};

Access model from webservice method:

exports.register = (server, options, next) => {

  server.route([{
    method: 'DELETE',
    path: '/',
    config: {
      handler: (request, reply) => {
        reply(Controller.deleteAll());
      }
    }
  }]);

  next();
};

If the table is not resolved via external links, an unknown database table object will be created:

Node.js MQTT


Controller.js defines a publisher that sends a message:

function openGarageDoor () {
  // can only open door if we're connected to mqtt and door isn't already open
  if (connected && garageState !== 'open') {
    // Ask the door to open
    client.publish('garage/open', 'true')
  }
}


function closeGarageDoor () {
  // can only close door if we're connected to mqtt and door isn't already closed
  if (connected && garageState !== 'closed') {
    // Ask the door to close
    client.publish('garage/close', 'true')
  }
}

garage.js defines a subscriber as:

client.on('connect', () => {
  client.subscribe('garage/open')
  client.subscribe('garage/close')

  // Inform controllers that garage is connected
  client.publish('garage/connected', 'true')
  sendStateUpdate()
})


client.on('message', (topic, message) => {
  console.log('received message %s %s', topic, message)
  switch (topic) {
    case 'garage/open':
      return handleOpenRequest(message)
    case 'garage/close':
      return handleCloseRequest(message)
  }
})

Node.js Seneca Microservice


Create a service:

web-app.js:

var seneca = require('seneca')()

seneca
  .use('user')
  .use('auth')
  .use('../lib/api.js')
  .client({port:10202,pin:{role:'offer',cmd:'*'}})
  .client({port:10201,pin:{role:'user',cmd:'*'}})


var app = express()

app.use( bodyParser.json() )
app.use( seneca.export('web') )
app.use( express.static('./public') )

app.listen(3000)

offer-service:

require('seneca')()
  .use('../lib/offer')
  .listen(10202)
  .ready(function(){
    this.act({role:'offer',cmd:'provide'},console.log)
  })

Define the api.js:

module.exports = function( options ) {
  var seneca = this
  var plugin = 'api'

  seneca.add( {role:plugin, end:'offer'}, end_offer )

  function end_offer( args, done ) {
    var user = args.req$.seneca.user || {}

    this.act('role:offer,cmd:provide',{nick:user.nick},done)
  }

  seneca.act({role:'web', use:{
    prefix:'/api/',
    pin:{role:plugin,end:'*'},
    map:{
      'offer': { GET:true }
    }
  }})

  return {name:plugin};
}

offer.js:

module.exports = function( options ) {
  var seneca = this
  var plugin = 'offer'


  seneca.add( {role:plugin, cmd:'provide'}, cmd_provide)
  

  function cmd_provide( args, done ) {
    if( args.nick ) return done(null,{product:'Apple'});

    return done(null,{product:'Orange'});
  }


  return {name:plugin};
}

When a service sends an action (seneca.act()):


Webservice RestAPI:

...

    seneca.act('role:web',{use:{
      prefix:'/product',
      pin:'role:api,product:*',
      startware: verify_token,
      map:{
        star: { 
          GET:true,
          alias:'/:id/star' 
        },
        handle_star:{
          PUT:true,
          DELETE:true,
          POST:true,
          alias:'/:id/star'
        }
      }
...


Supported Node.js versions

Version | Support | Comment
v0.x    | (error) | No longer supported
v4.x    | (tick)  | LTS
v5.x    | (tick)  | Based on JavaScript ES6
v6.x    | (tick)  | Based on JavaScript ES6
v7.x    | (tick)  | Based on JavaScript ES6
v8.x    | (tick)  |
v9.x    | (tick)  |
v10.x   | (tick)  |
v11.x   | (tick)  |
v12.x   | (tick)  |
v13.x   | (tick)  |
v14.x   | (tick)  |
v15.x   | (tick)  |
v16.x   | (tick)  |
v17.x   | (tick)  |

Node.js Ecosystem

Node.js comes with numerous libraries and frameworks providing data access, web service calls, and microservice architectures. The following table lists the supported libraries:

Library | Comment | Data Access | Web Service | Messaging
AWS.DynamoDB | Amazon database access | (tick) | |
AWS.S3 | Amazon storage service | (tick) | |
AWS.SQS | Amazon messaging service | | | (tick)
AWS.Lambda | Amazon routing solution | | (tick) |
Azure blobs | Azure storage service | (tick) | |
Azure Service Bus | Azure Queue Service | | | (tick)
CosmosDB | Microsoft Azure NoSQL Database solution | (tick) | |
Couchdb | Couchdb access | (tick) | |
Couchdb-nano | Couchdb access | (tick) | |
elasticsearch | Open-source search engine | (tick) | |
Express | Node.js application framework | | (tick) |
Hapi | Node.js application framework | (tick) | (tick) |
Knex | Node.js SQL query builder | (tick) | |
Koa | Node.js application framework | | (tick) |
Loopback | Node.js application framework | (tick) | (tick) |
Marklogic | Marklogic access | (tick) | |
Memcached | Storage framework | (tick) | |
Node-mongodb-native | MongoDB access | (tick) | |
Mongo-client | MongoDB access | (tick) | |
Mongoose | MongoDB access | (tick) | |
MQTT | Messaging library | | | (tick)
mssql | SQL server access | (tick) | |
my_connection | MySQL access | (tick) | |
myssql | Node.js module to manipulate MySQL database | | |
Node-couchdb | Couchdb access | (tick) | |
node-sqlserver | SQL server access | (tick) | |
oracledb | Oracle Database access | (tick) | |
pg | PostgreSQL access | (tick) | |
redis | Redis access | (tick) | |
Sails | Node.js application framework | (tick) | (tick) |
Seneca | Microservice toolkit | | (tick) |

Function Point, Quality and Sizing support

This extension provides the following support:

  • Function Points (transactions): a green tick indicates that OMG Function Point counting and Transaction Risk Index are supported
  • Quality and Sizing: a green tick indicates that CAST can measure size and that a minimum set of Quality Rules exist
Function Points (transactions) | (tick)
Quality and Sizing | (tick)

Comparison with existing support for JavaScript

CAST AIP has provided support for analyzing JavaScript via its JEE and .NET analyzers (provided out of the box in CAST AIP) for some time now. The HTML5/JavaScript extension (on which the Node.js extension depends) also provides support for JavaScript, but with a focus on web applications. CAST highly recommends that you use this extension if your Application contains JavaScript, and more specifically if you want to analyze a web application; however, you should take note of the following:

  • You should ensure that you configure the extension to NOT analyze the back end web client part of a .NET or JEE application.
  • You should ensure that you configure the extension to ONLY analyze the front end web application built with the HTML5/JavaScript that communicates with the back end web client part of a .NET or JEE application.
  • If the back end web client part of a .NET or JEE application is analyzed with the Node.js extension and with the native .NET/JEE analyzers, then your results will reflect this - there will be duplicate objects and links (i.e. from the analyzer and from the extension) therefore impacting results and creating erroneous Function Point data.

In CAST AIP 8.3.x support for analyzing JavaScript has been withdrawn from the JEE and .NET analyzers.

AIP Core compatibility

This extension is compatible with:

AIP Core release | Supported
8.3.x | (tick)

Supported DBMS servers

DBMS | Supported?
CSS / PostgreSQL | (tick)

Prerequisites

(tick) An installation of any compatible release of AIP Core (see table above)

Dependencies with other extensions

Some CAST extensions require the presence of other CAST extensions in order to function correctly. The Node.js extension requires that the following other CAST extensions are also installed:

Note that when using the CAST Extension Downloader to download the extension and the Manage Extensions interface in CAST Server Manager to install the extension, any dependent extensions are automatically downloaded and installed for you. You do not need to do anything.

Download and installation instructions

The extension will be automatically downloaded and installed in CAST Console. You can manage the extension using the Application - Extensions interface:

Packaging, delivering and analyzing your source code

Once the extension is downloaded and installed, you can now package your source code and run an analysis. The process of packaging, delivering and analyzing your source code is described below:


Packaging and delivery

Note that the Node.js extension does not contain any CAST Delivery Manager Tool discoverers or extractors, therefore, no "Node.js" projects will be detected. However, the Web Files Discoverer extension will be automatically installed (it is a "shipped" extension which means it is delivered with AIP Core) and will automatically detect projects as HTML5 if specific files are delivered, therefore ensuring that Analysis Units are created for your source code.

Using CAST Console

Using CAST Management Studio

  • create a new Version
  • create a new Package for your Node.js source code using the Files on your file system option:

  • Define the root folder of your Application source code:

  • Run the Package action
  • Before delivering the source code, check the packaging results:
Without the Web Files Discoverer

If you are not using the Web Files Discoverer, the following will occur:

  • the CAST Delivery Manager Tool will not find any "projects" related to the Node.js application source code - this is the expected behaviour. However, if your Node.js related source code is part of a larger application (for example a JEE application), then other projects may be found during the package action (click to enlarge):

With the Web Files Discoverer

If you are using the Web Files Discoverer, the following will occur:

  • the CAST Delivery Manager Tool will automatically detect "HTML5 file projects" (see Web Files Discoverer for more technical information about how the discoverer works) related to the Node.js application source code. In addition, if your Node.js related source code is part of a larger application (for example a JEE application), then other projects may also be found during the package action (click to enlarge):

  • Deliver the Version

Analyzing

Using CAST Console

AIP Console exposes the technology configuration options once a version has been accepted/imported, or an analysis has been run. Click Universal Technology (3) in the Config (1) > Analysis (2) tab to display the available options for your Node.js source code:

Then choose the relevant Analysis Unit (1) to view the configuration:

Using the CAST Management Studio



  • Accept and deploy the Version in the CAST Management Studio.
Without the Web Files Discoverer

If you are not using the Web Files Discoverer, the following will occur:

  • No Analysis Units will be created automatically relating to the Node.js source code - this is the expected behaviour. However, if your Node.js related source code is part of a larger application (for example a JEE application), then other Analysis Units may be created automatically:

  • In the Current Version tab, add a new Analysis Unit specifically for your Node.js source code, selecting the Add new Universal Analysis Unit option:

  • Edit the new Analysis Unit and configure in the Source Settings tab:
    • a name for the Analysis Unit
    • ensure you tick the HTML5/JavaScript option (the Node.js extension depends on the HTML5 and JavaScript extension - and therefore the Universal Analyzer language for the Node.js extension is set as HTML5/JavaScript)
    • define the location of the deployed Node.js source code (the CAST Management Studio will locate this automatically in the Deployment folder):

  • Run a test analysis on the Analysis Unit before you generate a new snapshot.
With the Web Files Discoverer

If you are using the Web Files Discoverer, the following will occur:

  • "HTML5" Analysis Units will be created automatically (see Web Files Discoverer for more technical information about how the discoverer works) related to the Node.js application source code. In addition, if your Node.js related source code is part of a larger application (for example a JEE application), then other Analysis Units may also be created:

  • There is nothing further to do, you can now run a test analysis on the Analysis Unit before you generate a new snapshot.

Analysis warning and error messages


Message ID | Message Type | Logged during | Impact | Remediation Action
NODEJS-001 | Warning | Analysis | An internal issue occurred when parsing a statement in a file; part of the file was badly analyzed. | Contact CAST Technical Support

What results can you expect?

Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):

Node.js application with MongoDB data storage exposing web services

Objects

The following specific objects are displayed in CAST Enlighten:

  • Node.js Application
  • Node.js Port
  • Node.js Delete Operation Service
  • Node.js Get Operation Service / Node.js Post Operation Service / Node.js Put Operation Service
  • Node.js Service
  • Node.js Express Use
  • Node.js Express Controller
  • Node.js Get Http Request Service
  • Node.js Post Http Request Service
  • Node.js Put Http Request Service
  • Node.js Delete Http Request Service
  • Node.js Unknown Database
  • Node.js Collection
  • Node.js Memcached Connection
  • Node.js Memcached Value
  • Node.js Call to Java Program
  • Node.js Call to Generic Program
  • Node.js Restify Get Operation
  • Node.js Restify Post Operation
  • Node.js Restify Put Operation
  • Node.js Restify Delete Operation
  • Node.js AWS SQS Publisher / Node.js AWS SNS Publisher / Node.js Azure Service Bus Publisher
  • Node.js AWS SQS Receiver / Node.js AWS SNS Subscriber / Node.js Azure Service Bus Receiver
  • Node.js AWS SQS Unknown Publisher / Node.js AWS SNS Unknown Publisher / Node.js Azure Unknown Service Bus Publisher
  • Node.js AWS SQS Unknown Receiver / Node.js AWS SNS Unknown Subscriber / Node.js Azure Unknown Service Bus Receiver
  • Node.js AWS call to Lambda Function
  • Node.js AWS call to unknown Lambda Function
  • Node.js S3 Bucket / Node.js Azure Blob Container
  • Node.js S3 Unknown Bucket / Node.js Azure Unknown Blob Container
  • NodeJS Unknown Database Table

External link behavior

Behaviour is different depending on the version of CAST AIP you are using the extension with:

  • From 7.3.6, SQL queries are sent to the external links exactly like standard CAST AIP analyzers.
  • From 7.3.4 and before 7.3.6, a degraded mode takes place: the Node.js extension analyzes the FROM clause to retrieve table names, then sends only the table names to external links (see the sketch after this list).
  • For all versions, if no links are found via external links, unresolved objects are created (with type CAST_NodeJS_Unknown_Database_Table).
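For instance, from the following query only the table name departments would be retrieved and sent to external links in degraded mode (illustrative snippet, reusing the oracledb connection from the examples below):

// Degraded mode (from 7.3.4 and before 7.3.6): only "departments", parsed from
// the FROM clause, is sent to external links rather than the full query.
var query = "SELECT department_id, department_name FROM departments WHERE department_id < 70";
connection.execute(query, function (err, result) { /* ... */ });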

Connector per RDBMS Vendor

Oracle "oracledb" connector

Connector "oracledb"
var oracledb = require('oracledb');
oracledb.getConnection(
  {
    user          : "hr",
    password      : "welcome",
    connectString : "localhost/XE"
  },
  function(err, connection)
  {
    if (err) { console.error(err); return; }
    connection.execute(
      "SELECT department_id, department_name FROM departments WHERE department_id < 70",
      function(err, result)
      {
        if (err) { console.error(err); return; }
        console.log(result.rows);
      }
    );
  }
);

MS SQL "node-sqlserver" and "mssql" connectors

Connector "node-sqlserver"
var sql = require('node-sqlserver');
//
var connStr = "Driver={SQL Server Native Client 11.0};Server=myySqlDb,1433;Database=DB;UID=Henry;PWD=cat;";
var query = "SELECT * FROM GAData WHERE TestID = 17";
sql.open(connStr, function(err,conn){
    if(err){
        return console.error("Could not connect to sql: ", err);
    }
    conn.queryRaw("SELECT TOP 10 FirstName, LastName FROM authors", function (err, results) {
        if (err) {
            console.log("Error running query!");
            return;
        }
        for (var i = 0; i < results.rows.length; i++) {
            console.log("FirstName: " + results.rows[i][0] + " LastName: " + results.rows[i][1]);
        }
    });
});
var match = "%crombie%";
sql.query(connStr, "SELECT FirstName, LastName FROM titles WHERE LastName LIKE ?", [match], function (err, results) { 
    for (var i = 0; i < results.length; i++) {
        console.log("FirstName: " + results[i].FirstName + " LastName: " + results[i].LastName);
    }
});
Connector "mssql"
var sql = require('mssql');
var config = {
    user: '...',
    password: '...',
    server: 'localhost', // You can use 'localhost\\instance' to connect to named instance 
    database: '...',
     
    options: {
        encrypt: true // Use this if you're on Windows Azure 
    }
}
  
var connection = new sql.Connection(config, function(err) {
    // ... error checks 
     
    // Query 
     
    var request = new sql.Request(connection); // or: var request = connection.request(); 
    request.query('select * from authors', function(err, recordset) {
        // ... error checks 
         
        console.dir(recordset);
    });
     
    // Stored Procedure 
     
    var request = new sql.Request(connection);
    request.input('input_parameter', sql.Int, 10);
    request.output('output_parameter', sql.VarChar(50));
    request.execute('procedure_name', function(err, recordsets, returnValue) {
        // ... error checks 
         
        console.dir(recordsets);
    });
     
});

PostgreSQL "pg" connector

Connector "pg"
var pg = require("pg");
var conString = "pg://operator:CastAIP@localhost:2280/postgres";
var client = new pg.Client(conString);
client.connect();
var querySchemas = client.query("select nspname from pg_catalog.pg_namespace");
querySchemas.on("row", function (row, result) {
    "use strict";
    result.addRow(row);
});
querySchemas.on("end", function (result) {
    "use strict";
    console.log(result.rows);
    client.end();
});

MySQL "my_connection" connector

Connector "my_connection"
var connection = require("my_connection");
connection.query('my_url', 
			function result_getCatLogDetails(getCatLogDetails_err, getCatLogDetails_rows, 
			getCatLogDetails_fields) {
		
				if (getCatLogDetails_err) {
			        logContent += '|ERROR'+";";
					logContent += getCatLogDetails_err.message+";";
					utils.logAppDetails(logContent);
			        deferred.reject(new Error(getCatLogDetails_err));
			    } else {
			        deferred.resolve(getCatLogDetails_rows);
			    }
			});

Connector per NoSQL Vendor

Even though there is no NoSQL server-side representation, a client-side representation is created based on the API access. The Node.js analyzer will create links from JavaScript functions to NoSQL "Database" or "Table" equivalents, as follows:
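For instance, a minimal Mongoose sketch (model and collection names are hypothetical) where a function access would be linked to the collection equivalent:

const mongoose = require('mongoose');

// Hypothetical model: the "Todo" collection is the NoSQL "Table" equivalent
const Todo = mongoose.model('Todo', new mongoose.Schema({
  title: String,
  completed: Boolean
}));

function findOpenTodos() {
  // This access would be linked to the client-side "Todo" collection representation
  return Todo.find({ completed: false });
}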

Azure

Azure Service Bus


NodeJS Azure Service Bus Publisher and NodeJS Azure Service Bus Receiver objects are created when the following methods are used:

Object created | Methods from ServiceBusClient (@azure/service-bus package) | Methods from ServiceBusService (azure-sb package)
NodeJS Azure Service Bus Publisher | sendMessages, scheduleMessages | createSender, createReceiver
NodeJS Azure Service Bus Receiver | receiveMessages, peekMessages, subscribe, getMessageIterator, receiveDeferredMessages | receiveQueueMessage, receiveSubscriptionMessage

The name of the object created is that of the Queue (or the Topic). Whenever the evaluation of the Queue (or Topic) name fails, an Unknown object is created.
For publishers, a callLink from the callable object (usually a Function or Method) containing the call to the supported method to the publisher is added (see example below).
For receivers, some APIs (such as the subscribe API) require providing a handler function; in such a case, a callLink is created from the receiver to that handler. In other cases, a callLink is added from the receiver to the callable object (usually a Function or a Method) containing the call to the supported method (see example below).
Whenever an Azure Service Bus Publisher has the same name as an Azure Service Bus Receiver, the web service linker extension will create a call link from the Publisher to the Receiver.

Example

When analyzing the following source code:

import { ServiceBusClient, ServiceBusMessage, ServiceBusMessageBatch } from "@azure/service-bus";

async function my_publish(Messages){
    const sbClient = new ServiceBusClient(connectionString);
    const sender = sbClient.createSender("MyQueue");
    await sender.sendMessages(Messages);
}

async function my_receive(){
    const sbClient = new ServiceBusClient(connectionString);
    let receiver = sbClient.createReceiver("MyQueue", {
        receiveMode: "receiveAndDelete"
    });
    let messages = await receiver.receiveMessages(1);
}

you will get the following result:

Support for blobs


Whenever a call to a method carrying a CRUD operation on an Azure Blob is found in the source code, this extension evaluates the name of the container in which the operation is made, and a link is created from the caller of the method to that blob container. The supported methods and the types of links created are listed below. If the evaluation of the container name fails (either due to missing information in the source code or to limitations in the evaluation), a link is created to an Unknown container.
For methods copying a blob from one container to another, a useSelectLink is created to the source container and both a useInsertLink and a useUpdateLink are created to the destination container, as sketched below.
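A minimal sketch of the copy case (container and blob names are hypothetical; account_url and defaultAzureCredential as in the example further down):

const { BlobServiceClient } = require("@azure/storage-blob");

const blobServiceClient = new BlobServiceClient(account_url, defaultAzureCredential);

async function copyReport() {
  // useSelect link to the source container
  const sourceBlob = blobServiceClient
    .getContainerClient("source_container")
    .getBlockBlobClient("report.csv");
  // useInsert and useUpdate links to the destination container
  const destBlob = blobServiceClient
    .getContainerClient("dest_container")
    .getBlockBlobClient("report.csv");
  // beginCopyFromURL copies the blob from the source container to the destination
  await destBlob.beginCopyFromURL(sourceBlob.url);
}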

The methods are grouped by the client on which they are called:

Methods from containerClient:

const {BlobServiceClient} = require("@azure/storage-blob");
const blobServiceClient = new BlobServiceClient(...);
const containerClient = blobServiceClient.getContainerClient(...);

Methods from BlobClients (blockBlobClient, PageBlobClient, AppendBlobClient or BlobBatchClient):

const blockBlobClient = containerClient.getBlockBlobClient(...);

Methods from BlobService:

const azure = require('azure-storage');
const blobService = azure.createBlobService(...);

Link types per method:

useInsert
  • containerClient: uploadBlockBlob
  • BlobClients: syncUploadFromURL, upload, uploadPages, uploadPagesFromURL, beginCopyFromURL
  • BlobService: createAppendBlobFromBrowserFile, createAppendBlobFromLocalFile, createAppendBlobFromStream, createAppendBlobFromText, createBlobSnapshot, createBlockBlobFromLocalFile, createBlockBlobFromStream, createBlockBlobFromText, createBlockFromStream, createBlockFromText, createBlockFromURL, createOrReplaceAppendBlob, createPageBlob, createPageBlobFromLocalFile, createPageBlobFromStream, createPagesFromStream, createWriteStreamToBlockBlob, createWriteStreamToNewAppendBlob, createWriteStreamToNewPageBlob, startCopyBlob

useUpdate
  • containerClient: uploadBlockBlob
  • BlobClients: commitBlockList, stageBlock, stageBlockFromURL, syncUploadFromURL, upload, uploadBrowserData, uploadData, uploadFile, uploadStream, uploadPages, uploadPagesFromURL, appendBlock, appendBlockFromURL, beginCopyFromURL, startCopyIncremental
  • BlobService: appendBlockFromStream, appendBlockFromText, appendFromBrowserFile, appendFromLocalFile, appendFromStream, appendFromText, commitBlocks, createAppendBlobFromBrowserFile, createAppendBlobFromLocalFile, createAppendBlobFromStream, createAppendBlobFromText, createBlobSnapshot, createBlockBlobFromLocalFile, createBlockBlobFromStream, createBlockBlobFromText, createBlockFromStream, createBlockFromText, createBlockFromURL, createOrReplaceAppendBlob, createPageBlob, createPageBlobFromLocalFile, createPageBlobFromStream, createPagesFromStream, createWriteStreamToBlockBlob, createWriteStreamToExistingAppendBlob, createWriteStreamToExistingPageBlob, startCopyBlob

useDelete
  • containerClient: deleteBlob, deleteIfExists
  • BlobClients: clearPages, deleteBlobs, delete, deleteIfExists
  • BlobService: deleteBlob, deleteBlobIfExists, deleteContainer, deleteContainerIfExists

useSelect
  • BlobClients: getBlockList, query, createSnapshot, download, downloadToBuffer, downloadToFile, beginCopyFromURL, startCopyIncremental
  • BlobService: createBlobSnapshot, createReadStream, getBlobToLocalFile, getBlobToStream, getBlobToText, startCopyBlob, createBlockFromURL

Example

When analyzing the following code, a blob container named my_container is created, as well as useInsert and useUpdate links from the main function to that container:

const { BlobServiceClient } = require("@azure/storage-blob");

const blobServiceClient = new BlobServiceClient(account_url, defaultAzureCredential);

async function main() {
  const containerClient = blobServiceClient.getContainerClient("my_container");
  const content = "Hello world!";
  const blockBlobClient = containerClient.getBlockBlobClient("blobName");
  const uploadBlobResponse = await blockBlobClient.upload(content, content.length);
}

 


Amazon Web Services (AWS)

Support for lambda


Lambda services allow source code to be executed in the cloud. Execution can be configured to be triggered by AWS events.

Lambda functions can be deployed using several deployment frameworks. The supported deployment frameworks are listed on this page.

When a lambda function is created and its runtime is nodejs, the current extension is responsible for linking the lambda objects and their triggers with the JavaScript handler functions.

Example

Let us consider source code defining a lambda function that has two triggers: an SQS queue and an API Gateway. The lambda function has a nodejs runtime and the handler is given by the handler function's full name.

If the lambda function is deployed using a supported deployment framework (such as CloudFormation), the analysis will create a lambda function object, an SQS receiver object, and an API Gateway object. Each of these objects has a runtime property (nodejs) and a handler property containing the handler function's full name.

If the current extension finds a JavaScript function matching the handler's full name, a link to that function is added from the lambda function, the SQS queue, and the API Gateway objects, as illustrated below.
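For illustration, if the deployment template declared the handler full name as index.handler (hypothetical), the extension would look for a matching JavaScript function such as:

// index.js -- the "index.handler" full name resolves to this exported function
exports.handler = async function (event) {
    // process the triggering SQS or API Gateway event
    return { statusCode: 200, body: "ok" };
};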

Lambda Invocation

A lambda can be executed through an invocation. A NodeJS Call to AWS Lambda Function object is then created; its name is that of the invoked lambda function. com.castsoftware.wbslinker will then link that object to any matching lambda objects. Invocation is currently only supported for SDK V2.


Example for AWS Lambda sdk v2


var AWS = require('aws-sdk');
// Set the region
AWS.config.update({region: 'REGION'});
AWS.config.credentials = new AWS.CognitoIdentityCredentials({IdentityPoolId: 'IDENTITY_POOL_ID'});

// Prepare to call Lambda function.
var lambda = new AWS.Lambda({region: 'REGION', apiVersion: '2015-03-31'});
var pullParams = {
  FunctionName : 'Function_Name',
  InvocationType : 'RequestResponse',
  LogType : 'None'
};

function run() {
  // Call the Lambda function
  lambda.invoke(pullParams, function(err, data) {
    if (err) console.log(err, err.stack);
    else console.log(data);
  });
}
run();





Support for SQS


        | Methods from SDK V2 sqs client | Commands from SDK V3 (imported from '@aws-sdk/client-sqs')
Publish | sendMessage, sendMessageBatch  | SendMessageCommand
Receive | receiveMessage                 | ReceiveMessageCommand

Code samples for sdk v2

This code will publish a message into the "SQS_QUEUE_URL" queue:

// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region 
AWS.config.update({region: 'REGION'});

// Create an SQS service object
var sqs = new AWS.SQS({apiVersion: '2012-11-05'});

var params = {
   // Remove DelaySeconds parameter and value for FIFO queues
  DelaySeconds: 10,
  MessageAttributes: {
    "Title": {
      DataType: "String",
      StringValue: "The Whistler"
    },
    "Author": {
      DataType: "String",
      StringValue: "John Grisham"
    },
    "WeeksOn": {
      DataType: "Number",
      StringValue: "6"
    }
  },
  MessageBody: "Information about current NY Times fiction bestseller for week of 12/11/2016.",
  // MessageDeduplicationId: "TheWhistler",  // Required for FIFO queues
  // MessageGroupId: "Group1",  // Required for FIFO queues
  QueueUrl: "SQS_QUEUE_URL"
};

sqs.sendMessage(params, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data.MessageId);
  }
});

This code will receive a message from the queue "SQS_QUEUE_URL":

// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region 
AWS.config.update({region: 'REGION'});

// Create an SQS service object
var sqs = new AWS.SQS({apiVersion: '2012-11-05'});

var queueURL = "SQS_QUEUE_URL";

var params = {
  AttributeNames: ["SentTimestamp"],
  MaxNumberOfMessages: 10,
  MessageAttributeNames: ["All"],
  QueueUrl: queueURL,
  VisibilityTimeout: 20,
  WaitTimeSeconds: 0
};

sqs.receiveMessage(params, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data.MessageId);
  }
});

What results can you expect?

Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):


When the evaluation of the queue name fails, a Node.js AWS SQS Unknown Publisher (or Receiver) will be created.
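For instance, a queue URL computed at runtime cannot be evaluated statically; in a sketch like the following (helper name hypothetical), an Unknown Publisher would be created:

var AWS = require('aws-sdk');
var sqs = new AWS.SQS({apiVersion: '2012-11-05'});

// getTenantQueueName() is a hypothetical helper: the queue URL is only known at
// runtime, so the analyzer cannot evaluate the queue name statically.
var queueUrl = process.env.QUEUE_PREFIX + getTenantQueueName();

sqs.sendMessage({ QueueUrl: queueUrl, MessageBody: "..." }, function (err, data) {});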


Code samples for sdk v3

This code will publish a message into the "SQS_QUEUE_URL" queue:

const { SQSClient } = require("@aws-sdk/client-sqs");
const REGION = "us-east-1";
const sqsClient = new SQSClient({ region: REGION });
export  { sqsClient };

const { SendMessageCommand } = require("@aws-sdk/client-sqs");

const params = {
  DelaySeconds: 10,
  MessageAttributes: {
    Title: {
      DataType: "String",
      StringValue: "The Whistler",
    },
    Author: {
      DataType: "String",
      StringValue: "John Grisham",
    },
    WeeksOn: {
      DataType: "Number",
      StringValue: "6",
    },
  },
  MessageBody: "Information about current NY Times fiction bestseller for week of 12/11/2016."
  QueueUrl: "SQS_QUEUE_URL"

};

const run = async () => {
  try {
    const data = await sqsClient.send(new SendMessageCommand(params));
    console.log("Success, message sent. MessageID:", data.MessageId);
    return data; // For unit tests.
  } catch (err) {
    console.log("Error", err);
  }
};
run();

This code will receive a message from the queue "SQS_QUEUE_URL":

const { SQSClient } = require("@aws-sdk/client-sqs");
const REGION = "us-east-1";
const sqsClient = new SQSClient({ region: REGION });
export  { sqsClient };

const { ReceiveMessageCommand } = require("@aws-sdk/client-sqs");

const queueURL = 'SQS_QUEUE_URL'; 
const params = {
  AttributeNames: ["SentTimestamp"],
  MaxNumberOfMessages: 10,
  MessageAttributeNames: ["All"],
  QueueUrl: queueURL,
  VisibilityTimeout: 20,
  WaitTimeSeconds: 0,
};

const receive = async () => {
  try {
    const data = await sqsClient.send(new ReceiveMessageCommand(params));
    if (data.Messages) {
      console.log("messages obtained");
    } else {
      console.log("No messages");
    }
    return data; // For unit tests.
  } catch (err) {
    console.log("Receive Error", err);
  }
};
receive();

What results can you expect?

Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):

When the evaluation of the queue name fails, a Node.js AWS SQS Unknown Publisher (or Receiver) will be created.


Support for AWS S3

The list below gives, for each link type, the matching methods from the SDK V2 s3client (import {AWS} from 'aws-sdk'; const s3client = new AWS.S3()), the SDK V3 s3client (import {S3} from '@aws-sdk/client-s3'; const s3client = new S3()), and the SDK V3 commands (imported from '@aws-sdk/client-s3').

No Link
  • SDK V2 methods: createBucket
  • SDK V3 commands: CreateBucketCommand

callLink
  • SDK V2 methods: createMultipartUpload, createPresignedPost, abortMultipartUpload, completeMultipartUpload, deleteBucketAnalyticsConfiguration, deleteBucketCors, deleteBucketEncryption, deleteBucketInventoryConfiguration, deleteBucketLifecycle, deleteBucketMetricsConfiguration, deleteBucketPolicy, deleteBucketReplication, deleteBucketTagging, deleteBucketWebsite, deleteObjectTagging, deletePublicAccessBlock, getBucketAccelerateConfiguration, getBucketAcl, getBucketAnalyticsConfiguration, getBucketCors, getBucketEncryption, getBucketInventoryConfiguration, getBucketLifecycle, getBucketLifecycleConfiguration, getBucketLocation, getBucketLogging, getBucketMetricsConfiguration, getBucketNotification, getBucketNotificationConfiguration, getBucketPolicy, getBucketPolicyStatus, getBucketReplication, getBucketTagging, getBucketVersioning, getBucketWebsite, getObjectAcl, getObjectLegalHold, getObjectLockConfiguration, getObjectRetention, getObjectTagging, getPublicAccessBlock, getSignedUrl, listBuckets, listBucketAnalyticsConfigurations, listBucketInventoryConfigurations, listBucketMetricsConfigurations, listMultipartUploads, listObjectVersions, listParts, putBucketLogging, putBucketAnalyticsConfiguration, putBucketLifecycleConfiguration, putBucketMetricsConfiguration, putBucketNotification, putBucketNotificationConfiguration, putBucketPolicy, putBucketReplication, putBucketRequestPayment, putBucketTagging, putBucketVersioning, putObjectAcl, putObjectLegalHold, putObjectLockConfiguration, putObjectRetention, putObjectTagging, putPublicAccessBlock, putBucketAccelerateConfiguration, putBucketAcl, putBucketCors, putBucketEncryption, putBucketInventoryConfiguration, putBucketLifecycle, upload, uploadPart, uploadPartCopy
  • SDK V3 methods: abortMultipartUpload, completeMultipartUpload, copyObject, createBucket, createMultipartUpload, deleteBucket, deleteBucketAnalyticsConfiguration, deleteBucketCors, deleteBucketEncryption, deleteBucketIntelligentTieringConfiguration, deleteBucketInventoryConfiguration, deleteBucketLifecycle, deleteBucketMetricsConfiguration, deleteBucketOwnershipControls, deleteBucketPolicy, deleteBucketReplication, deleteBucketTagging, deleteBucketWebsite, deleteObjectTagging, deletePublicAccessBlock, destroy, getBucketAccelerateConfiguration, getBucketAcl, getBucketAnalyticsConfiguration, getBucketCors, getBucketEncryption, getBucketIntelligentTieringConfiguration, getBucketInventoryConfiguration, getBucketLifecycleConfiguration, getBucketLocation, getBucketLogging, getBucketMetricsConfiguration, getBucketNotificationConfiguration, getBucketOwnershipControls, getBucketPolicy, getBucketPolicyStatus, getBucketReplication, getBucketRequestPayment, getBucketTagging, getBucketVersioning, getBucketWebsite, getObjectAcl, getObjectLegalHold, getObjectLockConfiguration, getObjectRetention, getObjectTagging, getPublicAccessBlock, headBucket, headObject, listBucketAnalyticsConfigurations, listBucketIntelligentTieringConfigurations, listBucketInventoryConfigurations, listBucketMetricsConfigurations, listBuckets, listMultipartUploads, listObjectVersions, listParts, putBucketAccelerateConfiguration, putBucketAcl, putBucketCors, putBucketEncryption, putBucketIntelligentTieringConfiguration, putBucketInventoryConfiguration, putBucketLifecycleConfiguration, putBucketLogging, putBucketMetricsConfiguration, putBucketNotificationConfiguration, putBucketOwnershipControls, putBucketPolicy, putBucketReplication, putBucketRequestPayment, putBucketTagging, putBucketVersioning, putBucketWebsite, putObjectAcl, putObjectLegalHold, putObjectLockConfiguration, putObjectRetention, putObjectTagging, putPublicAccessBlock, restoreObject, selectObjectContent, send, uploadPart, uploadPartCopy, writeGetObjectResponse
  • SDK V3 commands: AbortMultipartUploadCommand, CompleteMultipartUploadCommand, CreateMultipartUploadCommand, DeleteBucketAnalyticsConfigurationCommand, DeleteBucketCorsCommand, DeleteBucketEncryptionCommand, DeleteBucketIntelligentTieringConfigurationCommand, DeleteBucketInventoryConfigurationCommand, DeleteBucketLifecycleCommand, DeleteBucketMetricsConfigurationCommand, DeleteBucketOwnershipControlsCommand, DeleteBucketPolicyCommand, DeleteBucketReplicationCommand, DeleteBucketTaggingCommand, DeleteBucketWebsiteCommand, DeleteObjectTaggingCommand, DeletePublicAccessBlockCommand, GetBucketAccelerateConfigurationCommand, GetBucketAclCommand, GetBucketAnalyticsConfigurationCommand, GetBucketCorsCommand, GetBucketEncryptionCommand, GetBucketIntelligentTieringConfigurationCommand, GetBucketInventoryConfigurationCommand, GetBucketLifecycleConfigurationCommand, GetBucketLocationCommand, GetBucketLoggingCommand, GetBucketMetricsConfigurationCommand, GetBucketNotificationConfigurationCommand, GetBucketOwnershipControlsCommand, GetBucketPolicyCommand, GetBucketPolicyStatusCommand, GetBucketReplicationCommand, GetBucketRequestPaymentCommand, GetBucketTaggingCommand, GetBucketVersioningCommand, GetBucketWebsiteCommand, GetObjectAclCommand, GetObjectLegalHoldCommand, GetObjectLockConfigurationCommand, GetObjectRetentionCommand, GetObjectTaggingCommand, GetPublicAccessBlockCommand, HeadBucketCommand, HeadObjectCommand, ListBucketAnalyticsConfigurationsCommand, ListBucketIntelligentTieringConfigurationsCommand, ListBucketInventoryConfigurationsCommand, ListBucketMetricsConfigurationsCommand, ListMultipartUploadsCommand, ListObjectVersionsCommand, ListPartsCommand, PutBucketAccelerateConfigurationCommand, PutBucketAclCommand, PutBucketAnalyticsConfigurationCommand, PutBucketCorsCommand, PutBucketEncryptionCommand, PutBucketIntelligentTieringConfigurationCommand, PutBucketInventoryConfigurationCommand, PutBucketLifecycleConfigurationCommand, PutBucketLoggingCommand, PutBucketMetricsConfigurationCommand, PutBucketNotificationConfigurationCommand, PutBucketOwnershipControlsCommand, PutBucketPolicyCommand, PutBucketReplicationCommand, PutBucketRequestPaymentCommand, PutBucketTaggingCommand, PutBucketVersioningCommand, PutBucketWebsiteCommand, PutObjectAclCommand, PutObjectLegalHoldCommand, PutObjectLockConfigurationCommand, PutObjectRetentionCommand, PutObjectTaggingCommand, PutPublicAccessBlockCommand, UploadPartCommand, UploadPartCopyCommand, WriteGetObjectResponseCommand

useInsertLink
  • SDK V2 methods: putObject, copyObject
  • SDK V3 methods: putObject, copyObject
  • SDK V3 commands: RestoreObjectCommand, PutObjectCommand, CopyObjectCommand

useDeleteLink
  • SDK V2 methods: deleteBucket, deleteObject, deleteObjects
  • SDK V3 methods: deleteBucket, deleteObject, deleteObjects
  • SDK V3 commands: DeleteBucketCommand, DeleteObjectCommand, DeleteObjectsCommand

useSelectLink
  • SDK V2 methods: getObject, getObjectTorrent, listObjects, listObjectsV2, copyObject
  • SDK V3 methods: getObject, getObjectTorrent, listObjects, listObjectsV2, copyObject
  • SDK V3 commands: GetObjectCommand, ListObjectsCommand, ListObjectsV2Command, SelectObjectContentCommand, GetObjectTorrentCommand, CopyObjectCommand

useUpdateLink
  • SDK V2 methods: putBucketAnalyticsConfiguration
  • SDK V3 methods: putBucketAnalyticsConfiguration
  • SDK V3 commands: RestoreObjectCommand, PutObjectCommand, CopyObjectCommand

Code samples for SDK v3

This code will create an S3 bucket named "BucketTest1" on an AWS server:

// ES5 example
const {S3Client} = require("@aws-sdk/client-s3");
// Set the AWS Region.
const REGION = "us-east-1";
// Create an Amazon S3 service client object.
const s3Client = new S3Client({ region: REGION });
export { s3Client };

const {CreateBucketCommand} = require("@aws-sdk/client-s3");

const {PutObjectCommand} = require("@aws-sdk/client-s3");

const {DeleteBucketCommand} = require("@aws-sdk/client-s3");

import {path} from "path";
import {fs} from "fs";

const file = "OBJECT_PATH_AND_NAME"; // Path to and name of object. For example '../myFiles/index.js'.
const fileStream = fs.createReadStream(file);

export const bucket = {
  Bucket: "BucketTest1",
  ACL : "public-read'"
};

// Create the Amazon S3 bucket.
export const runTest = async () => {
  try {
    const data = await s3Client.send(new CreateBucketCommand(bucket));
    console.log("Success", data);
    return data; // For unit tests.
  } catch (err) {
    console.log("Error", err);
  }
};
runTest();

export const uploadParams = {
  Bucket: "BucketTest1",
  // Add the required 'Key' parameter using the 'path' module.
  Key: path.basename(file),
  // Add the required 'Body' parameter
  Body: fileStream,
};

// Upload file to specified bucket.
export const runTestPut = async () => {
  try {
    const data = await s3Client.send(new PutObjectCommand(uploadParams));
    console.log("Success", data);
    return data; // For unit tests.
  } catch (err) {
    console.log("Error", err);
  }
};
runTestPut();

// Delete the specified bucket.
export const runTestDelete = async () => {
  try {
    const data = await s3Client.send(new DeleteBucketCommand(bucket));
    console.log("Success", data);
    return data; // For unit tests.
  } catch (err) {
    console.log("Error", err);
  }
};
runTestDelete();

What results can you expect?

Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):

Support for SNS

The following APIs are supported:

For SDK V2

  • AWS.SNS({apiVersion: '2010-03-31'}).publish(params)
  • AWS.SNS({apiVersion: '2010-03-31'}).subscribe(params)

For the publish method, a NodeJS AWS SNS Publisher object is created. Its name is that of the topic.

For the subscribe method, a NodeJS AWS SNS Subscriber object is created. Its name is that of the topic. Then for each supported protocol, an object is created with a callLink from the subscriber to that object.


For SDK V3

  • new SNSClient({region: REGION}).send(new SubscribeCommand(params));
  • new SNSClient({region: REGION}).send(new PublishCommand(params));

For the PublishCommand, a NodeJS AWS SNS Publisher object is created. Its name is that of the topic.

For the SubscribeCommand, a NodeJS AWS SNS Subscriber object is created. Its name is that of the topic. Then for each supported protocol, an object is created with a callLink from the subscriber to that object. 


The supported protocols are the following:

Protocol | Object created | Name of the object
email | NodeJS Email | an Email (the email addresses are not evaluated)
sms | NodeJS SMS | an SMS (the SMS numbers are not evaluated)
http/https | NodeJS AWS Post HttpRequest service | the url (evaluated from the endpoint)
sqs | NodeJS AWS Simple Queue Service Publisher | the name of the queue (evaluated from the endpoint)
lambda | NodeJS Call to AWS Lambda Function | the name of the lambda function (evaluated from the endpoint)

The com.castsoftware.wbslinker extension will create a callLink between SNS Publishers and SNS Subscribers that have the same name.

Example api v2

When analyzing the following source code:

var AWS = require('aws-sdk');
// Set region
AWS.config.update({region: 'REGION'});
// Create promise and SNS service object

var sns = new AWS.SNS({apiVersion: '2010-03-31'})

function my_subscribe(params) {
    sns.subscribe(params, function (err, data) {
        if (err) console.log(err, err.stack); // an error occurred
        else console.log(data);           // successful response
    });
}

function my_publish(params) {
    sns.publish(params);
}

function foo() {
    let topicArn = "arn:aws:sns:eu-west-3:757025016730:testTopic";
    my_subscribe({Protocol: "EMAIL", TopicArn: topicArn, Endpoint: "EMAIL_ADDRESS"})
    my_subscribe({Protocol: "SMS", TopicArn: topicArn, Endpoint: "911"})
    my_subscribe({Protocol: "LAMBDA", TopicArn: topicArn, Endpoint: "arn:aws:lambda:eu-west-3:757025016730:testLambda"})
    my_subscribe({Protocol: "HTTP", TopicArn: topicArn, Endpoint: "http:/myapi.api.test.com"})
 }
function bar() {
    const params2 = {
        TopicArn: "arn:aws:sns:eu-west-3:757025016730:testTopic",
        Message: "MESSAGE_TEXT"
    };
    my_publish(params2)
}

Example api v3

When analyzing the following source code:

const {SNSClient} = require("@aws-sdk/client-sns");
// Set the AWS Region.
const REGION = "us-east-1";
// Create SNS service object.
const snsClient = new SNSClient({region: REGION});
export {snsClient};

// ====== Import required AWS SDK clients and commands for Node.js
const {PublishCommand} = require("@aws-sdk/client-sns");
const {SubscribeCommand} = require("@aws-sdk/client-sns");

const subscription = async () => {
    try {
        let topicArn = "arn:aws:sns:eu-west-3:757025016730:testTopic";
        const data1 = await snsClient.send(new SubscribeCommand({Protocol: "EMAIL", TopicArn: topicArn, Endpoint: "test@mail.com"}));
        const data2 = await snsClient.send(new SubscribeCommand({Protocol: "SMS", TopicArn: topicArn, Endpoint: "911"}));
        const data3 = await snsClient.send(new SubscribeCommand({Protocol: "LAMBDA", TopicArn: topicArn, Endpoint: "arn:aws:lambda:eu-west-3:757025016730:testLambda"}));
        const data4 = await snsClient.send(new SubscribeCommand({Protocol: "HTTP", TopicArn: topicArn, Endpoint: "http:/myapi.api.test.com"}));
        console.log("Success.", data1);
        return data1;
    } catch (err) {
        console.log("Error", err.stack);
    }
};
subscription();

const run = async () => {
    // Set the parameters
    const params = {
        Message: "MESSAGE_TEXT", // MESSAGE_TEXT
        TopicArn: "arn:aws:sns:eu-west-3:757025016730:testTopic", //TOPIC_ARN
    };
    try {
        const data = await snsClient.send(new PublishCommand(params));
        console.log("Success.", data);
        return data; // For unit tests.
    } catch (err) {
        console.log("Error", err.stack);
    }
};
run();

Known limitations for AWS support

  • The use of AWS.SQS with promises is not supported. For instance, no link would be created between the receiver and the handler function defined in the .then() call in the following source code:
    sqs.receiveMessage(params).promise().then( () => {});
  • The use of AWS.SQS with send() is not supported. For instance, no link would be created between the receiver and the handler function defined in the .send() call in the following source code:
    var request = sqs.receiveMessage(params);
    request.send(() => {});
  • Use of access points is not supported.

Linking

The extension com.castsoftware.wbslinker is responsible for matching NodeJS Call to AWS Lambda Function objects to Lambda Function objects such as Java AWS Lambda Function during application-level analysis.

Support for AWS XRay

aws-xray wraps AWS method calls in order to report status and load information. Previously, this wrapping prevented the extension from creating objects and links. With support for AWS XRay starting in 2.6.0-beta4, these objects and links are created.

Code samples

This code wraps the AWS SDK, then creates a DynamoDB instance and a DocumentClient instance.

// Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: MIT-0

// default imports
const AWSXRay = require('aws-xray-sdk-core')
const AWS = AWSXRay.captureAWS(require('aws-sdk')) // Encapsulate AWS SDK
const { metricScope, Unit } = require("aws-embedded-metrics")
const DDB = new AWS.DynamoDB({ apiVersion: "2012-10-08" }) // use AWS as usual
const { v1: uuidv1 } = require('uuid');

// environment variables
const { TABLE_NAME, ENDPOINT_OVERRIDE, REGION } = process.env
const options = { region: REGION }
AWS.config.update({ region: REGION })

if (ENDPOINT_OVERRIDE !== "") {
    options.endpoint = ENDPOINT_OVERRIDE
}

const docClient = new AWS.DynamoDB.DocumentClient(options)
// response helper
const response = (statusCode, body, additionalHeaders) => ({
    statusCode,
    body: JSON.stringify(body),
    headers: { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*', ...additionalHeaders },
})

function isValidRequest(context, event) {
    return (event.body !== null)
}

function getCognitoUsername(event){
    let authHeader = event.requestContext.authorizer;
    if (authHeader !== null)
    {
        return authHeader.claims["cognito:username"];
    }
    return null;

}

function addRecord(event) {

    let usernameField = {
        "cognito-username": getCognitoUsername(event)
    }

    // auto generated date fields
    let d = new Date()
    let dISO = d.toISOString()
    let auto_fields = {
        "id": uuidv1(),
        "creation_date": dISO,
        "lastupdate_date": dISO
    }

    //merge the json objects
    let item_body = {...usernameField, ...auto_fields, ...JSON.parse(event.body) }

    console.log(item_body);
    
    //final params to DynamoDB
    const params = {
        TableName: TABLE_NAME,
        Item: item_body
    }

    return docClient.put(params)
}

// Lambda Handler
exports.addToDoItem =
    metricScope(metrics =>
        async (event, context, callback) => {
            metrics.setNamespace('TodoApp')
            metrics.putDimensions({ Service: "addTodo" })
            metrics.setProperty("RequestId", context.requestId)

            if (!isValidRequest(context, event)) {
                metrics.putMetric("Error", 1, Unit.Count)
                return response(400, { message: "Error: Invalid request" })
            }

            try {
                let data = await addRecord(event).promise()
                metrics.putMetric("Success", 1, Unit.Count)
                return response(200, data)
            } catch (err) {
                metrics.putMetric("Error", 1, Unit.Count)
                return response(400, { message: err.message })
            }
        }
    )

What results can you expect?

Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):


Analysis of the code sample

Support for AWS Embedded Metrics

aws-embedded-metrics encapsulates functions in order to provide metrics. This encapsulation previously prevented the extension from creating the corresponding links. With the support of AWS Embedded Metrics starting in 2.6.0-beta4, these links are created.

Code samples

The code sample is the same as the one shown in the AWS XRay section above. The part relevant to aws-embedded-metrics is the Lambda handler "addToDoItem", which is wrapped by metricScope():

// Lambda Handler
exports.addToDoItem =
    metricScope(metrics =>
        async (event, context, callback) => {
            metrics.setNamespace('TodoApp')
            metrics.putDimensions({ Service: "addTodo" })
            metrics.setProperty("RequestId", context.requestId)

            if (!isValidRequest(context, event)) {
                metrics.putMetric("Error", 1, Unit.Count)
                return response(400, { message: "Error: Invalid request" })
            }

            try {
                let data = await addRecord(event).promise()
                metrics.putMetric("Success", 1, Unit.Count)
                return response(200, data)
            } catch (err) {
                metrics.putMetric("Error", 1, Unit.Count)
                return response(400, { message: err.message })
            }
        }
    )

What results can you expect?

Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):

Analysis of the code sample

Call to Program

The NodeJS extension now supports calls to external programs made using the child_process module.

The fork() function is not handled, as its only purpose is to fork Node.js programs.
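
For example, a fork() call such as the following (a minimal sketch; worker.js is a hypothetical script) does not produce any Call to Program object:

const { fork } = require('child_process');

// fork() can only start new Node.js processes, so the analyzer does not
// create a Call to Program object for this call
const worker = fork('./worker.js'); // hypothetical worker script

worker.on('message', (msg) => {
    console.log('message from worker:', msg);
});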

These declarations create a call to a Java program/JAR file:

const exec = require('child_process').exec;

exec('java -cp com.castsoftware.Archive -jar jarFile.jar', (e, stdout, stderr) => {
    if (e instanceof Error) {
        console.error(e);
        throw e;
    }

    console.log('stdout ', stdout);
    console.log('stderr ', stderr);
});

const cp = require('child_process');
const class_name = 'com.castsoftware.Foo'

function call_foo(req, resp) {
    const args = [
        '-cp',
        '/bin',
        class_name
    ];
    const proc = cp.spawn('java', args);
}

This declaration creates a call to a Python program:

const execFile = require('child_process').execFile;
const python_file = 'app.py'

const child = execFile('python', [python_file], (error, stdout, stderr) => {

    if (error) {
        console.error('stderr', stderr);
        throw error;
    }
    console.log('stdout', stdout);
});

Restify

The NodeJS extension now supports routing using the restify module. The following is an example of an application using restify to handle several URIs:

var restify = require('restify');

function send(req, res, next) {

  res.send('hello ' + req.params.name);
  next();
}

var server = restify.createServer();

server.post('/hello', function create(req, res, next) {

  res.send(201, Math.random().toString(36).substr(3, 8));
  return next();

});

server.put('/hello', send);
server.get('/hello/:name', function create(req, res, next) {

  res.send(201, Math.random().toString(36).substr(3, 8));
  return next();

},send);
server.head('/hello/:name', send);

server.del('/hello/:name', function rm(req, res, next) {

  res.send(204);
  return next();

});

server.listen(8080, function() {

  console.log('%s listening at %s', server.name, server.url);

});

SQL Named Query 

When an SQL query is executed directly, a CAST SQL NamedQuery object will be created:

var oracledb = require('oracledb');

var connection = oracledb.getConnection(
  {
    user          : "hr",
    password      : "welcome",
    connectString : "localhost/XE"
  }
);

oracledb.getConnection(
  {
    user          : "hr",
    password      : "welcome",
    connectString : "localhost/XE"
  },
  function(err, connection)
  {
    if (err) { console.error(err); return; }
    connection.execute(
      "SELECT department_id, department_name "
    + "FROM titles "
    + "WHERE department_id < 70 "
    + "ORDER BY department_id",
      function(err, result)
      {
        if (err) { console.error(err); return; }
        console.log(result.rows);
      });
  });

Structural Rules

The following structural rules are provided:


Known Limitations

In this section we list the most significant functional limitations that may affect the analysis of applications using Node.js:

  • In external links degraded mode, only statements with a FROM clause are correctly handled.
  • NodeJS objects are only supported for the ES5 standard.
  • Analysis of AWS Lambda functions requires access to the serverless.yml file mapping routes and handlers together (see the sketch after this list).
  • Known limitations for AWS support are detailed in the dedicated section above.
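
For reference, a minimal serverless.yml of the kind the analyzer reads might look like the sketch below. This is an illustration under stated assumptions, not taken from the product documentation: the service name, runtime, route, and handler file are hypothetical; what matters to the analysis is the mapping between a route and its handler function.

service: todo-service              # hypothetical service name

provider:
  name: aws
  runtime: nodejs12.x

functions:
  addToDoItem:
    handler: handler.addToDoItem   # handler.js exports addToDoItem (hypothetical)
    events:
      - http:
          path: todos              # hypothetical route
          method: post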