This documentation is not maintained. Please refer to doc.castsoftware.com/technologies to find the latest updates.

Summary: This document provides basic information about the extension Node.js + Express support for Web applications.

Extension ID

com.castsoftware.nodejs

What's new?

Please see Node.js - 2.5 - Release Notes for more information.

Description

This extension provides support for Node.js. Node.js is a JavaScript runtime built on Chrome's V8 JavaScript engine; it uses an event-driven, non-blocking I/O model that makes it lightweight and efficient.

CAST recommends using this extension with HTML5 and JavaScript ≥ 2.0.0 for the best results.

In what situation should you install this extension?

Regarding Front-End to Back-End connections, we do support the following cross-technology stacks:

  • iOS Front-End connected to Node.js/PostgreSQL Back-End
  • iOS Front-End connected to Node.js/MSSQL Back-End
  • AngularJS Front-End connected to Node.js/MongoDB Back-End

If your Web application contains Node.js source code and you want to view these object types and their links with other objects, then you should install this extension. The extension:

  • creates a Node.js application object when an instance has been found
  • creates Node.js operations which represent the entry points of web services

Express framework


The following declarations will create a Node.js Get Operation:

app.get('/login', function (req, res) {
    "use strict";
    console.log('login ' + req.url);
    console.log('login ' + req.query.pseudo);
    var currentSession = getSessionId(req, res);
    datab.userExists(currentSession, req.query.pseudo, res, cbLogin);
});

and this declaration will create a Node.js Service Operation:

var admin = express();

app.use('/admin', admin);
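
The mounted sub-application typically declares its own routes, which are then served under the /admin prefix; a minimal sketch (the path and handler are assumptions, not taken from the extension documentation):

// Hypothetical route declared on the mounted sub-application;
// it is reachable as GET /admin/users
admin.get('/users', function (req, res) {
    res.send('admin users');
});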

Hapi.js framework


Create a server - index.js:

const Hapi = require('hapi');

// Create Server
const server = new Hapi.Server();

Routes: create a route for server:

server.route([     
	{
        method: 'GET',
        path: '/api/directors/{id}',
        handler: api.directors.get,
        config: {
            tags: ['api'],
            description: 'Get one director by id',
            notes: 'Get one director by id',
            validate: {
                params: {
                    id: Joi.number().required()
                }
            },
            cors: {
                origin: ['*']
            }
        }
    }
]);
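
For completeness, a hapi v16-style sketch of binding and starting the server once the route above is declared (the port number is an assumption):

// Bind the server to a port and start it (hapi v16 API assumed)
server.connection({ port: 3000 });

server.start((err) => {
    if (err) {
        throw err;
    }
    console.log('Server running at:', server.info.uri);
});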

Sails.js framework


Create a server: app.js.

...
  // Start server
  sails.lift(rc('sails'));
...


Routes are configured in config/routes.js:

...
'GET /site/:idSite' : {controller: "Site", action: "getSite", rel: RelServices.REL_ENUM.GET_VIEWED_SITE},
...
'PUT /alert' : {controller: "Alert", action: "putAlert", rel: RelServices.REL_ENUM.PUT_ALERT, profile: ProfileServices.PROFILE_ENUM.OPERER},
...

Controller actions:

SiteController.js
...
self.getSite = function (req, res) {
  ...
  var promise = Site.findOne({
    idSite: idSite
  });
  ...
};


AlertController.js
...
self.putAlert = function (req, res) {
  ...
  var promise = Alert.findOne({
    alertId: alertId
  });
  ...
};

Model definition:

Site.js:
...
self.connection = 'postgresqlServer';

self.tableName = 'T_SITE';


self.attributes = {
...
}
...
Alert.js
...
self.connection = 'postgresqlServer';

self.tableName = 'T_ALERT';


self.attributes = {
...
}
...
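
The 'postgresqlServer' connection referenced by both models would typically be declared in config/connections.js; a minimal sketch (adapter, host and credentials are assumptions):

// config/connections.js - hypothetical connection definition
module.exports.connections = {
  postgresqlServer: {
    adapter: 'sails-postgresql',
    host: 'localhost',
    port: 5432,
    user: 'sails',
    password: 'secret',
    database: 'mydb'
  }
};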

Transaction from get operation method to database when using SQL analyzer:

Loopback framework


Create webservice from Express API

The LoopBack app object extends Express and supports Express middleware, so a web service can be defined directly with the Express API:

var loopback = require('loopback');
var app = loopback();

// Create get method
app.get('/', function(req, res) {
  res.send('hello world');
});

app.listen(3000);

Create webservice from model

Model todo.js:

module.exports = function(Todo) {
  Todo.stats = function(filter, cb) {
  ...
  };
  ...
  Todo.remoteMethod('stats', {
    accepts: {arg: 'filter', type: 'object'},
    returns: {arg: 'stats', type: 'object'},
    http: { path: '/stats' }
  }, Todo.stats);
  ...
}

Exposing models over REST: https://loopback.io/doc/en/lb3/Exposing-models-over-REST.html. LoopBack models automatically have a standard set of HTTP endpoints that provide REST APIs.

Example: todo.json:

{
  "name": "Todo",
  "base": "PersistedModel",
  "strict": "throw",
  "persisteUndefinedAsNull": true,
  "trackChanges": true,
  "properties": {
    "id": {
      "id": true,
      "type": "string",
      "defaultFn": "guid"
    },
    "title": "string",
    "completed": {
      "type": "boolean",
      "default": false
    },
    "created": {
      "type": "number"
    }
  }
}
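
For the model to be exposed over REST it also needs an entry in server/model-config.json; a minimal sketch (the "db" datasource name is an assumption):

{
  "Todo": {
    "dataSource": "db",
    "public": true
  }
}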

 

Koa.js framework


Webservice application from Koa:

var koa = require('koa'),
  router = require('koa-router'),
  cors = require('koa-cors'),
  json = require('koa-json'),
  errorHandler = require('koa-onerror'),
  bodyParser = require('koa-body')(),
  app = koa(),
  routes = new router();


function render(controller, action) {
	...
}

/* routes start */

routes.get(  '/todos',                     render('todos',     'all'));
routes.post( '/todos',        bodyParser,  render('todos',     'create'));
routes.get(  '/todos/:id',                 render('todos',     'show'));
routes.del(  '/todos/:id',                 render('todos',     'delete'));
routes.patch('/todos/:id',    bodyParser,  render('todos',     'update'));
routes.del(  '/todos',                     render('todos',     'deleteAll'));


app.use(require('./app/middlewares/request_logger')());
app.use(json());
app.use(cors({methods: ['GET', 'PUT', 'POST', 'PATCH', 'DELETE']}));
app.use(routes.middleware());

errorHandler(app);

app.listen(Number(process.env.PORT || 9000));
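
The render helper is elided in the sample above; a hypothetical sketch of such a helper using Koa 1.x generator middleware (the controllers layout and promise-returning actions are assumptions):

// Hypothetical implementation of the render helper: it loads a controller
// module and returns a generator middleware delegating to the requested
// action, which is assumed to return a promise.
function render(controller, action) {
  var controllers = require('./app/controllers');
  return function *(next) {
    this.body = yield controllers[controller][action](this);
  };
}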

Knex.js framework


Knex.js is a "batteries included" SQL query builder for Postgres, MSSQL, MySQL, MariaDB, SQLite3, Oracle, and Amazon Redshift designed to be flexible, portable, and fun to use. We do not support the creation of tables for this framework. Example:

Define database config:

const Config = require('../config');

module.exports = {
  client: 'postgresql',
  connection: Config.DATABASE_URL || {
    database: Config.DB_NAME,
    host: Config.DB_HOST,
    username: Config.DB_USER,
    password: Config.DB_PASSWORD
  }
};

Create bookshelf from Knex and Bookshelf:

const DatabaseConfig = require('../../db');

const Bookshelf = require('bookshelf');
const Knex      = require('knex')(DatabaseConfig);

module.exports = Bookshelf(Knex);


Add model for bookshelf:

const Bookshelf = require('../util/bookshelf');

const Config = require('../../config');

module.exports = Bookshelf.Model.extend({
  tableName: 'todos',
  url: function () {
    return `${Config.DOMAIN}/${this.get('id')}`;
  },
  serialize: function () {
    return {
      id: this.get('id'),
      title: this.get('title'),
      url: this.url(),
      completed: this.get('completed'),
      order: this.get('order'),
      object: 'todo'
    };
  }
});

Define method for model:

const Todo = require('../../models/todo');

exports.deleteAll = () => {
  // hack to get around Bookshelf's lacking destroyAll
  return new Todo().where('id', '!=', 0).destroy()
  .then(() => []);
};

Access model from webservice method:

exports.register = (server, options, next) => {

  server.route([{
    method: 'DELETE',
    path: '/',
    config: {
      handler: (request, reply) => {
        reply(Controller.deleteAll());
      }
    }
  }]);

  // Signal hapi that the plugin registration is complete
  next();
};

If the table is not found among the external tables, an unknown database table object will be created:

Node.js MQTT


Controller.js defines a publisher sending messages:

function openGarageDoor () {
  // can only open door if we're connected to mqtt and door isn't already open
  if (connected && garageState !== 'open') {
    // Ask the door to open
    client.publish('garage/open', 'true')
  }
}


function closeGarageDoor () {
  // can only close door if we're connected to mqtt and door isn't already closed
  if (connected && garageState !== 'closed') {
    // Ask the door to close
    client.publish('garage/close', 'true')
  }
}

garage.js defines a subscriber as:

client.on('connect', () => {
  client.subscribe('garage/open')
  client.subscribe('garage/close')

  // Inform controllers that garage is connected
  client.publish('garage/connected', 'true')
  sendStateUpdate()
})


client.on('message', (topic, message) => {
  console.log('received message %s %s', topic, message)
  switch (topic) {
    case 'garage/open':
      return handleOpenRequest(message)
    case 'garage/close':
      return handleCloseRequest(message)
  }
})
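
In both files, client is an MQTT client; a minimal sketch of that shared setup with the mqtt package (the broker URL is an assumption):

// Hypothetical client setup shared by Controller.js and garage.js
var mqtt = require('mqtt')
var client = mqtt.connect('mqtt://broker.example.com')

var connected = false
client.on('connect', () => { connected = true })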

Node.js Seneca Microservice


Create a service:

web-app.js:

var seneca = require('seneca')()

seneca
  .use('user')
  .use('auth')
  .use('../lib/api.js')
  .client({port:10202,pin:{role:'offer',cmd:'*'}})
  .client({port:10201,pin:{role:'user',cmd:'*'}})


var app = express()

app.use( bodyParser.json() )
app.use( seneca.export('web') )
app.use( express.static('./public') )

app.listen(3000)

offer-service:

require('seneca')()
  .use('../lib/offer')
  .listen(10202)
  .ready(function(){
    this.act({role:'offer',cmd:'provide'},console.log)
  })

Define the api.js:

module.exports = function( options ) {
  var seneca = this
  var plugin = 'api'

  seneca.add( {role:plugin, end:'offer'}, end_offer)

  function end_offer( args, done ) {
    var user = args.req$.seneca.user || {}

    this.act('role:offer,cmd:provide',{nick:user.nick},done)
  }

  seneca.act({role:'web', use:{
    prefix:'/api/',
    pin:{role:plugin,end:'*'},
    map:{
      'offer': { GET:true },
    }
  }})

  return {name:plugin};
}

offer.js:

module.exports = function( options ) {
  var seneca = this
  var plugin = 'offer'


  seneca.add( {role:plugin, cmd:'provide'}, cmd_provide)
  

  function cmd_provide( args, done ) {
    if( args.nick ) return done(null,{product:'Apple'});

    return done(null,{product:'Orange'});
  }


  return {name:plugin};
}

When a service sends an action (seneca.act()):


Webservice RestAPI:

...

    seneca.act('role:web',{use:{
      prefix:'/product',
      pin:'role:api,product:*',
      startware: verify_token,
      map:{
        star: { 
          GET:true,
          alias:'/:id/star' 
        },
        handle_star:{
          PUT:true,
          DELETE:true,
          POST:true,
          alias:'/:id/star'
        }
      }
...


Supported Node.js versions

Version    Support    Comment
v0.x       (error)    No longer supported
v4.x       (tick)     LTS
v5.x       (tick)     Based on JavaScript ES6
v6.x       (tick)     Based on JavaScript ES6
v7.x       (tick)     Based on JavaScript ES6
v8.x       (tick)
v9.x       (tick)
v10.x      (tick)
v11.x      (tick)
v12.x      (tick)
v13.x      (tick)
v14.x      (tick)

Function Point, Quality and Sizing support

This extension provides the following support:

  • Function Points (transactions): a green tick indicates that OMG Function Point counting and Transaction Risk Index are supported
  • Quality and Sizing: a green tick indicates that CAST can measure size and that a minimum set of Quality Rules exist
Function Points (transactions)    (tick)
Quality and Sizing                (tick)

Comparison with existing support for JavaScript in CAST AIP

CAST AIP has provided support for analyzing JavaScript via its JEE and .NET analyzers (provided out of the box in CAST AIP) for some time now. The HTML5/JavaScript extension (on which the Node.js extension depends) also provides support for JavaScript, but with a focus on web applications. CAST highly recommends that you use this extension if your Application contains JavaScript, and more specifically if you want to analyze a web application; however, you should take note of the following:

  • You should ensure that you configure the extension to NOT analyze the back end web client part of a .NET or JEE application.
  • You should ensure that you configure the extension to ONLY analyze the front end web application built with the HTML5/JavaScript that communicates with the back end web client part of a .NET or JEE application.
  • If the back end web client part of a .NET or JEE application is analyzed with the Node.js extension and with the native .NET/JEE analyzers, then your results will reflect this - there will be duplicate objects and links (i.e. from the analyzer and from the extension) therefore impacting results and creating erroneous Function Point data.

In CAST AIP 8.3.x support for analyzing JavaScript has been withdrawn from the JEE and .NET analyzers.

CAST AIP compatibility

This extension is compatible with:

CAST AIP release                        Supported
8.3.x                                   (tick)
8.2.x                                   (tick)
8.1.x                                   (tick)
8.0.x                                   (tick)
7.3.4 and all higher 7.3.x releases     (tick)

Supported DBMS servers

DBMS                      Supported?
CSS                       (tick)
Oracle                    (tick)
Microsoft SQL Server      (error)

Prerequisites

(tick) An installation of any compatible release of CAST AIP (see table above)

Dependencies with other extensions

Some CAST extensions require the presence of other CAST extensions in order to function correctly. The Node.js extension requires that the following other CAST extensions are also installed:

  • HTML5 and JavaScript

Note that when using the CAST Extension Downloader to download the extension and the Manage Extensions interface in CAST Server Manager to install the extension, any dependent extensions are automatically downloaded and installed for you. You do not need to do anything.

Download and installation instructions

Please see:

The latest release status of this extension can be seen when downloading it from the CAST Extend server.

Packaging, delivering and analyzing your source code

Once the extension is downloaded and installed, you can package your source code and run an analysis. The process of packaging, delivering and analyzing your source code is described below:


Packaging and delivery

Note that the Node.js extension does not contain any CAST Delivery Manager Tool discoverers or extractors, therefore no "Node.js" projects will be detected. However, the Web Files Discoverer extension will be automatically installed (it is a "shipped" extension, which means it is delivered with AIP Core) and will automatically detect projects as HTML5 if specific files are delivered, therefore ensuring that Analysis Units are created for your source code.

Using CAST Console

Using CAST Management Studio

  • create a new Version
  • create a new Package for your Node.js source code using the Files on your file system option:

  • Define the root folder of your Application source code:

  • Run the Package action
  • Before delivering the source code, check the packaging results:
Without the Web Files Discoverer

If you are not using the Web Files Discoverer, the following will occur:

  • the CAST Delivery Manager Tool will not find any "projects" related to the Node.js application source code - this is the expected behaviour. However, if your Node.js related source code is part of a larger application (for example a JEE application), then other projects may be found during the package action (click to enlarge):

With the Web Files Discoverer

If you are using the Web Files Discoverer, the following will occur:

  • the CAST Delivery Manager Tool will automatically detect "HTML5 file projects" (see Web Files Discoverer for more technical information about how the discoverer works) related to the Node.js application source code. In addition, if your Node.js related source code is part of a larger application (for example a JEE application), then other projects may also be found during the package action (click to enlarge):

  • Deliver the Version

Analyzing

Using CAST Console

AIP Console exposes the technology configuration options once a version has been accepted/imported, or an analysis has been run. Click Universal Technology (3) in the Config (1) > Analysis (2) tab to display the available options for your Node.js source code:

Then choose the relevant Analysis Unit (1) to view the configuration:

Using the CAST Management Studio



  • Accept and deploy the Version in the CAST Management Studio.
Without the Web Files Discoverer

If you are not using the Web Files Discoverer, the following will occur:

  • No Analysis Units will be created automatically relating to the Node.js source code - this is the expected behaviour. However, if your Node.js related source code is part of a larger application (for example a JEE application), then other Analysis Units may be created automatically:

  • In the Current Version tab, add a new Analysis Unit specifically for your Node.js source code, selecting the Add new Universal Analysis Unit option:

  • Edit the new Analysis Unit and configure in the Source Settings tab:
    • a name for the Analysis Unit
    • ensure you tick the HTML5/JavaScript option (the Node.js extension depends on the HTML5 and JavaScript extension - and therefore the Universal Analyzer language for the Node.js extension is set as HTML5/JavaScript)
    • define the location of the deployed Node.js source code (the CAST Management Studio will locate this automatically in the Deployment folder):

  • Run a test analysis on the Analysis Unit before you generate a new snapshot.
With the Web Files Discoverer

If you are using the Web Files Discoverer, the following will occur:

  • "HTML5" Analysis Units will be created automatically (see Web Files Discoverer for more technical information about how the discoverer works) related to the Node.js application source code. In addition, if your Node.js related source code is part of a larger application (for example a JEE application), then other Analysis Units may also be created:

  • There is nothing further to do, you can now run a test analysis on the Analysis Unit before you generate a new snapshot.

Analysis warning and error messages


Message ID            NODEJS-001
Message Type          Warning
Logged during         Analysis
Impact                An internal issue occurred when parsing a statement in a file; a part of the file was badly analyzed.
Remediation Action    Contact CAST Technical Support

What results can you expect?

Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):

Node.js application with MongoDB data storage exposing web services

Objects

The following specific objects are displayed in CAST Enlighten:

  • Node.js Application
  • Node.js Port
  • Node.js Delete Operation Service
  • Node.js Get Operation Service
  • Node.js Post Operation Service
  • Node.js Put Operation Service
  • Node.js Service
  • Node.js Express Use
  • Node.js Express Controller
  • Node.js Get Http Request Service
  • Node.js Post Http Request Service
  • Node.js Put Http Request Service
  • Node.js Delete Http Request Service
  • Node.js Unknown Database
  • Node.js Collection
  • Node.js Memcached Connection
  • Node.js Memcached Value
  • Node.js Call to Java Program
  • Node.js Call to Generic Program
  • Node.js Restify Get Operation
  • Node.js Restify Post Operation
  • Node.js Restify Put Operation
  • Node.js Restify Delete Operation
  • Node.js AWS SQS Publisher
  • Node.js AWS SQS Receiver
  • Node.js AWS SQS Unknown Publisher
  • Node.js AWS SQS Unknown Receiver
  • Node.js S3 Region
  • Node.js S3 Bucket
  • Node.js AWS Lambda Get Operation Service
  • Node.js AWS Lambda Post Operation Service
  • Node.js AWS Lambda Put Operation Service
  • Node.js AWS Lambda Delete Operation Service
  • Node.js AWS Lambda Any Operation Service
  • Node.js AWS Lambda Function
  • NodeJS Unknown Database Table

Node.js Ecosystem

Node.js comes with numerous libraries and frameworks bringing data access, web service calls and microservices architectures. The following list contains all supported libraries:

Library                Comment                                     Data Access    Web Service
AWS.DynamoDB           Amazon database access                      (tick)
AWS.S3                 Amazon storage service                      (tick)
AWS.Lambda             Amazon routing solution                                    (tick)
CosmosDB               Microsoft Azure NoSQL Database solution     (tick)
Couchdb                Couchdb access                              (tick)
Couchdb-nano           Couchdb access                              (tick)
elasticsearch          Open-source search engine                   (tick)
Express                Node.js application framework                              (tick)
Hapi                   Node.js application framework               (tick)         (tick)
Knex                   Node.js SQL query builder                   (tick)
Koa                    Node.js application framework                              (tick)
Loopback               Node.js application framework               (tick)         (tick)
Marklogic              Marklogic access                            (tick)
Memcached              Storage framework                           (tick)
Node-mongodb-native    MongoDB access                              (tick)
Mongo-client           MongoDB access                              (tick)
Mongoose               MongoDB access                              (tick)
my_connection          MySQL access                                (tick)
Node-couchdb           Couchdb access                              (tick)
oracledb               Oracle Database access                      (tick)
pg                     PostgreSQL access                           (tick)
Sails                  Node.js application framework               (tick)         (tick)

External link behavior

Behaviour is different depending on the version of CAST AIP you are using the extension with:

  • From 7.3.6, SQL queries are sent to the external links exactly like standard CAST AIP analyzers.
  • From 7.3.4 and before 7.3.6, a degraded mode takes place: the Node.js extension analyzes the FROM clause to retrieve table names, then sends only the table names to external links (see the sketch after this list).
  • For all versions, if no links are found via external links, unresolved objects are created (with type CAST_NodeJS_Unknown_Database_Table).
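
For illustration, a sketch of a query affected by this behaviour (the connection string and the orders table are assumptions):

var pg = require('pg');

var client = new pg.Client('pg://operator:secret@localhost:2280/postgres');
client.connect();

// From 7.3.6 the full query text is sent to external links; in degraded mode
// (7.3.4 up to 7.3.6) only the table name "orders" from the FROM clause is sent.
client.query('SELECT id, total FROM orders WHERE total > 100', function (err, result) {
    if (err) { console.error(err); return; }
    console.log(result.rows);
});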

Connector per RDBMS Vendor

Oracle "oracledb" connector

Connector "oracledb"
var oracledb = require('oracledb');
connection = oracledb.getConnection(
  {
    user          : "hr",
    password      : "welcome",
    connectString : "localhost/XE"
  }
);
connection.execute(
      "SELECT department_id, department_name FROM departments WHERE department_id < 70",
      function(err, result)
      {
        if (err) { console.error(err); return; }
        console.log(result.rows);
      }
  );

MS SQL "node-sqlserver" and "mssql" connectors

Connector "node-sqlserver"
var sql = require('node-sqlserver');
//
var connStr = "Driver={SQL Server Native Client 11.0};Server=myySqlDb,1433;Database=DB;UID=Henry;PWD=cat;";
var query = "SELECT * FROM GAData WHERE TestID = 17";
sql.open(connStr, function(err,conn){
    if(err){
        return console.error("Could not connect to sql: ", err);
    }
    conn.queryRaw("SELECT TOP 10 FirstName, LastName FROM authors", function (err, results) {
        if (err) {
            console.log("Error running query!");
            return;
        }
        for (var i = 0; i < results.rows.length; i++) {
            console.log("FirstName: " + results.rows[i][0] + " LastName: " + results.rows[i][1]);
        }
    });
});
var match = "%crombie%";
sql.query(connStr, "SELECT FirstName, LastName FROM titles WHERE LastName LIKE ?", [match], function (err, results) { 
    for (var i = 0; i < results.length; i++) {
        console.log("FirstName: " + results[i].FirstName + " LastName: " + results[i].LastName);
    }
});
Connector "mssql"
var sql = require('mssql');
var config = {
    user: '...',
    password: '...',
    server: 'localhost', // You can use 'localhost\\instance' to connect to named instance 
    database: '...',
     
    options: {
        encrypt: true // Use this if you're on Windows Azure 
    }
}
  
var connection = new sql.Connection(config, function(err) {
    // ... error checks 
     
    // Query 
     
    var request = new sql.Request(connection); // or: var request = connection.request(); 
    request.query('select * from authors', function(err, recordset) {
        // ... error checks 
         
        console.dir(recordset);
    });
     
    // Stored Procedure 
     
    var request = new sql.Request(connection);
    request.input('input_parameter', sql.Int, 10);
    request.output('output_parameter', sql.VarChar(50));
    request.execute('procedure_name', function(err, recordsets, returnValue) {
        // ... error checks 
         
        console.dir(recordsets);
    });
     
});

PostgreSQL "pg" connector

Connector "pg"
var pg = require("pg");
var conString = "pg://operator:CastAIP@localhost:2280/postgres";
var client = new pg.Client(conString);
client.connect();
var querySchemas = client.query("select nspname from pg_catalog.pg_namespace");
querySchemas.on("row", function (row, result) {
    "use strict";
    result.addRow(row);
});
querySchemas.on("end", function (result) {
    "use strict";
    console.log(result.rows);
    client.end();
});

MySQL "my_connection" connector

Connector "my_connection"
var connection = require("my_connection");
connection.query('my_url', 
			function result_getCatLogDetails(getCatLogDetails_err, getCatLogDetails_rows, 
			getCatLogDetails_fields) {
		
				if (getCatLogDetails_err) {
			        logContent += '|ERROR'+";";
					logContent += getCatLogDetails_err.message+";";
					utils.logAppDetails(logContent);
			        deferred.reject(new Error(getCatLogDetails_err));
			    } else {
			        deferred.resolve(getCatLogDetails_rows);
			    }
			});

Connector per NoSQL Vendor

Even though we don't have a NoSQL server-side representation, we create a client-side representation based on the API access. The Node.js analyzer will create links from JavaScript functions to NoSQL "Database" or "Table" equivalents as follows:
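
For example, a typical MongoDB access through Mongoose; a minimal sketch (connection string, model and query are assumptions) in which the collection behind the model would be represented as a Node.js Collection object:

var mongoose = require('mongoose');

// Hypothetical connection and model
mongoose.connect('mongodb://localhost/garage');
var Car = mongoose.model('Car', new mongoose.Schema({ plate: String }));

// Functions querying the model are linked to the corresponding collection
function findCar(plate, cb) {
    Car.findOne({ plate: plate }, cb);
}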

Amazon Web Services (AWS)

Support for lambda


Behavior explanation

The Node.js extension only creates objects for AWS Lambda functions with a "nodejs-xxx" runtime; as such, the "meanweather" lambda in the following code sample is not created because its runtime is "java".

Code samples

This example comes from https://github.com/zanon-io/aws-serverless-demo. In serverless.yml:

service: my-serverless-demo
provider:
  name: aws
  runtime: nodejs4.3
  region: us-east-1
  iamRoleStatements:
    - Effect: "Allow"
      Action:
        - 'sdb:Select'
      Resource: "arn:aws:sdb:${self:provider.region}:*:domain/Weather"
functions:
  weather:
    handler: handler.currentTemperature
    events:
      - http: GET weather/temperature
      - http: POST weather/temperature
    memorySize: 128
    timeout: 10
  meanweather:
    handler: handler.meanTemperature
    events:
      - http:
          path: weather/temperature/mean
          method: get
    memorySize: 128
    timeout: 10
    runtime: java

In handler.js:

module.exports.currentTemperature = (event, context, callback) => {

  const response = {
    statusCode: 200,
    body: JSON.stringify({
      temperature: 30,
      locationId: event.id
    })
  };

  callback(null, response);
};

In index.html:

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8">
    <title>Weather Info</title>
    <link href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css" rel="stylesheet">
    <style TYPE="text/css">
      input[type=button] {
        margin: 10px 0px;
      }
      p {
        margin: 10px 0px;
      }
    </style>
  </head>
  <body>
    <div class="container">
      <div class="row">
        <div class="col-xs-offset-4 col-xs-4 col-sm-offset-4 col-sm-3 col-md-offset-5 col-md-2 pagination-centered text-center">
          <h3>Daily Weather</h3>
          <input id="btn-show" type="button" class="btn btn-primary" value='Show Current'>
        </div>
      </div>
      <div class="row">
        <div class="col-xs-offset-4 col-xs-4 col-sm-offset-4 col-sm-3 col-md-offset-5 col-md-2">
          <p>Value: <span id="weather-value"></span></p>
          <a href="http://zanon.io/posts/building-serverless-websites-on-aws-tutorial"><p>source</p></a>
        </div>
      </div>
    </div>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.0/jquery.min.js"></script>
    <script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js"></script>
    <script>
      $(document).ready(function() {

        $('#btn-show').on('click', function() {

            $.ajax({
              url: "https://8w8ctjxkeh.execute-api.us-east-1.amazonaws.com/dev/weather/temperature?id=5", // Here is the GET request
              success: function(json) {
                $("#weather-value").text(json.temperature + ' ºC').fadeOut('slow').fadeIn('slow');
              }
            });
        });
      });
    </script>
  </body>
</html>

What results can you expect?

Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):



Support for SQS

Link Type     Function
callLink
  • sendMessage
  • sendMessageBatch
  • receiveMessage

Code samples

This code will publish a message into the "SQS_QUEUE_URL" queue:

// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region 
AWS.config.update({region: 'REGION'});

// Create an SQS service object
var sqs = new AWS.SQS({apiVersion: '2012-11-05'});

var params = {
   // Remove DelaySeconds parameter and value for FIFO queues
  DelaySeconds: 10,
  MessageAttributes: {
    "Title": {
      DataType: "String",
      StringValue: "The Whistler"
    },
    "Author": {
      DataType: "String",
      StringValue: "John Grisham"
    },
    "WeeksOn": {
      DataType: "Number",
      StringValue: "6"
    }
  },
  MessageBody: "Information about current NY Times fiction bestseller for week of 12/11/2016.",
  // MessageDeduplicationId: "TheWhistler",  // Required for FIFO queues
  // MessageGroupId: "Group1",  // Required for FIFO queues
  QueueUrl: "SQS_QUEUE_URL"
};

sqs.sendMessage(params, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data.MessageId);
  }
});

This code will receive a message from the queue "SQS_QUEUE_URL":

// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region 
AWS.config.update({region: 'REGION'});

// Create an SQS service object
var sqs = new AWS.SQS({apiVersion: '2012-11-05'});

var params = {
  QueueUrl: "SQS_QUEUE_URL",
  MaxNumberOfMessages: 1,
  VisibilityTimeout: 20,
  WaitTimeSeconds: 0
};

sqs.receiveMessage(params, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else if (data.Messages) {
    console.log("Received", data.Messages[0].Body);
  }
});

What results can you expect?

Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):


When the evaluation of the queue name fails, a Node.js AWS SQS Unknown Publisher (or Receiver) will be created.

Support for AWS S3


Links

Link Type     Function
No Link
  • createBucket
callLink
  • createMultipartUpload

  • createPresignedPost

  • abortMultipartUpload

  • completeMultipartUpload

  • deleteBucketAnalyticsConfiguration

  • deleteBucketCors

  • deleteBucketEncryption

  • deleteBucketInventoryConfiguration

  • deleteBucketLifecycle

  • deleteBucketMetricsConfiguration

  • deleteBucketPolicy

  • deleteBucketReplication

  • deleteBucketTagging

  • deleteBucketWebsite

  • deleteObjectTagging

  • deletePublicAccessBlock

  • getBucketAccelerateConfiguration

  • getBucketAcl

  • getBucketAnalyticsConfiguration

  • getBucketCors

  • getBucketEncryption

  • getBucketInventoryConfiguration

  • getBucketLifecycle

  • getBucketLifecycleConfiguration

  • getBucketLocation

  • getBucketLogging

  • getBucketMetricsConfiguration

  • getBucketNotification

  • getBucketNotificationConfiguration

  • getBucketPolicy

  • getBucketPolicyStatus

  • getBucketReplication

  • getBucketTagging

  • getBucketVersioning

  • getBucketWebsite

  • getObjectAcl

  • getObjectLegalHold

  • getObjectLockConfiguration

  • getObjectRetention

  • getObjectTagging

  • getPublicAccessBlock

  • getSignedUrl

  • listBuckets
  • listBucketAnalyticsConfigurations

  • listBucketInventoryConfigurations

  • listBucketMetricsConfigurations

  • listMultipartUploads

  • listObjectVersions

  • listParts

  • putBucketLogging
  • putBucketAnalyticsConfiguration
  • putBucketLifecycleConfiguration

  • putBucketMetricsConfiguration

  • putBucketNotification

  • putBucketNotificationConfiguration

  • putBucketPolicy

  • putBucketReplication

  • putBucketRequestPayment

  • putBucketTagging

  • putBucketVersioning

  • putObjectAcl

  • putObjectLegalHold

  • putObjectLockConfiguration

  • putObjectRetention

  • putObjectTagging

  • putPublicAccessBlock

  • putBucketAccelerateConfiguration

  • putBucketAcl

  • putBucketCors

  • putBucketEncryption

  • putBucketInventoryConfiguration

  • putBucketLifecycle

  • upload

  • uploadPart

  • uploadPartCopy

useInsertLink
  • putObject
useDeleteLink
  • deleteBucket
  • deleteObject

  • deleteObjects

useSelectLink
  • getObject
  • getObjectTorrent
  • listObjects

  • listObjectsV2

useUpdateLink
  • putBucketLogging
  • putBucketAnalyticsConfiguration

Code samples

This code will create an S3 Bucket named "MyBucket" on an AWS server in region "REGION" and put an object into it:

// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region 
AWS.config.update({region: 'REGION'});

// Create S3 service object
s3 = new AWS.S3({apiVersion: '2006-03-01'});

// Create the parameters for calling createBucket
var bucketParams = {
  Bucket : "MyBucket",
  ACL : 'public-read'
};

// call S3 to create the bucket
s3.createBucket(bucketParams, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data.Location);
  }
});

params = {
	// ...
    Bucket: "MyBucket"
};
s3.putObject(params, function(err, data) {
    if (err) console.log(err, err.stack); // an error occurred
    else     console.log(data);           // successful response
});
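
For comparison, reading the object back with getObject is one of the calls listed under useSelectLink above; a minimal sketch (the key name is an assumption):

// Read the object back from the bucket
s3.getObject({ Bucket: "MyBucket", Key: "myKey" }, function(err, data) {
    if (err) console.log("Error", err);
    else     console.log("Success", data.ContentLength);
});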

What results can you expect?

Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):

Analysis of the code sample



Known limitations for AWS support

  • The use of AWS.SQS with promises is not supported. For instance, no link would be created between the receiver and the handler function defined in the .then() call in the following source code: 
sqs.receiveMessage(params).promise().then( () => {});
  • The use of AWS.SQS with send() is not supported. For instance, no link would be created between the receiver and the handler function defined in the .send() call in the following source code: 
var request = sqs.receiveMessage(params);
request.send(() => {});

Call to Program

The Node.js extension supports calls to external programs made via the child_process module.

The fork() function is not handled, as its only purpose is to fork Node.js programs.

These declarations create a call to a Java program/JAR file:

const exec = require('child_process').exec;

exec('java -cp com.castsoftware.Archive -jar jarFile.jar', (e, stdout, stderr) => {
    if (e instanceof Error) {
        console.error(e);
        throw e;
    }

    console.log('stdout ', stdout);
    console.log('stderr ', stderr);
});

const cp = require('child_process');
const class_name = 'com.castsoftware.Foo'

function call_foo(req, resp) {
    const args = [
        '-cp',
        '/bin',
        class_name
    ];
    const proc = cp.spawn('java', args);
}

This declaration creates a call to a Python program:

const execFile = require('child_process').execFile;
const python_file = 'app.py'

const child = execFile('python', [python_file], (error, stdout, stderr) => {

    if (error) {
        console.error('stderr', stderr);
        throw error;
    }
    console.log('stdout', stdout);
});

Restify

The Node.js extension supports routing using the restify module.

The following is an example of an application using restify to handle some URIs:

var restify = require('restify');

function send(req, res, next) {

  res.send('hello ' + req.params.name);
  next();
}

var server = restify.createServer();

server.post('/hello', function create(req, res, next) {

  res.send(201, Math.random().toString(36).substr(3, 8));
  return next();

});

server.put('/hello', send);
server.get('/hello/:name', function create(req, res, next) {

  res.send(201, Math.random().toString(36).substr(3, 8));
  return next();

},send);
server.head('/hello/:name', send);

server.del('hello/:name', function rm(req, res, next) {

  res.send(204);
  return next();

});

server.listen(8080, function() {

  console.log('%s listening at %s', server.name, server.url);

});

SQL Named Query 

When an SQL query is executed directly, a CAST SQL NamedQuery object will be created:

var oracledb = require('oracledb');

connection = oracledb.getConnection(
  {
    user          : "hr",
    password      : "welcome",
    connectString : "localhost/XE"
  }
);

oracledb.getConnection(
  {
    user          : "hr",
    password      : "welcome",
    connectString : "localhost/XE"
  },
  function(err, connection)
  {
    if (err) { console.error(err); return; }
    connection.execute(
      "SELECT department_id, department_name "
    + "FROM titles "
    + "WHERE department_id < 70 "
    + "ORDER BY department_id",
      function(err, result)
      {
        if (err) { console.error(err); return; }
        console.log(result.rows);
      });
  });

Structural Rules

The following structural rules are provided:

Known Limitations

In this section we list the most significant functional limitations that may affect the analysis of applications using Node.js:

  • With regard to the external links degraded mode, only statements with a FROM clause are correctly handled.
  • Node.js objects are only supported for the ES5 standard.
  • The analysis of AWS Lambda functions requires access to the serverless.yml file that maps routes and handlers together.
  • Known limitations for AWS support are detailed in the dedicated section above.