- Extension ID
- What's new?
- Description
- In what situation should you install this extension?
- Supported Node.js versions
- Node.js Ecosystem
- Function Point, Quality and Sizing support
- Comparison with existing support for JavaScript
- AIP Core compatibility
- Supported DBMS servers
- Prerequisites
- Dependencies with other extensions
- Download and installation instructions
- Packaging, delivering and analyzing your source code
- What results can you expect?
- Linking
- Known Limitations
Summary: This document provides basic information about the extension Node.js + Express support for Web applications.
Extension ID
com.castsoftware.nodejs
What's new?
Please see Node.js - 2.7 - Release Notes for more information.
Description
This extension provides support for Node.js. Node.js is a JavaScript runtime built on Chrome's V8 JavaScript engine. Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient.
In what situation should you install this extension?
Regarding Front-End to Back-End connections, the following cross-technology stacks are supported:
- iOS Front-End connected to Node.js/PostgreSQL Back-End
- iOS Front-End connected to Node.js/MSSQL Back-End
- AngularJS Front-End connected to Node.js/MongoDB Back-End
If your Web application contains Node.js source code and you want to view these object types and their links with other objects, then you should install this extension. The extension:
- creates a Node.js application object when an instance has been found
- creates Node.js operations which represent entry-points of web services
Node.js operations are called from client applications, using jQuery Ajax for example. The following frameworks are supported (see the Express sketch after this list):
Express framework
Hapi.js framework
Sails.js framework
Loopback framework
Koa.js framework
Knex.js framework
Node.js MQTT
Node.js Seneca Microservice
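For illustration, here is a minimal sketch of an Express entry point; the route path and handler are assumptions, and such a route would typically be represented as a Node.js Get Operation Service object:

```js
const express = require('express');
const app = express();

// A GET entry point: the extension would represent this route as a
// Node.js Get Operation Service (the /api/books path is a hypothetical example)
app.get('/api/books', (req, res) => {
    res.json([{ title: 'The Whistler' }]);
});

app.listen(3000);
```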
Supported Node.js versions
Version | Support | Comment |
---|---|---|
v0.x | No longer supported | |
v4.x | LTS | |
v5.x | Based on JavaScript ES6 | |
v6.x | Based on JavaScript ES6 | |
v7.x | Based on JavaScript ES6 | |
v8.x | | |
v9.x | | |
v10.x | | |
v11.x | | |
v12.x | | |
v13.x | | |
v14.x | | |
v15.x | | |
v16.x | | |
v17.x | | |
Node.js Ecosystem
Node.js comes with numerous libraries and frameworks providing data access, web service calls and microservices architectures. The following libraries are supported:
Library | Comment | Data Access | Web Service | Messaging |
---|---|---|---|---|
AWS.DynamoDB | Amazon database access | |||
AWS.S3 | Amazon storage service | |||
AWS.SQS | Amazon messaging service | |||
AWS.Lambda | Amazon serverless compute service | |||
CosmosDB | Microsoft Azure NoSQL Database solution | |||
Couchdb | Couchdb access | |||
Couchdb-nano | Couchdb access | |||
elasticsearch | Open-source search engine | |||
Express | Node.js application framework | |||
Hapi | Node.js application framework | |||
Knex | Node.js SQL query builder | |||
Koa | Node.js application framework | |||
Loopback | Node.js application framework | |||
Marklogic | Marklogic access | |||
Memcached | Storage framework | |||
Node-mongodb-native | MongoDB access | |||
Mongo-client | MongoDB access | |||
Mongoose | MongoDB access | |||
MQTT | Messaging library | |||
mssql | SQL server | |||
my_connection | MySQL access | |||
myssql | Node.js module to manipulate MySQL databases | |||
Node-couchdb | Couchdb access | |||
node-sqlserver | SQL server | |||
oracledb | Oracle Database access | |||
pg | PostgreSQL access | |||
redis | Redis access | |||
Sails | Node.js application framework | |||
Seneca | Microservice toolkit |
Function Point, Quality and Sizing support
- Function Points (transactions): a green tick indicates that OMG Function Point counting and Transaction Risk Index are supported
- Quality and Sizing: a green tick indicates that CAST can measure size and that a minimum set of Quality Rules exist
Function Points (transactions) | |
---|---|
Quality and Sizing |
Comparison with existing support for JavaScript
CAST AIP has provided support for analyzing JavaScript via its JEE and .NET analyzers (provided out of the box in CAST AIP) for some time now. The HTML5/JavaScript extension (on which the Node.js extension depends) also provides support for JavaScript, but with a focus on web applications. CAST highly recommends that you use this extension if your Application contains JavaScript, and more specifically if you want to analyze a web application; however, you should take note of the following:
- You should ensure that you configure the extension NOT to analyze the back-end web client part of a .NET or JEE application.
- You should ensure that you configure the extension to ONLY analyze the front-end web application built with HTML5/JavaScript that communicates with the back-end web client part of a .NET or JEE application.
- If the back-end web client part of a .NET or JEE application is analyzed both with the Node.js extension and with the native .NET/JEE analyzers, your results will reflect this: there will be duplicate objects and links (one set from the analyzer and one from the extension), impacting results and creating erroneous Function Point data.
In CAST AIP ≥ 8.3.x, support for analyzing JavaScript has been withdrawn from the JEE and .NET analyzers.
AIP Core release | Supported |
---|---|
8.3.x |
Supported DBMS servers
DBMS | Supported? |
---|---|
CSS / PostgreSQL |
Prerequisites
An installation of any compatible release of AIP Core (see table above)
Dependencies with other extensions
Some CAST extensions require the presence of other CAST extensions in order to function correctly. The Node.js extension requires that the following other CAST extensions are also installed:
- HTML5/JavaScript
- Web services linker service (internal technical extension)
Download and installation instructions
The extension will be automatically downloaded and installed in CAST Console. You can manage the extension using the Application - Extensions interface:
Packaging, delivering and analyzing your source code
Once the extension is downloaded and installed, you can now package your source code and run an analysis. The process of packaging, delivering and analyzing your source code is described below:
What results can you expect?
Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):
Node.js application with MongoDB data storage exposing web services
Objects
The following specific objects are displayed in CAST Enlighten:
Icon | Description |
---|---|
Node.js Application | |
Node.js Port | |
Node.js Delete Operation Service | |
Node.js Get Operation Service | |
Node.js Post Operation Service | |
Node.js Put Operation Service | |
Node.js Service | |
Node.js Express Use | |
Node.js Express Controller | |
Node.js Get Http Request Service | |
Node.js Post Http Request Service | |
Node.js Put Http Request Service | |
Node.js Delete Http Request Service | |
Node.js Unknown Database | |
Node.js Collection | |
Node.js Memcached Connection | |
Node.js Memcached Value | |
Node.js Call to Java Program | |
Node.js Call to Generic Program | |
Node.js Restify Get Operation | |
Node.js Restify Post Operation | |
Node.js Restify Put Operation | |
Node.js Restify Delete Operation | |
Node.js AWS SQS Publisher | |
Node.js AWS SQS Receiver | |
Node.js AWS SQS Unknown Publisher | |
Node.js AWS SQS Unknown Receiver | |
Node.js AWS SNS Publisher | |
Node.js AWS SNS Subscriber | |
Node.js AWS SNS Unknown Publisher | |
Node.js AWS SNS Unknown Subscriber | |
Node.js AWS call to Lambda Function | |
Node.js AWS call to unknown Lambda Function | |
NodeJS Unknown Database Table |
External link behavior
Behaviour is different depending on the version of CAST AIP you are using the extension with:
- From 7.3.6, SQL queries are sent to the external links exactly like standard CAST AIP analyzers.
- From 7.3.4 up to (but not including) 7.3.6, a degraded mode applies: the Node.js extension analyzes the FROM clause to retrieve table names, then sends only the table names to external links (see the sketch below).
- For all versions, if no links are found via external links, unresolved objects are created (with type CAST_NodeJS_Unknown_Database_Table).
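For illustration, a minimal sketch of a query handled by the degraded mode, assuming the pg connector and a hypothetical orders table:

```js
var pg = require("pg");

var client = new pg.Client("pg://operator:CastAIP@localhost:2280/postgres");
client.connect();

// In degraded mode (AIP 7.3.4 up to 7.3.6), only the table name "orders"
// parsed from the FROM clause is sent to external links, not the full query
client.query("SELECT order_id, total FROM orders WHERE total > 100");
```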
Connector per RDBMS Vendor
Oracle "oracledb" connector
```js
var oracledb = require('oracledb');

oracledb.getConnection(
    { user: "hr", password: "welcome", connectString: "localhost/XE" },
    function (err, connection) {
        if (err) { console.error(err); return; }
        connection.execute(
            "SELECT department_id, department_name FROM departments WHERE department_id < 70",
            function (err, result) {
                if (err) { console.error(err); return; }
                console.log(result.rows);
            });
    });
```
MS SQL "node-sqlserver" and "mssql" connectors
```js
var sql = require('node-sqlserver');

var connStr = "Driver={SQL Server Native Client 11.0};Server=myySqlDb,1433;Database=DB;UID=Henry;PWD=cat;";
var query = "SELECT * FROM GAData WHERE TestID = 17";

sql.open(connStr, function (err, conn) {
    if (err) {
        return console.error("Could not connect to sql: ", err);
    }
    conn.queryRaw("SELECT TOP 10 FirstName, LastName FROM authors", function (err, results) {
        if (err) {
            console.log("Error running query!");
            return;
        }
        for (var i = 0; i < results.rows.length; i++) {
            console.log("FirstName: " + results.rows[i][0] + " LastName: " + results.rows[i][1]);
        }
    });
});

var match = "%crombie%";
sql.query(connStr, "SELECT FirstName, LastName FROM titles WHERE LastName LIKE ?", [match], function (err, results) {
    for (var i = 0; i < results.length; i++) {
        console.log("FirstName: " + results[i].FirstName + " LastName: " + results[i].LastName);
    }
});
```
```js
var sql = require('mssql');

var config = {
    user: '...',
    password: '...',
    server: 'localhost', // You can use 'localhost\\instance' to connect to named instance
    database: '...',
    options: {
        encrypt: true // Use this if you're on Windows Azure
    }
}

var connection = new sql.Connection(config, function(err) {
    // ... error checks

    // Query
    var request = new sql.Request(connection); // or: var request = connection.request();
    request.query('select * from authors', function(err, recordset) {
        // ... error checks
        console.dir(recordset);
    });

    // Stored Procedure
    var request = new sql.Request(connection);
    request.input('input_parameter', sql.Int, 10);
    request.output('output_parameter', sql.VarChar(50));
    request.execute('procedure_name', function(err, recordsets, returnValue) {
        // ... error checks
        console.dir(recordsets);
    });
});
```
PostgreSQL "pg" connector
```js
var pg = require("pg");

var conString = "pg://operator:CastAIP@localhost:2280/postgres";
var client = new pg.Client(conString);
client.connect();

var querySchemas = client.query("select nspname from pg_catalog.pg_namespace");

querySchemas.on("row", function (row, result) {
    "use strict";
    result.addRow(row);
});

querySchemas.on("end", function (result) {
    "use strict";
    console.log(result.rows);
    client.end();
});
```
MySQL "my_connection" connector
```js
var connection = require("my_connection");

connection.query('my_url', function result_getCatLogDetails(getCatLogDetails_err, getCatLogDetails_rows, getCatLogDetails_fields) {
    if (getCatLogDetails_err) {
        logContent += '|ERROR' + ";";
        logContent += getCatLogDetails_err.message + ";";
        utils.logAppDetails(logContent);
        deferred.reject(new Error(getCatLogDetails_err));
    } else {
        deferred.resolve(getCatLogDetails_rows);
    }
});
```
Connector per NoSQL Vendor
Even though there is no NoSQL server-side representation, a client-side representation is created based on the API access. The Node.js analyzer creates links from JavaScript functions to NoSQL "Database" or "Table" equivalents as follows (a Mongoose sketch is shown after the table below):
Azure Cosmos DB | See Azure Cosmos DB support for Node.js source code (for com.castsoftware.nodejs versions < 2.9). |
---|---|
CouchDB | See CouchDB support for Node.js source code. |
DynamoDB | See DynamoDB support for Node.js source code. |
Elasticsearch | See Elasticsearch support for Node.js source code. |
MarkLogic | See MarkLogic support for Node.js source code. |
Memcached | See Memcached support for Node.js source code. |
MongoDB "mongoose" | See MongoDB support for Node.js source code. |
Redis | See Redis support for Node.js source code. |
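For illustration, a minimal sketch assuming Mongoose and a hypothetical books collection; the analyzer would create a client-side collection equivalent and link the calling function to it:

```js
const mongoose = require('mongoose');

// Connect to a (hypothetical) MongoDB database
mongoose.connect('mongodb://localhost:27017/library');

// The model maps to the "books" collection; a client-side collection
// equivalent would be created for it
const Book = mongoose.model('Book', new mongoose.Schema({ title: String }));

function listBooks() {
    // A link would be created from this function to the collection object
    return Book.find({});
}
```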
Amazon Web Services (AWS)
Support for Lambda
Support for SQS
Links
Link Type | Function |
---|---|
callLink | |
Code samples
This code will publish a message into the "SQS_QUEUE_URL" queue:
```js
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region
AWS.config.update({region: 'REGION'});

// Create an SQS service object
var sqs = new AWS.SQS({apiVersion: '2012-11-05'});

var params = {
  // Remove DelaySeconds parameter and value for FIFO queues
  DelaySeconds: 10,
  MessageAttributes: {
    "Title": {
      DataType: "String",
      StringValue: "The Whistler"
    },
    "Author": {
      DataType: "String",
      StringValue: "John Grisham"
    },
    "WeeksOn": {
      DataType: "Number",
      StringValue: "6"
    }
  },
  MessageBody: "Information about current NY Times fiction bestseller for week of 12/11/2016.",
  // MessageDeduplicationId: "TheWhistler",  // Required for FIFO queues
  // MessageGroupId: "Group1",  // Required for FIFO queues
  QueueUrl: "SQS_QUEUE_URL"
};

sqs.sendMessage(params, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data.MessageId);
  }
});
```
This code will receive a message from the queue "SQS_QUEUE_URL":
```js
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region
AWS.config.update({region: 'REGION'});

// Create an SQS service object
var sqs = new AWS.SQS({apiVersion: '2012-11-05'});

var params = {
  QueueUrl: "SQS_QUEUE_URL",
  MaxNumberOfMessages: 1,
  WaitTimeSeconds: 0
};

sqs.receiveMessage(params, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data.Messages);
  }
});
```
What results can you expect?
Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):
When the evaluation of the queue name fails, a Node.js AWS SQS Unknown Publisher (or Receiver) will be created.
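For example, in the following minimal sketch (an assumption, not taken from the product documentation), the queue URL is computed at runtime, so its evaluation fails:

```js
var AWS = require('aws-sdk');
var sqs = new AWS.SQS({apiVersion: '2012-11-05'});

// The queue URL is only known at runtime, so the evaluation of the queue
// name fails and a Node.js AWS SQS Unknown Publisher object would be created
var queueUrl = process.env.QUEUE_URL + "-" + Date.now();
sqs.sendMessage({ QueueUrl: queueUrl, MessageBody: "hello" }, function (err, data) {
  if (err) console.log("Error", err);
});
```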
Support for AWS S3
Links
Link Type | Function |
---|---|
No Link | |
callLink | |
useInsertLink | |
useDeleteLink | |
useSelectLink | |
useUpdateLink | |
Code samples
This code will create an S3 bucket named "MyBucket" on an AWS server in region "REGION" and put an object in it:
```js
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region
AWS.config.update({region: 'REGION'});

// Create S3 service object
s3 = new AWS.S3({apiVersion: '2006-03-01'});

// Create the parameters for calling createBucket
var bucketParams = {
  Bucket : "MyBucket",
  ACL : 'public-read'
};

// call S3 to create the bucket
s3.createBucket(bucketParams, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data.Location);
  }
});

params = {
  // ...
  Bucket: "MyBucket"
};

s3.putObject(params, function(err, data) {
  if (err) console.log(err, err.stack); // an error occurred
  else     console.log(data);           // successful response
});
```
What results can you expect?
Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):
Analysis of the code sample
Known limitations for AWS support
- The use of AWS.SQS with promises is not supported. For instance, no link would be created between the receiver and the handler function defined in the .then() call in the following source code:

```js
sqs.receiveMessage(params).promise().then( () => {});
```

- The use of AWS.SQS with send() is not supported. For instance, no link would be created between the receiver and the handler function defined in the .send() call in the following source code:

```js
var request = sqs.receiveMessage(params);
request.send(() => {});
```
Support for SNS
The following APIs are supported:
For SDK V2
- AWS.SNS({apiVersion: '2010-03-31'}).publish(params)
- AWS.SNS({apiVersion: '2010-03-31'}).subscribe(params)
For the publish method, a NodeJS AWS SNS Publisher object is created; its name is that of the topic.
For the subscribe method, a NodeJS AWS SNS Subscriber object is created; its name is that of the topic. Then, for each supported protocol, an object is created with a callLink from the subscriber to that object. The supported protocols are the following:
protocol | object created | name of the object |
---|---|---|
email | NodeJS Email | an email (the email addresses are not evaluated) |
sms | NodeJS SMS | an SMS (the SMS numbers are not evaluated) |
http/https | NodeJS AWS Post HttpRequest service | the URL (evaluated from the endpoint) |
sqs | NodeJS AWS Simple Queue Service Publisher | the name of the queue (evaluated from the endpoint) |
lambda | NodeJS Call to AWS Lambda Function | the name of the lambda function (evaluated from the endpoint) |
The com.castsoftware.wbslinker extension will create a callLink between SNS Publishers and SNS Subscribers that have the same name.
Example
When analyzing the following source code:
```js
var AWS = require('aws-sdk');
// Set region
AWS.config.update({region: 'REGION'});

// Create promise and SNS service object
var sns = new AWS.SNS({apiVersion: '2010-03-31'})

function my_subscribe(params) {
    sns.subscribe(params, function (err, data) {
        if (err) console.log(err, err.stack); // an error occurred
        else console.log(data);               // successful response
    });
}

function my_publish(params) {
    sns.publish(params);
}

function foo() {
    let topicArn = "arn:aws:sns:eu-west-3:757025016730:testTopic";
    my_subscribe({Protocol: "EMAIL", TopicArn: topicArn, Endpoint: "EMAIL_ADDRESS"})
    my_subscribe({Protocol: "SMS", TopicArn: topicArn, Endpoint: "911"})
    my_subscribe({Protocol: "LAMBDA", TopicArn: topicArn, Endpoint: "arn:aws:lambda:eu-west-3:757025016730:testLambda"})
    my_subscribe({Protocol: "HTTP", TopicArn: topicArn, Endpoint: "http:/myapi.api.test.com"})
}

function bar() {
    const params2 = {
        TopicArn: "arn:aws:sns:eu-west-3:757025016730:testTopic",
        Message: "MESSAGE_TEXT"
    };
    my_publish(params2)
}
```
Linking
The extension com.castsoftware.wbslinker is responsible for matching NodeJS Call to AWS Lambda Function objects to Lambda Function objects such as Java AWS Lambda Function during application-level analysis.
Support for AWS XRay
aws-xray encapsulates AWS method calls in order to provide status and load information. However, the encapsulation did not allow the extension to provide objects and links. With support for AWS XRay starting in 2.6.0-beta4, these objects and links are now created.
Code samples
This code encapsulates the AWS SDK, then creates a DynamoDB instance and a DocumentClient instance:
```js
// Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: MIT-0

// default imports
const AWSXRay = require('aws-xray-sdk-core')
const AWS = AWSXRay.captureAWS(require('aws-sdk')) // Encapsulate AWS SDK
const { metricScope, Unit } = require("aws-embedded-metrics")
const DDB = new AWS.DynamoDB({ apiVersion: "2012-10-08" }) // use AWS as usual
const { v1: uuidv1 } = require('uuid');

// environment variables
const { TABLE_NAME, ENDPOINT_OVERRIDE, REGION } = process.env
const options = { region: REGION }
AWS.config.update({ region: REGION })

if (ENDPOINT_OVERRIDE !== "") {
    options.endpoint = ENDPOINT_OVERRIDE
}

const docClient = new AWS.DynamoDB.DocumentClient(options)

// response helper
const response = (statusCode, body, additionalHeaders) => ({
    statusCode,
    body: JSON.stringify(body),
    headers: { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*', ...additionalHeaders },
})

function isValidRequest(context, event) {
    return (event.body !== null)
}

function getCognitoUsername(event){
    let authHeader = event.requestContext.authorizer;
    if (authHeader !== null) {
        return authHeader.claims["cognito:username"];
    }
    return null;
}

function addRecord(event) {
    let usernameField = {
        "cognito-username": getCognitoUsername(event)
    }

    // auto generated date fields
    let d = new Date()
    let dISO = d.toISOString()
    let auto_fields = {
        "id": uuidv1(),
        "creation_date": dISO,
        "lastupdate_date": dISO
    }

    // merge the json objects
    let item_body = {...usernameField, ...auto_fields, ...JSON.parse(event.body) }
    console.log(item_body);

    // final params to DynamoDB
    const params = {
        TableName: TABLE_NAME,
        Item: item_body
    }

    return docClient.put(params)
}

// Lambda Handler
exports.addToDoItem = metricScope(metrics =>
    async (event, context, callback) => {
        metrics.setNamespace('TodoApp')
        metrics.putDimensions({ Service: "addTodo" })
        metrics.setProperty("RequestId", context.requestId)

        if (!isValidRequest(context, event)) {
            metrics.putMetric("Error", 1, Unit.Count)
            return response(400, { message: "Error: Invalid request" })
        }

        try {
            let data = await addRecord(event).promise()
            metrics.putMetric("Success", 1, Unit.Count)
            return response(200, data)
        } catch (err) {
            metrics.putMetric("Error", 1, Unit.Count)
            return response(400, { message: err.message })
        }
    }
)
```
What results can you expect?
Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten):
Analysis of the code sample
Support for AWS Embedded Metrics
aws-embedded-metrics encapsulates functions in order to provide metrics. However, the encapsulation did not allow the extension to provide links. With support for AWS Embedded Metrics starting in 2.6.0-beta4, these links are now created.
Code samples
This code encapsulates the Lambda handler "addToDoItem":
```js
// Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: MIT-0

// default imports
const AWSXRay = require('aws-xray-sdk-core')
const AWS = AWSXRay.captureAWS(require('aws-sdk')) // Encapsulate AWS SDK
const { metricScope, Unit } = require("aws-embedded-metrics")
const DDB = new AWS.DynamoDB({ apiVersion: "2012-10-08" }) // use AWS as usual
const { v1: uuidv1 } = require('uuid');

// environment variables
const { TABLE_NAME, ENDPOINT_OVERRIDE, REGION } = process.env
const options = { region: REGION }
AWS.config.update({ region: REGION })

if (ENDPOINT_OVERRIDE !== "") {
    options.endpoint = ENDPOINT_OVERRIDE
}

const docClient = new AWS.DynamoDB.DocumentClient(options)

// response helper
const response = (statusCode, body, additionalHeaders) => ({
    statusCode,
    body: JSON.stringify(body),
    headers: { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*', ...additionalHeaders },
})

function isValidRequest(context, event) {
    return (event.body !== null)
}

function getCognitoUsername(event){
    let authHeader = event.requestContext.authorizer;
    if (authHeader !== null) {
        return authHeader.claims["cognito:username"];
    }
    return null;
}

function addRecord(event) {
    let usernameField = {
        "cognito-username": getCognitoUsername(event)
    }

    // auto generated date fields
    let d = new Date()
    let dISO = d.toISOString()
    let auto_fields = {
        "id": uuidv1(),
        "creation_date": dISO,
        "lastupdate_date": dISO
    }

    // merge the json objects
    let item_body = {...usernameField, ...auto_fields, ...JSON.parse(event.body) }
    console.log(item_body);

    // final params to DynamoDB
    const params = {
        TableName: TABLE_NAME,
        Item: item_body
    }

    return docClient.put(params)
}

// Lambda Handler
exports.addToDoItem = metricScope(metrics =>
    async (event, context, callback) => {
        metrics.setNamespace('TodoApp')
        metrics.putDimensions({ Service: "addTodo" })
        metrics.setProperty("RequestId", context.requestId)

        if (!isValidRequest(context, event)) {
            metrics.putMetric("Error", 1, Unit.Count)
            return response(400, { message: "Error: Invalid request" })
        }

        try {
            let data = await addRecord(event).promise()
            metrics.putMetric("Success", 1, Unit.Count)
            return response(200, data)
        } catch (err) {
            metrics.putMetric("Error", 1, Unit.Count)
            return response(400, { message: err.message })
        }
    }
)
```
What results can you expect?
Once the analysis/snapshot generation has completed, you can view the results in the normal manner (for example via CAST Enlighten)
Analysis of the code sample
Call to Program
The Node.js extension supports calls to external programs using the child_process module.
The fork() function is not handled, as its only purpose is to fork Node.js programs.
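For illustration, a minimal sketch of a fork() call for which no call object is created (worker.js is a hypothetical script):

```js
const { fork } = require('child_process');

// fork() only spawns another Node.js module, so the extension does not
// create a "call to program" object for it
const child = fork('./worker.js');
child.on('exit', (code) => console.log('worker exited with code', code));
```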
These declarations create a call to a Java program/JAR file:
```js
const exec = require('child_process').exec;

exec('java -cp com.castsoftware.Archive -jar jarFile.jar', (e, stdout, stderr) => {
    if (e instanceof Error) {
        console.error(e);
        throw e;
    }
    console.log('stdout ', stdout);
    console.log('stderr ', stderr);
});
```
```js
const cp = require('child_process');

const class_name = 'com.castsoftware.Foo'

function call_foo(req, resp) {
    const args = [
        '-cp', '/bin',
        class_name
    ];
    const proc = cp.spawn('java', args);
}
```
This declaration creates a call to a Python program:
```js
const execFile = require('child_process').execFile;

const python_file = 'app.py'
const child = execFile('python', [python_file], (error, stdout, stderr) => {
    if (error) {
        console.error('stderr', stderr);
        throw error;
    }
    console.log('stdout', stdout);
});
```
Restify
The Node.js extension supports routing using the restify module. The following is an example of an application using restify to handle some URIs:
```js
var restify = require('restify');

function send(req, res, next) {
    res.send('hello ' + req.params.name);
    next();
}

var server = restify.createServer();

server.post('/hello', function create(req, res, next) {
    res.send(201, Math.random().toString(36).substr(3, 8));
    return next();
});
server.put('/hello', send);
server.get('/hello/:name', function create(req, res, next) {
    res.send(201, Math.random().toString(36).substr(3, 8));
    return next();
}, send);
server.head('/hello/:name', send);
server.del('hello/:name', function rm(req, res, next) {
    res.send(204);
    return next();
});

server.listen(8080, function() {
    console.log('%s listening at %s', server.name, server.url);
});
```
SQL Named Query
When an SQL query is executed directly, a CAST SQL NamedQuery object will be created:
```js
var oracledb = require('oracledb');

connection = oracledb.getConnection(
    {
        user          : "hr",
        password      : "welcome",
        connectString : "localhost/XE"
    });

oracledb.getConnection(
    {
        user          : "hr",
        password      : "welcome",
        connectString : "localhost/XE"
    },
    function(err, connection) {
        if (err) {
            console.error(err);
            return;
        }
        connection.execute(
            "SELECT department_id, department_name " +
            "FROM titles " +
            "WHERE department_id < 70 " +
            "ORDER BY department_id",
            function(err, result) {
                if (err) {
                    console.error(err);
                    return;
                }
                console.log(result.rows);
            });
    });
```
Structural Rules
The following structural rules are provided:
Known Limitations
In this section we list the most significant functional limitations that may affect the analysis of applications using Node.js:
- With regard to the external links degraded mode, only statements with a FROM clause are correctly handled.
- Node.js objects are only supported for the ES5 standard.
- Analysis of AWS Lambda functions requires access to the serverless.yml file that maps routes and handlers together.
- Known limitations for AWS support are detailed in this section.