5 reasons Node.js rules for complex integrations

With JavaScript, JSON, REST, NPM, and an ever-increasing supply of modules, Node.js should be your first choice for integration


Because software solutions rarely operate in a vacuum, integration is a necessary fact of life for many developers. Sometimes it’s easy. Anyone who has integrated an application into Slack, for example, will have been treated to an incredibly smooth experience. In many cases it’s as simple as filling in a form (a URL or two, an authentication key) and hitting the Submit button. That’s plain awesome.

But then you have the complex integrations, the ones you find in the Fortune 5000, that might involve weaving together third-party products numbering in the double digits. These are the environments where the level of specific customization and the complexity of the business logic you can use are determined by the openness and comprehensiveness of each vendor’s API.

When the time comes to stitch together those APIs and communicate with the outside world, the process of integration moves well past configuration and starts looking a lot like code. This is where external scripting tools and API work come in. Historically, this has been done using DOS batch files, Linux shell scripts, Perl, Python, and so on.

However, there’s a new tool of choice for these scenarios: Node.js. Why should you use Node.js for linking third-party APIs, publishing external APIs, integrating private and public cloud environments, automating deployments, and other glue-code projects? Here are five reasons why Node.js is exceptionally valuable for integrating complex and hybrid environments.

1. Node.js is a lingua franca

Read any article touting the advantages of Node.js, and the ubiquity of JavaScript (and JavaScript skills) will be high on the list. JavaScript is accessible, fairly intuitive, and accommodating of many writing styles, and Node.js has successfully adapted this formula to server-side development. The broad adoption of JavaScript for UI development and Node.js for server-side development means the number of developers who will have at least a passing knowledge of the language is enormous.

Further, as this relates specifically to integration, vendors themselves are increasingly using JavaScript within their products. Here at Moogsoft, the Moogsoft AIOps platform exposes “bots” to allow users and implementers to enrich and extend the capabilities of the product, and those bots are implemented using a JavaScript engine. This decision was due directly to the ubiquity of JavaScript, as we want our customers to be self-sufficient, and allowing them to leverage internal JavaScript skills helps achieve that goal.

While we’re on the subject of language, because Node.js compiles JavaScript just in time at runtime, what you deploy is the human-readable source itself, a key reason scripting languages are such a boon for integrators. Node.js offers extremely good performance to boot.

2. Node.js has modules, lots of modules!

An enormous part of the popularity of Node.js is the vast community of contributors, enabled by the externalization of reusable code as modules, which can then be incorporated using a require() statement. The simplest are in the format:

var myName = require('external-module');

The number of published modules is staggering. If you’re trying to write code to interface with a popular third-party application, chances are someone has already done it and published a module and documentation that enable you to achieve your objective in a few lines of code. MySQL and MongoDB are good examples.

This is incredibly important; after all, the goal is to have an effective system up and running with the minimum of additional code (no matter how much fun coding in Node.js can be).

Vendors too are getting in on the game. It’s now very common to see wrappers and clients for tools and applications available as Node modules. Twitter’s client “SDK” is an excellent example.

Here’s an example of a module we publish to provide an easy way for users to build and send events to our Moogsoft AIOps system:

var MoogEvent = require('node-moog').MoogEvent;
var myEvent = new MoogEvent();
myEvent.description = 'My new description';

Node modules are a fantastic way for vendors to expose their APIs and functionality, and to have them incorporated natively in Node.js code.

3. JSON is native

JSON, JavaScript Object Notation, is a lightweight data-interchange format that has numerous advantages for integration. It’s simple and easy to learn, yet it can be adapted to almost any use case.

JSON is human readable, which is useful both to the humans who are trying to deploy expeditiously and to those who maintain the result. Even when you’re trying to debug the condensed form of JSON, you can draw on a variety of online tools that make the work easier. My favorite is JSONLint.

JSON is far easier to interpret than XML, and it is gradually edging out XML as the de facto data-interchange format for the web. Cloud-first solution providers almost universally employ JSON as the default payload.

Best of all, JSON is native to Node.js. All JavaScript values, except primitives, are objects, and complex or hierarchical data in JavaScript is naturally expressed in the same notation JSON uses. This means there are very few steps required to process and handle a JSON payload.

An example payload:

{"title": "example", "contents": "some stuff"}

Can be handled thusly:

var message = JSON.parse(payload);
console.log(message.title);  // Will output “example”

This ability to handle data interchange between external systems in the same way you would handle data within the code dramatically speeds integration efforts.
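The reverse direction is just as direct. As a minimal sketch (the object and values here are made up for illustration), you can build a plain JavaScript object, serialize it for the wire, and parse it back on the receiving side:

```javascript
// Round-trip: build a plain object, serialize it for transmission,
// then parse it back into a live object on the receiving side.
var event = { title: 'example', contents: 'some stuff' };

var payload = JSON.stringify(event);   // string, ready for an HTTP body
var message = JSON.parse(payload);     // back to an object

console.log(message.contents);         // prints "some stuff"
```

Because serialization and deserialization are single built-in calls, there is no mapping layer or schema compiler standing between your code and the payload.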

4. REST is native too

OK, so REST is not native, per se, but for all intents and purposes it might as well be.

Node.js has native support for HTTP/HTTPS, so it’s simple to do a GET or a POST to a RESTful endpoint. Even if a vendor doesn’t offer a Node.js module to do the job for you, it might at least offer sample code you can cut and paste.

In the worst-case scenario, if the vendor is staunchly “language agnostic,” it will almost certainly offer you an example using curl. With a little practice, it’s easy to translate the curl arguments into a Node.js HTTP or HTTPS request.

Of course, you can turn to a variety of Node modules that will provide fully featured REST connections while hiding the more complex workings. (Node.js’ HTTP/HTTPS API is actually very low-level in order to ensure there are no functional limitations.)

Why is REST so important? In the same way JSON is becoming the de facto data-interchange format, RESTful web services are rapidly becoming the de facto web-friendly protocol, so much so that for many vendors REST has become a synonym for API.

At Moogsoft, our bots have built-in REST capability, so integration with other applications and web services that offer REST endpoints is a breeze. We’ve also implemented a RESTful server, so external applications can interact with the system.

Speaking of which, Node.js’ HTTP/HTTPS module also offers server capabilities, so a Node.js application can listen for, and respond to, REST methods.

If you want to take advantage of an application’s outgoing REST support and offer a complex and rich web service, it’s worth looking at the Express Node module, which makes writing web servers quick and easy. The Express framework powers many of the internet’s most significant websites.

REST and its event-driven cousin, the webhook, are great not only for building intersystem APIs but also for creating commands and tools. Look to them for chatops integration as well.

5. Packaging

I mentioned how modules enrich Node.js and support a thriving developer community. Another key feature that makes Node.js so compelling for the systems integrator is how easy it is to publish and access the modules.

Thanks to npm, the package manager bundled with Node.js, distributing and accessing Node modules is supremely easy. A contributor creates a package.json file containing the module’s details and dependencies, then simply publishes the module to the public npm registry, where it becomes immediately available.
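A minimal package.json might look like the following sketch; the name, version, and dependency range are placeholders, not a real published package:

```json
{
  "name": "my-integration-module",
  "version": "1.0.0",
  "description": "Hypothetical glue-code module",
  "main": "index.js",
  "dependencies": {
    "node-moog": "^1.0.0"
  }
}
```

The `dependencies` map is the key piece for integrators: npm resolves and installs the whole dependency tree automatically when the module is installed.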

Downloading the module is equally simple; you can access ours from the command line:

$ npm install node-moog

As an old Unix guy, I’m very comfortable with the command line; in fact it’s my default. Node.js is ideal for creating command-line tools too. Arguments are easy to process (much like shell scripts and batch files) and NPM takes care of installation.

If you’ve created a CLI tool called myTool, for example, enter the following:

$ npm install -g myTool

This command will install myTool globally, and it will be immediately available from the command line. Thus, Node.js is particularly useful for creating tools like sandbox wrappers for chatops commands and for creating scripts for HA scenarios, archiving, reporting, and so on.
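As a minimal sketch of what such a tool’s entry script might contain, arguments arrive in `process.argv`, and a few lines are enough to separate flags from positional arguments (a real tool would likely reach for a module such as commander or yargs; `parseArgs` here is a hypothetical helper, not a standard API):

```javascript
#!/usr/bin/env node

// Minimal argument handling: split "--flags" from positional arguments.
function parseArgs(argv) {
  var flags = [];
  var args = [];
  argv.forEach(function (arg) {
    if (arg.indexOf('--') === 0) {
      flags.push(arg.slice(2));   // "--verbose" -> "verbose"
    } else {
      args.push(arg);
    }
  });
  return { flags: flags, args: args };
}

// process.argv is [node, script, ...userArgs], so skip the first two.
var opts = parseArgs(process.argv.slice(2));
console.log(opts);
```

A `"bin"` entry in package.json (for example, `"bin": { "myTool": "./index.js" }`) is what tells npm to put the command on the PATH during a global install.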

For the five reasons I’ve outlined, Node.js should be the go-to choice for tool-chain integration. Node.js is the ideal middleware for complex API integrations, and it keeps getting better. With every iteration of Google’s V8 JavaScript engine, the performance of Node.js improves. While I have tremendous respect for the amount of Python out there, Node.js is now our first choice for new code. Shouldn’t it be your first choice as well?

New Tech Forum provides a venue to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all inquiries to newtechforum@infoworld.com.