In its mere five years of existence, Node.js has transformed from a technological curiosity to a technology stack all its own, providing a major building block for everything from microservices to APIs.
Much of that rise is due to the ecosystem of tools, development environments, and hosting services that has evolved around Node.js -- both to make existing development tools (such as Visual Studio) Node.js-friendly and to give Node.js the kind of professional-level support and service it requires.
But tooling specific to the needs of Node.js apps gives a more granular view of the health of the Node.js ecosystem, showing both how far Node.js has come and how far it still has to go. High-profile shifts away from Node.js, including the recent introduction of a fork, magnify not only the limitations of the Node.js ecosystem but also the direction in which the ecosystem must evolve.
Here is a look at the future of Node.js as seen in developments emerging today.
Development environments: Uneven -- and looking beyond the traditional IDE
Hosting: Competition spurs support and innovation in the cloud
Before Node.js began making serious headway, the only way to run Node.js was to spin it up on bare metal that you owned. That time has long since passed, and cloud providers are now climbing over each other to provide Node.js hosting -- not only support for Node.js in VMs, but full-blown PaaS hosting for Node.js.
Another key change in the way Node.js works with hosts has come with the advent of Docker, the red-hot app containerization technology. Docker provides an easy way to bundle an application's code and data with the Node.js runtime it needs, meaning dependencies required by the application -- including the specific version of Node it runs on -- don't have to be supported by the host. Docker also provides convenient ways to create Node.js apps and scale them (such as via the open source Deis PaaS). And dnt, an NPM package that drives Docker, allows code to be tested against multiple versions of Node.js in parallel.
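As a minimal sketch of that bundling idea, a Dockerfile can pin the exact Node runtime the app depends on so the host needs no Node support at all (the image tag, port, and `server.js` entry point here are illustrative assumptions, not taken from any particular project):

```dockerfile
# Pin a specific Node.js version; the host never needs Node installed.
FROM node:0.12

WORKDIR /app

# Install dependencies first so Docker can cache this layer.
COPY package.json ./
RUN npm install --production

# Copy the rest of the application code.
COPY . .

# Hypothetical port and entry point for an HTTP service.
EXPOSE 3000
CMD ["node", "server.js"]
```

Because the runtime version is baked into the image, the same container runs identically on any Docker-capable host, which is what makes PaaS offerings like Deis practical.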
Given these developments, Node hosting options will likely proliferate going forward. Here, containerization is key, as it makes it possible for developers to run Node.js on host services without the host even supporting the application’s Node runtime of choice. But as seen in Amazon’s Lambda, support for Node.js can also reap rewards for hosting providers looking to leverage the Node API to build new services and products.
Testing and debugging: The Node.js Achilles’ heel
Joyent, by way of the Joyent Private Cloud and Joyent Compute Service, has developed its own solutions to the issues with Node.js debugging. The most painful of those issues is memory consumption.
“Historically,” says Bryan Cantrill, chief technology officer at Joyent, “we have little insight into how memory is used in dynamic environments.”
To that end, Joyent built inside-out Node.js debugging support into its platform by leveraging the DTrace functionality of SmartOS, the Solaris-derived operating system on which the platform runs. The bad news is that anyone who wants the same toolset must run SmartOS, either on their own hardware or via Joyent's cloud. Cantrill didn't rule out the possibility that the debugging technology could be ported to other platforms, but admitted it has many dependencies on SmartOS that would need to be resolved.
Testing frameworks represent another area for possible improvement. Bowery, creator of a cloud development-environment system, built the first version of its service in Node.js but eventually switched to Go for a variety of reasons. Among them was the fact that some testing frameworks for Node.js “worked better for front end, like Jasmine, and others were better for the backend, like Mocha.” With Go, they reasoned, testing is built in and standardized across the board; Node.js could benefit from a testing framework of similar robustness.
One fairly high-profile exit from the world of Node.js development was spurred by these limitations on debugging and developing Node.js applications. TJ Holowaychuk, creator of Koa, Express, and the node-canvas project, penned an essay in June 2014 in which he bid farewell, albeit fondly, to Node.js development and an environment that “favours performance over usability and robustness.” Debugging and error handling, especially for callbacks -- one of Node.js's core mechanisms -- struck him as grossly underdeveloped.
To Holowaychuk, Node.js is a worthy and powerful project, but “performance means nothing if your application is frail, difficult to debug, refactor and develop.” Holowaychuk expressed confidence that these problems could be overcome in time.
A fork and the future
In theory, an open source project like Node.js should develop by leaps and bounds, with problems like the ones outlined here attacked in short order. In practice, though, parts of Node.js haven’t evolved as quickly as others. Aside from debugging and inspection, concerns about the pace of Node development have arisen, as evidenced in release cycle issues over the past year.
Joyent's plans to address this became clear only recently. Version 0.12 of Node was finally delivered in February 2015 with improvements along those lines, and a separate Node.js Foundation is being set up to move governance to a disinterested third party.
Before that, however, others took their own steps. A fork of the Node.js project, named io.js, came into being as a way to address debugging, slow release cycles, and governance, among other issues.
“Debugging has always been challenging in an async environment,” said Mikeal Rogers, one of the io.js project's organizers, “but we're making headway.”
The io.js project is still young, although a few existing Node.js deployments have begun using it (one engineer at Uber has apparently already done so at scale). But whether such advances come from io.js or Node.js itself, it's clear that changes are badly needed for Node.js to flourish further.
[This article was edited to clarify the use of io.js by other companies.]