7 cutting-edge programming experiments worth trying

Erlang, Node.js, Go: Here's how to get started with the hottest programming trends without getting burned


Cutting-edge experiment No. 2: The Node.js Web stack

Many enterprise services are judged on how quickly they deliver data. No one wants to keep a potential customer hanging around watching some blank window on a browser. Even fewer people want to keep their bosses waiting for some crucial report or analysis of the business.

Some cutting-edge tools are designed for speed. Node.js, for instance, is popular because it runs very quickly. It can be even faster when paired with one of the newer NoSQL databases, which can save data incredibly quickly. Together, the two make it possible to build a fast Web infrastructure on a small platform that, as a side effect, also consumes much less electricity. Speed and energy efficiency are often linked.

That speed should be attractive to companies that put a priority on responsiveness. Some of the more ephemeral websites can never afford to disappoint a potential user or keep one waiting very long. Companies with captive clients -- say, banks -- may want to make different decisions.

Node.js is an open source stack built on top of the Chrome V8 JavaScript engine, but most people will begin with a prebuilt executable from nodejs.org, available for all of the major platforms. Joyent, the major sponsor, also offers cloud machines with images that include all of the necessary libraries and tools.
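To get a feel for how little code a bare server takes, here's a minimal sketch that uses nothing but Node's built-in http module; the port number and the message are arbitrary choices for illustration, not anything the stack requires.

// hello.js -- a bare-bones Node.js Web server using only the built-in
// http module. Run it with `node hello.js` and point a browser at
// http://localhost:3000 (the port is an arbitrary example).
var http = require('http');

var server = http.createServer(function (req, res) {
  // Every request runs through this one callback.
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from Node.js\n');
});

server.listen(3000, function () {
  console.log('Listening on http://localhost:3000');
});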

Many developers head straight for Web frameworks like Tower, Geddy, or Railway, each of which simplifies the work of building a basic, data-driven website.

The trouble with Node.js is not performance but the weight it puts on the programmer's shoulders. Because the entire package runs in one process, programmers need to be smarter and more careful: if one user tosses a curve ball that hits a bug in your code, the entire Web server could lock up. Good programmers and extensive testing can avoid this, but no one is perfect all of the time. Node.js is the polar opposite of Erlang in this respect, because it offers few limits to keep programs from going awry.
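To see the risk in concrete terms, consider this sketch; the routes and the deliberate bug are invented for illustration. Every request shares one event loop, so a single slow loop or uncaught exception affects everyone.

// single-process.js -- an illustration of the one-process risk.
// The routes and the deliberate bug are contrived examples.
var http = require('http');

http.createServer(function (req, res) {
  if (req.url === '/report') {
    // A long synchronous loop blocks the one event loop; no other
    // request gets an answer until it finishes.
    var total = 0;
    for (var i = 0; i < 1e9; i++) { total += i; }
    res.end('done: ' + total + '\n');
  } else if (req.url === '/boom') {
    // An uncaught exception doesn't just fail this request -- it can
    // take down the whole server process.
    throw new Error('one bad request');
  } else {
    res.end('ok\n');
  }
}).listen(3000);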

The Node.js-NoSQL combination exposes a driving force behind today's cutting edge: a focus on supporting the explosion of interest in social networks. If you're thinking of experimenting, find a place where you can afford to favor speed over caution. If your data needs careful curating, you might want to avoid these dangers.
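As a simplified picture of the combination, here's a sketch that pairs a Node server with Redis, one example of a NoSQL store; the article doesn't single out a product, and the snippet assumes the npm redis client and a Redis server running on its default local port.

// hits.js -- a sketch of the Node.js-NoSQL pairing, using Redis as one
// example of a fast in-memory store. Assumes `npm install redis` and a
// Redis server listening on its default localhost port.
var http = require('http');
var redis = require('redis');

var client = redis.createClient();

http.createServer(function (req, res) {
  // A single in-memory increment stands in for the kind of quick,
  // write-heavy work these stacks are used for.
  client.incr('hits', function (err, hits) {
    if (err) {
      res.writeHead(500);
      return res.end('storage error\n');
    }
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('You are visitor number ' + hits + '\n');
  });
}).listen(3000);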

Cutting-edge experiment No. 3: HTML5 Web and mobile apps

The new broom sweeps clean, the old saying goes, and so do new tools. The latest languages and software stacks that are built from scratch are not larded up with multiple revisions and deprecated APIs. The syntax and format are simple and uncluttered.

This usually produces cleaner, simpler code. While programmers can write convoluted code in any language, the newer stacks often require less extra glue code and version testing. Some of my code for smartphone apps goes through dozens of version tests to make sure it's doing the right thing for the right version. New stacks don't have this extra complexity.
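As a purely illustrative sketch -- the version cutoffs below are invented, not taken from any real app -- the old glue tends to look like the first function, while a cleaner stack can usually get away with the second:

// The old way: branch on reported versions, and keep adding branches
// as new releases appear. (The cutoffs here are made up.)
function canUseStorageOld(userAgent) {
  if (/Android 2\.[01]/.test(userAgent)) { return false; }
  if (/MSIE [1-7]\./.test(userAgent)) { return false; }
  return true;
}

// The cleaner way: ask once whether the capability is actually there.
function canUseStorage() {
  return typeof window.localStorage !== 'undefined';
}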

There are dozens of new HTML5 projects that handle many of the basic details of creating a website or a mobile phone app. The code, often called a framework or a scaffolding, organizes the content into pages and offers menu-driven transitions between them. Some of the most popular are jQuery Mobile, Sencha Touch, and Titanium, but a number of other tools are emerging. Many of the most popular CMS stacks, like WordPress or Drupal, sport themes that are tuned to the mobile environment and often use some of the same code.

While these new code stacks are clean, they often achieve this by tossing aside old platforms. It's easy for new tools to let people write simple, elegant code. They just ignore the older hardware and the older versions of the operating systems. Voilà! Of course they're simpler and faster because they only work with the pre-release code shipping at this moment.

The glitches with the HTML5 frameworks start appearing if you use an older browser or one that's not as standards-compliant. Suddenly, the menus start appearing in weird places and half of the text is off because the CSS instructions don't work. Sometimes the new needs to get along with the old, and it's a problem when the new code insists that it can only solve things one way.
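One hedged sketch of the kind of defensive glue older browsers force on you: check whether the browser even knows a newer CSS property before the layout depends on it. The property and class name here are only examples.

// Check for a newer CSS property before relying on it; the property and
// fallback class name are illustrative examples.
function supportsCssProperty(property) {
  return typeof document.documentElement.style[property] !== 'undefined';
}

if (!supportsCssProperty('flexWrap')) {
  // Tag the document so the stylesheet can fall back to a simpler layout.
  document.documentElement.className += ' no-flexwrap';
}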

Before you launch an experiment in this area, know where you can afford to support only a subset of the browsers and devices out there.

Cutting-edge experiment No. 4: Chewing up data with R

From cleaner Web design to more sophisticated analysis of big data, the R language lies at the core of some of the most popular new tools designed to use math to solve problems and take care of customers. The collection of tools around R is more than a language with predefined functions for common statistical formulae; it offers an entirely new way of thinking about a problem and finding a solution.

The statistical models inside big data analysis packages, for instance, can suss out and flag complex patterns and take advantage of all the power a modern cluster of computers can deliver. They replace the old mechanisms that would simply sort or look for maxima. Working with cutting-edge statistical software means you can do deeper analysis and find signals when the old code just saw noise.

When these new insights appear, they can save businesses billions of dollars. They help stores detect local tastes and keep the shelves stocked with the colors, patterns, and sizes the people in the neighborhood actually demand. They give marketing engineers a better shot at guessing how much advertising is enough. Anywhere there's data, there's a chance to find significant insights.

R, the language, is distributed through an open source project devoted to nurturing the core. Many developers start with more complete IDEs like RStudio that bundle an editor and output windows with the execution engine. An IDE is also the best way to create code that can later run on just the core when it's deployed into production stacks.

The trouble with statistical tools like R is that the insights don't always come, and what comes out of the experimentation isn't always significant. Just because the thinking is newer doesn't make it better. Big data offers perfectly good theories and even great ideas, but few know just how good they are -- especially in context. Will this kind of statistical analysis really help your product? Will the incoming data have enough precision to allow the theory to work? No one knows, but you might find out if you devote several months to experimentation.

Consider the excitement about using statistical tools like R to slice through the mounds of data piling up in your disk farms. Perhaps you're the lucky one who has data filled with one very strong signal just waiting to be discovered. Most folks find that data mining requires plenty of human intelligence to discover the crucial insights that are buried in the noise. A quick dive into the numbers just yields confusion.
