The 3 ways the Internet of things will unfold

The three key segments of the real IoT are on different paths, so don't think of them as one entity


Machine-to-machine is simply about efficiency, not a fundamentally new opportunity
For decades, we've had industrial, medical, and office equipment that could talk to other equipment, such as thermostats that communicate temperature information to normalize HVAC settings, assembly-line sensors that let robots know to stop welding if the line is delayed or stopped, and EKG readers that alert a nursing station if worrisome readings occur. This is known as machine-to-machine (M2M) communications, and it's really useful.
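The EKG example above boils down to a simple pattern: a machine reads a value, compares it to a threshold, and notifies another machine. A minimal sketch of that loop, with hypothetical names and thresholds (not drawn from any real vendor's system):

```python
# Minimal sketch of an M2M-style alert loop: a monitor polls a sensor
# and notifies a "nursing station" when a reading looks worrisome.
# All names and the threshold below are illustrative assumptions.

HIGH_HEART_RATE = 120  # assumed alert threshold, beats per minute

def read_ekg():
    """Stand-in for a real EKG reader; returns a heart-rate sample."""
    return 135  # hypothetical worrisome reading

def alert_station(message):
    """Stand-in for the machine-to-machine notification channel."""
    print(f"ALERT to nursing station: {message}")

def monitor_once():
    """One polling cycle: read, compare, alert if needed."""
    rate = read_ekg()
    if rate > HIGH_HEART_RATE:
        alert_station(f"heart rate {rate} bpm exceeds {HIGH_HEART_RATE}")
        return True   # alert was sent
    return False      # reading within normal range

monitor_once()
```

In a real deployment the reader and the station sit on different devices, and the interesting part is the wire protocol between them -- which is exactly where the standardization discussed below comes in.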

These established M2M uses are now getting the IoT label, but they are not really changed by IoT. However, they're cheaper and easier to deploy because of greater technology standardization that is making the larger IoT trend possible. We'll thus see the "industrial Internet of things" (the new name for M2M) become more widespread, as smaller companies can afford to join in and larger companies can afford to bring the notion outside of expensive manufacturing systems.

It's like when PCs arrived in business: Suddenly, a computer didn't cost millions of dollars, so computing could go beyond the data center. Until recently, M2M technology was about at the level of business computing technology in 1980 in terms of cost and reach.

What's made M2M easier and cheaper to deploy? Bernie Anger, the general manager of General Electric's Intelligent Platforms division (a big M2M vendor for industrial automation), points to three factors.

  • OPC UA adoption: This version of the venerable machine connectivity protocol is not Windows-dependent, so devices on all sorts of platforms can now share data through a known protocol, not just PCs or devices running Windows Embedded. Due to the relatively low cost -- OPC UA-capable devices with local computation ability and network access cost just $200 -- it's affordable to have more devices connected.
  • Hadoop and similar mass-scale data processing technologies allow analysis of massive data sets in a cost-effective way. When analytics was an expensive, scarce resource, companies limited what data they collected and analyzed to the most critical areas. Now they can apply analytics to more areas, and they're doing so.
  • The ubiquity of the HTML5 Web standard in client devices: That means more than the use of iPads, smartphones, computers, and other off-the-shelf equipment -- it also means that specialty devices now use a client UI that's well understood and compatible with all the computing devices you have. The burden of writing to proprietary user interfaces is greatly reduced, and operator familiarity is greatly improved.
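The common thread in these three factors is that devices publish data in forms any standard client can consume. As a rough illustration (this is plain HTTP and JSON, not OPC UA itself, which is a far richer protocol), here is a sketch of a device exposing a reading that any HTML5-capable client -- tablet, phone, or PC -- can fetch without a proprietary operator interface. The device name and reading are invented for the example:

```python
# Sketch: a device publishes its current readings as JSON over HTTP,
# so any standards-based client can render them. Names and values
# below are hypothetical.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

READINGS = {"device": "line-3-welder", "temperature_c": 71.5, "status": "ok"}

class DeviceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the current readings as a JSON document.
        body = json.dumps(READINGS).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet

def serve(port=8631):
    """Start the device's HTTP endpoint on a background thread."""
    server = HTTPServer(("127.0.0.1", port), DeviceHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Point any browser at the endpoint and it gets the same JSON a purpose-built dashboard would -- the burden of writing to a proprietary UI disappears, which is the point of the third factor above.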

"None of these is a revolution, but they come together now to enable the scale and speed not possible a decade ago in the M2M/SOA worlds, when everything was essentially custom, nonstandard, and heavyweight," Anger notes.

Over time, the use of standard protocols and technologies will allow the "back end" M2M systems to interact with user-facing technologies, which will provide some white-knuckle moments for the guardians of the core systems while making them more valuable overall.
