Edge computing 101: A CIO demystification guide

CIOs are now used to cloud computing, but here comes edge computing. Should they worry?

Every so often in the IT industry, a new buzzword crops up and everyone rushes to jump on the bandwagon, even if the buzzword turns out to be little more than a new twist on ideas and practices that have been in use for decades.

The latest example of this phenomenon is edge computing, which some commentators are touting as the Next Big Thing. Debate has been raging over whether it will replace cloud computing, and whether it represents a new multi-billion-dollar market opportunity along the lines of the internet of things (IoT).

A straightforward explanation of edge computing is that it involves doing some processing at the edge of the network, rather than consolidating all processing power at its core. As with other buzzwords, an exact definition can be elusive, but a good starting point is to think of it as putting processing power at the point of action, or, alternatively, in the most appropriate place for the specific application.

“I would say that edge computing has probably two, three or four different definitions, depending upon whom you are talking to,” says Tony Lock, distinguished analyst at Freeform Dynamics.

“One of these is that you need to do some computing close to where the data is, for reasons of latency, or simply in terms of network bandwidth cost. That’s obviously an issue for lots of IoT examples where these days you could potentially be generating huge volumes of data,” he adds.

A good example is autonomous vehicles. The control system for such an application must constantly monitor a huge amount of data about the road conditions ahead, often from multiple sensor inputs such as radar, laser scanners and computer vision, and process that data in real time in order to navigate safely and avoid obstacles.

Sending all the sensor data back to the cloud for processing would be impractical in this case, not only because of the latency, but because of the sheer volume of sensor data that is involved. In this example, the processing needs to be handled right there in the vehicle, where the data is being generated.

This does not imply that data should not be sent back to the data centre at all; some applications call for data to be collated from multiple endpoints for monitoring and analysis purposes, especially in IoT deployments.
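In practice, that pattern often means reducing raw readings to compact summaries at the edge and sending only those upstream. As a minimal sketch (the sensor data and function names here are illustrative, not from any particular product):

```python
def summarise_readings(readings):
    """Reduce a batch of raw sensor readings to a small summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

def edge_process(raw_batches):
    """Process each batch locally at the edge; yield only summaries
    for collation and analysis back at the data centre."""
    for batch in raw_batches:
        yield summarise_readings(batch)

# Only one small summary per batch crosses the network, not the raw samples.
batches = [[20.1, 20.4, 19.8], [21.0, 20.9, 21.3]]
summaries = list(edge_process(batches))
```

The bandwidth saving is the point: the raw samples never leave the device, while the data centre still receives enough to monitor the fleet.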

Edge computing does not replace cloud

Another more traditional example of edge computing is that of the branch office, where it makes sense to have some local servers and infrastructure to meet the needs of the employees that are based there, rather than having all applications and data served up over a WAN connection back to head office.

Perhaps the confusion over edge computing arises because it appears to fly in the face of a prevailing notion, encouraged by cloud firms such as AWS, that cloud computing will gradually subsume most IT functions. Under this notion, most or all processing will be consolidated into the networks of giant data centres operated by the cloud providers.

But while cloud computing continues to grow in popularity as a platform for deploying and consuming IT services, it is not going to replace all existing IT infrastructure any more than edge computing is going to replace the cloud. Both have their use cases and applications and services that will be best delivered using one model rather than the other, and in many cases, both edge computing and cloud will be required.

“Will edge computing kill the cloud? No. All it is doing is becoming another service delivery mechanism in the same way that cloud is. Cloud hasn’t killed internal computing either—it hasn’t replaced all of the internal siloes that run in isolation, firms still have a mix of private cloud, siloed systems and external systems, so it’s basically just another step in IT,” says Lock.

As evidence of this, many edge computing deployments involve similar software and hardware configurations to those found in cloud data centres. The OpenStack Foundation sees edge computing as a growing use case for the open source data centre automation framework, for example, and the OpenStack Summit in May featured several presentations on the subject.

“I think the reality is that we are collecting so much data at the edge that we can’t send it all back to data centres, whether that’s private clouds or public clouds, and so this is a huge opportunity and a big use case that we’re going to see a lot more of,” said OpenStack Foundation executive director Jonathan Bryce, speaking at the event.

One such presentation was given by Verizon’s Cloud Networking product manager Beth Cohen, who discussed how the firm had used OpenStack as the basis for its Virtual Network Services (VNS) platform that can be deployed either in a data centre or at the customer’s site as part of a universal customer premises equipment (uCPE) appliance.

Continuing the telecoms theme, a further example of edge computing can be seen in the ongoing development of 5G networks. The demanding requirements now being drawn up for these next-generation mobile networks, including data rates of gigabits per second, low latency and support for large numbers of simultaneously connected devices, mean that network base stations will need so much compute power that they will effectively become miniature data centres in their own right.

This does not mean that they will replace the existing data centres operated by the network operators, but that they will require a substantial amount of local processing power in order to meet service demands.

Even now, many telecoms providers are pushing service delivery out to the network edge, so that content such as the most popular on-demand movies is cached as close as possible to subscribers, again to tackle latency and to prevent such high-bandwidth content from overloading the backhaul connection to the data centre.

This is not especially new: Akamai Technologies has been following a similar model with its global content delivery network for almost two decades.
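Conceptually, this kind of edge caching is a small, bounded cache sitting in front of the backhaul: popular items are served locally, and only misses travel to the origin. A toy sketch, with a hypothetical origin-fetch function standing in for the central data centre:

```python
from collections import OrderedDict

class EdgeCache:
    """A toy LRU cache for an edge node: popular content is served locally,
    and only cache misses go back over the backhaul to the origin."""

    def __init__(self, fetch_from_origin, capacity=100):
        self.fetch = fetch_from_origin   # callable: key -> content
        self.capacity = capacity
        self.store = OrderedDict()
        self.origin_hits = 0             # backhaul round-trips made

    def get(self, key):
        if key in self.store:
            self.store.move_to_end(key)  # mark as recently used
            return self.store[key]
        content = self.fetch(key)        # cache miss: use the backhaul
        self.origin_hits += 1
        self.store[key] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        return content

# Illustrative origin; repeated requests for a popular title hit the edge.
cache = EdgeCache(lambda title: f"<video:{title}>")
cache.get("popular-movie")   # first request: fetched from origin
cache.get("popular-movie")   # second request: served from the edge cache
```

Production CDNs are vastly more sophisticated, but the economics are the same: every request served from the edge is one that never touches the backhaul.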

Automation, security and monitoring

But while edge computing may not be new, it brings its own set of issues that need to be addressed. Chief among these is security, according to Lock, which is “possibly the single biggest overlooked factor at the moment.”

“So [telcos] might be caching more and more things locally, or doing more and more processing locally, so you’ve got caching elements and you’ve got processing elements at the edge of the network, and that brings in security as well, because if you’re caching data, how do you make sure it is properly secured and kept safe in that remote location?”

Another issue is management, especially where edge computing involves hardware deployed at remote sites or other places where it is not practical to keep sending engineers to change or fix things. The key is automation, so that such systems can run themselves with as little human intervention as possible.

“As you get more and more of these edge systems, you’ve got to automate as much as you can, and that means you’ve got to have good policy control, you’ve got to have good visibility of what’s out there and what it’s doing,” says Lock.

“You need some way of monitoring for exceptions to see when things are going wrong—is data now suddenly being siphoned off somewhere you didn’t expect it to go? You need more automatic analysis to tell you where to point the human eyes, and that applies equally to the security side as well as to the actual intelligence and business use of the data you collect,” he adds.
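The kind of exception monitoring Lock describes can start very simply, for example flagging any edge node whose outbound data volume suddenly far exceeds its own baseline. A crude sketch (the threshold factor and data are purely illustrative):

```python
def flag_exceptions(daily_egress_gb, factor=3.0):
    """Flag days where outbound data volume far exceeds the running
    average so far: a crude stand-in for 'is data suddenly being
    siphoned off somewhere you didn't expect it to go?'."""
    flagged = []
    seen = []
    for day, volume in enumerate(daily_egress_gb):
        if seen and volume > factor * (sum(seen) / len(seen)):
            flagged.append(day)
        seen.append(volume)
    return flagged

# Day 4's spike stands out against the baseline of the earlier days.
volumes = [10, 12, 11, 9, 120, 10]
print(flag_exceptions(volumes))  # -> [4]
```

Real deployments would use proper anomaly detection, but even a baseline-and-threshold check like this tells the automation where to point the human eyes.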

Edge computing can also be found in the move towards industrial digitalisation, also known as Industry 4.0. This began in Germany, and is about bringing the benefits of modern IT and digital transformation to the manufacturing industry, converging IT and operational technology (OT) systems and using analytics to monitor and optimise the production line.

Monitoring a factory full of production processes and performing real-time analytics calls for ample processing power. Consequently, some of the solutions developed in response by vendors including HPE, Dell and Huawei are best described as micro data centres, fitting a rack full of servers, cooling and power supply into a single enclosure that can be deployed on site.

Overall, the message is that edge computing is not necessarily a new concept, just a new term for approaches to computing that have been around for a long time. It helps to remember that there are no one-size-fits-all solutions in IT, and that means organisations need to choose the most suitable architecture for the application, based on considerations such as latency, network bandwidth and cost. Sometimes it makes more sense for data to be processed in the cloud, and sometimes it makes more sense for that processing to be done on the spot.

This story, "Edge computing 101: A CIO demystification guide" was originally published by IDG Connect.

Copyright © 2017 IDG Communications, Inc.