The cloud has been in the news a lot lately, and mostly for bad behavior. It’s been slow, expensive, insecure or simply MIA — taking major corporations offline for hours and raising questions about the future of cloud computing.
Is cloud computing going away? Absolutely not, but a rapidly emerging new technology may mean that we won’t be stuck with our cloudy blues for long.
Imagine enabling organizations to leverage the benefits of both cloud and on-site IT, with the speed, resiliency, bandwidth and scalability to run existing workloads — regardless of location — and to power new technologies such as the Internet of Things (IoT) and machine learning.
This new approach is called "fog computing," and it is poised to have a huge impact on how — and where — computing happens in the very near future.
What is fog?
Fog looks like a cloud, but it’s closer to you. Fog computing extends the concepts of the cloud out to the network "edge" to include computing devices that are physically closer to where you’re reading this. Unlike cloud computing, which relies on massive, centralized data centers to store and process data, fog enables data to be stored, processed and acted upon more rapidly and efficiently by using computing resources that are closer to where data is produced (e.g., sensors or security cameras).
Fog does, however, include the cloud, which is used only for the tasks the cloud is best suited for, namely global coordination, longer-term data storage, and analytics that are not time-critical. In this way, fog computing combines the best properties of both cloud computing and on-site IT through a single holistic architecture, enabling systems and services to be deployed using a combination of task-appropriate computing resources.
By tying together on-site and cloud (and everything in-between), fog gives you the scalability of cloud without the risk of a cloud outage — and the performance of on-site IT, without the cost of building and running a large data center. If you’re putting sensors and actuators on big equipment in a factory, for example, you can’t risk losing control of your facility if your internet goes out or there is a cloud outage. However, you also want somewhere to inexpensively and conveniently collect and store this sensor data and then analyze and visualize it with a dashboard accessible from anywhere in the world. This is where the cloud comes in.
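To make the factory scenario concrete, here is a minimal sketch (in Python, with hypothetical names — `EdgeNode`, `OVERHEAT_THRESHOLD` and the rest are illustrative assumptions, not any vendor's API) of the division of labor fog implies: the time-critical control decision stays on-site, while readings are buffered for best-effort upload to the cloud, so an internet or cloud outage never interrupts the floor.

```python
from collections import deque

# Illustrative only: an edge node keeps the control loop local, so a
# cloud outage never interrupts the facility. Readings are acted on
# immediately and queued for best-effort upload to the cloud.

OVERHEAT_THRESHOLD = 90.0  # assumed limit, degrees C


class EdgeNode:
    def __init__(self):
        self.upload_queue = deque()  # readings buffered for the cloud
        self.shutdowns = 0

    def on_sensor_reading(self, machine_id, temperature):
        # Time-critical decision happens here, on-site -- no round trip.
        if temperature > OVERHEAT_THRESHOLD:
            self.shut_down(machine_id)
        # Non-time-critical: queue for the cloud dashboard and analytics.
        self.upload_queue.append((machine_id, temperature))

    def shut_down(self, machine_id):
        self.shutdowns += 1  # stand-in for a real actuator command

    def flush_to_cloud(self, cloud_is_reachable):
        # Upload is opportunistic; an outage just grows the local buffer.
        if not cloud_is_reachable:
            return 0
        sent = len(self.upload_queue)
        self.upload_queue.clear()
        return sent


node = EdgeNode()
node.on_sensor_reading("press-1", 75.2)        # normal: logged only
node.on_sensor_reading("press-2", 94.8)        # over limit: local shutdown
node.flush_to_cloud(cloud_is_reachable=False)  # outage: data stays buffered
```

The design choice is the point: the `if temperature > OVERHEAT_THRESHOLD` branch never waits on the network, and the cloud sees the data whenever connectivity allows.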
Similarly, data from multiple independent facilities can be combined and analyzed at any time in the cloud. What is learned can then be applied downward to the computing devices in the individual factories. In this way the cloud is used in the broad collection and analysis of information and for the propagation (but not enforcement) of the rules that on-site computers use to make decisions.
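That "propagation, not enforcement" split can be sketched as well. In this hypothetical example (the plant names, the simple mean-based threshold, and `derive_threshold`/`propagate` are all assumptions for illustration), the cloud pools readings from independent facilities, derives an updated rule, and pushes a copy down to each site; the edge nodes then apply it locally, with or without the cloud.

```python
from statistics import mean

# Illustrative only: the cloud combines data from several independent
# facilities, derives an updated rule (here, a simple anomaly threshold),
# and propagates it down to each site's edge nodes. Each node keeps its
# own copy and enforces the rule locally.


def derive_threshold(all_facility_readings, margin=1.25):
    # Pool data across every site; flag values well above the fleet mean.
    pooled = [r for site in all_facility_readings.values() for r in site]
    return mean(pooled) * margin


def propagate(threshold, edge_nodes):
    # Propagation, not enforcement: each edge node stores the rule and
    # keeps deciding locally, even if the cloud later goes offline.
    for node in edge_nodes:
        node["threshold"] = threshold


readings = {
    "plant-a": [70.0, 72.0, 71.0],
    "plant-b": [69.0, 68.0, 70.0],
}
edges = [{"site": "plant-a"}, {"site": "plant-b"}]

new_threshold = derive_threshold(readings)  # learned across facilities
propagate(new_threshold, edges)             # applied downward to each site
```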
When and why will this matter?
This matters now. The IoT revolution is coming in hot, and it’s going to need a new way of doing things. Similarly, in high-bandwidth applications — think super-high-definition security cameras across a city — uploading all of that data to the cloud for processing is simply untenable, and fog will be there, too.
We’re in a data-rich world, with more than 30 billion (billion!) sensors coming online by 2020. With this will come a new set of challenges and a new set of opportunities. Huge bandwidth and latency issues? Sure. But consider the laptops, workstations and even the smartphones that surround us in our daily lives. Can we not put these to use when we’re not using them?
This is the new reality that is approaching fast. It will rest on the shoulders of cloud computing, but we are squarely entering the era of fog. Stay tuned.