The cloud computing market is about to get a lot more competitive

There is good reason to believe the future of computing looks nothing like the present

The near monopoly a few companies have on hosting the internet may appear to be irreversible, but powerful economic and technological forces are leading us towards an internet no longer consolidated around datacenters. Instead, the software services you rely on will be deployed on computers that can be anywhere from your pocket to a sleeping server room in a Fortune 500 company.

Why should you care about diversifying where your computing power comes from?

The public cloud is primarily owned by four tech conglomerates. Amazon holds 40 percent of the market and sits at No. 12 on the Fortune 500; Google, Microsoft, and IBM hold a combined 23 percent and rank Nos. 27, 28, and 32.

Amazon alone owns 40 percent of the market, and the top four providers, all of whom are among the largest and most diversified companies on earth, own 63 percent. That's 63 percent of a $195 billion industry, which is expected to more than double in three years: a large slice of an enormous pie.

The problem with cloud computing market consolidation is that it’s not a siloed industry; nearly every industry in the world, be it B2B or consumer-facing, relies on computing power in some way. It’s not an exaggeration to say cloud companies have the ability to cripple entire economies with the flip of a switch — be it accidental or malicious.

Already, repeated failures in a cluster of Amazon-run datacenters in Northern Virginia have led to widespread internet outages. You probably felt the effects of an early 2017 outage that affected everything from Adobe's cloud services and GitHub to Medium itself, among many, many more. The effects of the outage ranged from inconvenience to financial loss (and inspired some quality Twitter commentary), but perhaps the most alarming were seen in internet of things (IoT) devices, which depend on cloud access to make decisions and act in the real world.

Nest security cameras lost several hours of footage, smart-bulb users spent hours waiting for lights to come back on, homeowners with connected gates were locked out, and a host of other connected devices became paperweights during the AWS outage. As the number of things connected to the internet skyrockets and starts to include self-driving cars and connected medical devices, it's easy to see why a cloud-only model is not a suitable computing infrastructure.

Single points of failure are also single points of control

Single points of failure aren't only worrying from a technical perspective; they also serve as single points of control. Compute is now a fundamental cost of doing business, and it has given Amazon and the other cloud providers a reach into nearly every industry in the world.

With only each other to compete with on pricing, cloud providers wield enormous influence over businesses' bottom lines. Even proximity to a cloud datacenter can help or hinder entire business models: an organization trying to start up where cloud infrastructure is lacking may face debilitating latency issues that hamstring it before it begins.

Especially for those in some of tech's most exciting sectors, like the internet of things, virtual reality, artificial intelligence, high-tech manufacturing, and health care, access to affordable processing power may be the single most important indicator of success. If centralized cloud computing continues to reign, a handful of cloud providers will have the power to dictate tomorrow's technological landscape.

Fortunately, that same unprecedented demand for low-latency, affordable processing power is fueling the movement towards a more distributed computing infrastructure.

Cloud deconsolidation has begun out of necessity

The IoT has already introduced a slew of web-connected appliances, and many more are expected to come: by 2020, an expected 30 billion connected devices will flood the market. Traffic from those devices threatens to overwhelm the datacenters owned by the top cloud providers. Even the Earth itself might not sustain this growth; there may not be enough glass for the fiber-optic cable required to get all that IoT data to the cloud for processing.

And while that sounds like a gloomy forecast, the market for rented computing power is far from doomed. Rather, the IoT's need for low-latency processing has forced us to find ways to leverage resources located outside the cloud, on computers closer to the devices that need them. It's an infrastructure that Cisco Systems calls fog computing.

Billions of computers owned by organizations and individuals across the globe are perfectly capable of crunching IoT data, and many of them sit idle much of the day. With software that enables them to run the same services you now run in the cloud, you can divide workloads across participating computers in your offices, homes, and datacenters, just as cloud providers divide work across the computers they own.

Fog computing offers a continuum of computing power spanning the distance between a device and the cloud, so instead of traveling thousands of miles to a centralized datacenter, a smart thermometer could find the processing power it needs on a participating computer in the next room. The same regional datacenter failure that downed security cameras and smart bulbs dependent on AWS would be inconsequential in a fog architecture, which has a near-infinite supply of failover nodes across the globe.
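
To make that idea concrete, here is a minimal sketch, in Python, of how a device might choose where to send work under this model. The hostnames and the selection logic are my own assumptions for illustration, not any vendor's fog platform: the device probes the candidate machines it knows about, prefers the lowest-latency one that answers, and falls back to a distant cloud region only if nothing closer responds.

```python
import socket
import time

# Hypothetical endpoints: a machine in the next room, a server closet down
# the hall, and a distant cloud region kept as the last-resort failover.
CANDIDATES = [
    "fog-node.local",
    "fog-gateway.office.example",
    "cloud.us-east-1.example",
]


def probe(host, port=443, timeout=0.5):
    """Return the TCP connect time in seconds, or None if the host is unreachable."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None


def pick_compute_node(candidates):
    """Choose the reachable candidate with the lowest connect latency."""
    measured = [(latency, host) for host in candidates
                if (latency := probe(host)) is not None]
    return min(measured)[1] if measured else None


if __name__ == "__main__":
    node = pick_compute_node(CANDIDATES)
    print(f"dispatching workload to: {node or 'no node reachable'}")
```

The point of the sketch is the ordering: the cloud region is just one more candidate at the far end of the continuum, not a single point of failure.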

Of course, the sharing of computing resources is a utopian ideal without a means of incentivizing participation: just as Amazon charges software developers to run services on its computers in the cloud, computer owners must be compensated for the resources they contribute if a fog computing infrastructure is to work in the real world.

Fog computing requires the commoditization of computing resources

I see the future of compute as a global marketplace, where anyone with resources to spare can rent out processing power for profit, and software developers can draw on a more competitive, geographically diverse network of compute.
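
As a toy illustration of the matching such a marketplace implies (again a sketch of my own, not an existing service or API), a buyer could state a latency budget and take the cheapest offer of spare capacity that fits it:

```python
from dataclasses import dataclass


@dataclass
class Offer:
    owner: str             # whoever is renting out a spare machine
    price_per_hour: float  # asking price in dollars
    latency_ms: float      # measured latency from the buyer's device


def match(offers, max_latency_ms):
    """Return the cheapest offer that fits the buyer's latency budget, if any."""
    eligible = [o for o in offers if o.latency_ms <= max_latency_ms]
    return min(eligible, key=lambda o: o.price_per_hour, default=None)


offers = [
    Offer("idle office workstation", price_per_hour=0.02, latency_ms=4),
    Offer("neighbor's gaming PC", price_per_hour=0.01, latency_ms=9),
    Offer("hyperscale cloud region", price_per_hour=0.05, latency_ms=45),
]

# A latency-sensitive IoT job with a 10 ms budget lands on the neighbor's PC.
print(match(offers, max_latency_ms=10))
```

The design point is that the cloud becomes one seller among many rather than the default destination, and it has to compete on price and proximity like everyone else.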

With the burden (and profits) of powering the internet distributed across an infinitely broader spectrum of computers, downtime could become a thing of the past, and the unprecedented demand for processing power we're experiencing would turn into an earning opportunity for computer owners everywhere.

Cloud datacenters may not be going anywhere, but the rental market for computing resources is about to get a lot more diverse.
