It's not a question of if, but of when: Most enterprise computing will eventually be sucked up into the public cloud, kind of like the rapture in slow motion.
This is not exactly a radical notion, but signs of a great skyward event keep multiplying. Last week, Salesforce sent up a flare with the announcement of Wave, its new cloud analytics platform. Although Wave is far from the first public cloud analytics play -- Birst, along with such startups as Adatao, Platfora, Tidemark, and many others, got there first -- its announcement is a seminal event.
Wave's introduction is important because it sets up shop atop the repositories of customer data that Salesforce customers have already stored in the public cloud. This is a sort of triple play:
- It overcomes one of the biggest obstacles to cloud-based analytics, which is moving data from on-premises to the public cloud
- Analysis of customer data happens to be the area where so-called big data analytics are reaping their most tangible rewards
- Wave should be able to mash up all that structured Salesforce data with semi-structured Web/mobile clickstream and social data, both of which are also native to the cloud
Add a fourth win for Wave if you like: Today, large-scale analytics is a classic example of the sort of batch job suited to the public cloud. You don't want to buy a bunch of servers that will lie fallow when you're not running a job; it's far better to rent that compute and storage from a cloud service provider.
In the long run, though, this is a legacy issue. By the time the wholesale shift to the public cloud is in full swing, analytics will be so deeply embedded in everything, from supply chain optimization to predictive failure analysis, that it will all run in real time and you'll never turn it off.
Salesforce Wave is the latest sign of cloudward momentum. The real reason the public cloud will triumph is simple: The frequency of significant technology advances has accelerated to such a degree that enterprises can't be expected to keep up.
The best cloud service providers are architected from the ground up to remake themselves continuously and insulate customers from that change. Over time, as change accelerates further, only multitenanted cloud providers that have fully abstracted their services to customers will be able to take advantage of and quickly deploy the latest advances. You could argue that some large enterprises will want to "stay in the business" of IT at this level, but if they do, they will have effectively fashioned themselves into cloud service providers.
What will be left for enterprise IT to do? Silly question -- build applications, of course. That has always been IT's ultimate deliverable anyway. Already, public cloud PaaS offerings such as Microsoft Azure, Red Hat OpenShift, and Pivotal Cloud Foundry (the last now also provided by third parties such as CenturyLink) are luring developers who appreciate the benefits of automated dev, test, and deployment environments in the cloud. As more cloud APIs are exposed and integration among all sorts of clouds progresses, those PaaS environments will only get richer.
In a recent interview I did with Chris Drumgoole, chief operating officer of IT for GE, it was clear that applications are the linchpin of his cloud strategy -- which is based on a bold decision to migrate all of GE's IT to the public cloud over time. Like most IT leaders, Drumgoole understands the importance of enabling developers to deliver more and better applications that increase engagement with customers and partners, not to mention the company's own employees. But Drumgoole also sees applications as the lens through which GE came to realize that public cloud computing would ultimately cost less than maintaining its own IT infrastructure.
Cost and business dependency have provided two strong arguments against big shifts to the public cloud. Conventional wisdom goes like this: Sure, if you enlist the service of a cloud provider, you don't need to make the initial capital investment in hardware and software, but over time you'll pay as much or more. Plus, you're at the mercy of the cloud provider; if it jacks up rates or suddenly loses its ability to execute, you're out of luck.
Rather than focus on the comparative costs of systems or services, Drumgoole says, GE takes a very deep view of the cost of applications; when everything is taken into account, applications deployed in the public cloud cost less. Plus, to mitigate the business risk of depending on a single public cloud, GE deliberately spreads itself across many providers, with a deep, near-real-time view into operating costs across all of them.
GE's perspective is highly advanced. Why shouldn't it be? The company is a leader in the Internet of things, with a keen eye for the inherent value of the immense quantities of data that will stream from sensors embedded in the loads of industrial equipment GE manufactures. The company is on the leading edge of our hyperconnected future.
The more connections between objects, the more the center of gravity shifts toward the cloud, because the data is already streaming across the Internet. In our data-intensive future, I can't imagine the overhead of endlessly syncing and securing countless locally maintained data stores. Most of the major cloud providers already have better security than you do -- and will improve those defenses faster than you can -- ultimately providing a safer home for data critical to your business.
As Salesforce reminded us last week, lots of important data is already "up there" in the cloud, which will spawn all kinds of new connected cloud applications. Granted, many obstacles to critical data ascending to the public cloud persist, not the least of which are government regulations -- and legacy systems that work perfectly well need not go anywhere. But heed the signs. Year by year, the rumblings of a mass skyward migration are growing louder.