How cloud-native technologies defeat cloud lock-in

Cloud-native applications are characterized by their ability to run in any cloud—and to be easily moved between clouds

Cloud-native computing is about how you build applications, not where you build them. That means a global enterprise can run cloud-native applications in its own data center just as well as in the public cloud. Kubernetes is one of the key underlying technologies for this model, which explains its meteoric rise over the past few years.

Kubernetes lets global IT teams build and run applications faster by automating low-value operations tasks, so that teams can focus on adding business value. Part of the core value of Kubernetes is its flexibility to run anywhere—in any data center or any cloud.

I read an article recently that describes the danger of having a cloud-native IT policy. The piece argues that “cloud-native means lock-in” and asserts that “you’re all in with a particular public cloud provider, the single provider of those cloud-native services, with the goal of making the most from your cloud computing investment.” This doesn’t align with my experience working with large enterprises that are deploying cloud-native technologies, which are typically open-source technologies (like Kubernetes). In fact, I believe adopting cloud-native practices is the single best way to avoid vendor lock-in.

This may simply be a case of definitional misalignment rather than structural disagreement. The Cloud Native Computing Foundation defines cloud native as “technologies [that] empower organizations to build and run scalable applications in modern, dynamic environments such as public, private, and hybrid clouds.” (See also the CNCF FAQ.) The ability to build applications that can be deployed across multiple cloud environments is core to the cloud-native proposition. By designing applications to run in any environment, you protect yourself from vendors who would use lock-in to raise prices and reduce service.

Cloud-native applications like the ones that run on Kubernetes are easy to run in multiple environments for several reasons:

  • Cloud-native applications are packaged in Linux containers, which run unmodified across environments more readily than heavier packaging technologies such as virtual machines.
  • Every major cloud provider offers a managed Kubernetes service, so apps packaged for Kubernetes can move between clouds with little to no modification, giving enterprises an easy migration path.
  • Thanks to community advances in open storage interfaces for Kubernetes, such as the Container Storage Interface (CSI), it is now possible to run data services directly on Kubernetes, making data as portable as the containers themselves and removing an important source of lock-in.
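To make the portability claim concrete, here is a minimal sketch of a Kubernetes manifest for a hypothetical app (the names `web`, `web-data`, and the image are illustrative). Because it uses only standard Kubernetes API objects and no provider-specific fields, the same YAML can be applied unchanged to any conformant cluster, on-premises or in any public cloud:

```yaml
# Hypothetical portable application: no cloud-specific fields,
# so the same manifest works on any conformant Kubernetes cluster.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: registry.example.com/web:1.0   # illustrative image name
          volumeMounts:
            - name: data
              mountPath: /var/lib/data
      volumes:
        - name: data
          persistentVolumeClaim:
            claimName: web-data
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: web-data
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 10Gi
  # Omitting storageClassName uses the cluster's default storage class,
  # so the underlying CSI driver provisions the appropriate disk type
  # for whichever cloud or data center the cluster runs in.
```

The key design point is the storage abstraction: the app asks for 10Gi of storage through a PersistentVolumeClaim rather than naming a specific cloud disk product, which is what lets the data layer move along with the containers.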

A recent anecdote from a customer visit illustrates this. I was meeting with a senior member of the IT staff at a global bank that has invested heavily in the public cloud. While the vast majority of the company’s workloads run on a single cloud, the company recently paid to have 300 developers certified on a competing cloud, and it is investing heavily to run its apps on Kubernetes—precisely because those apps will then be able to run across multiple clouds. “It’s like nuclear detente,” she told me. “If they know we can leave, they are a better partner, and we get better pricing and service if we stay.”

This bank is practicing cloud-native in a level-headed and balanced way. The company understands the value of the public cloud, but it is building its applications in such a way that it can move them to other cloud providers if it needs to.

I think some in the industry equate “cloud-native” with “cloud-specific services” such as serverless technologies and managed data services. I agree that adopting serverless and managed data services can result in lock-in. Being locked into proprietary services and data formats prevents applications from being easily moved between clouds. But to the extent that enterprises are using cloud-native technologies like Kubernetes to make migrations easy, I see cloud-native as the best way to defeat cloud lock-in, not as a cause of it.

Murli Thirumale is co-founder and CEO of Portworx, provider of cloud-native storage and data management solutions for Kubernetes. 

New Tech Forum provides a venue to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all inquiries to

Copyright © 2019 IDG Communications, Inc.