As part of the normal cycle of things, our most recent boom in enterprise technology development has slowed, which always leaves the industry breathless about whatever’s left that’s actually new. Witness, for example, the current mania over AI and machine learning.
I’ve had my fill of AI-washing, so the most interesting new area to me today is serverless computing, which hit the radar a couple of years ago when Amazon introduced AWS Lambda. The basic idea is that, finally, developers can build without worrying about physical or virtual servers or even containers. Instead, devs can simply assemble services from small building blocks of code called functions, and all that messy infrastructure stuff under the hood takes care of itself.
Because servers are concealed from devs rather than eliminated (which could only happen in a different universe), many prefer the term "FaaS" (functions as a service) to "serverless computing." That’s reflected in the nomenclature adopted by the AWS Lambda knockoffs now offered by the major competing clouds: Google Cloud Functions and Microsoft Azure Functions. (I’m not sure how IBM got the name for its version, OpenWhisk—a reference to whipping up applications, I guess?)
Last week saw the Serverlessconf event in Austin, where Peter Johnson, technical solutions architect at Cisco, was one of the attendees. “There’s a lot of excitement here,” he told me. “It reminds me of cloud in 2009.” According to Johnson, the main attraction of serverless computing is the following:
It’s a different way to think about your software architecture, in a way that lets you break your components down into smaller and smaller pieces. We used to think of the atomic unit as a VM—or with the microservices revolution going on right now, as something that runs in a container. This is taking that to the next logical conclusion to get even smaller. It used to be if you wanted a unit of compute it took you months to order bare metal hardware. Then, you could get VMs in minutes. Then, you could get containers in seconds. Now, you can get functions in milliseconds.
One of the beauties of this architecture is that you get charged by the cloud provider only when a service runs. You don’t need to pay for idle capacity—or even think about capacity. Basically, the runtime sits idle waiting for an event to occur, whereupon the appropriate function gets swapped into the runtime and executes. So you can build out a big, complex application without incurring charges for anything until execution occurs.
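To make that event-driven model concrete, here’s a minimal sketch in Python. The handler signature loosely follows AWS Lambda’s Python convention (an event payload plus a context object); the function name, the upload event, and the local “dispatch” at the bottom are hypothetical stand-ins for what the provider’s runtime would do for you.

```python
# Sketch of a serverless function: it sits dormant until the platform
# hands it an event, and you are billed only for the execution itself.

def handle_upload(event, context=None):
    """Invoked by the runtime when (say) a file lands in object storage."""
    bucket = event["bucket"]
    key = event["key"]
    # Real work (resizing an image, writing a record, etc.) would go here.
    return {"status": "processed", "object": f"{bucket}/{key}"}

# Simulate the provider dispatching an event to the function.
if __name__ == "__main__":
    event = {"bucket": "photos", "key": "cat.jpg"}
    print(handle_upload(event))
```

Everything outside the function body — provisioning, scaling, the wait for the event — is the provider’s problem, which is exactly the point Johnson is making.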
Another Serverlessconf attendee I spoke with was Nate Taggart, CEO of a startup called Stackery, which provides tools to manage all of the functions that make up serverless applications, so that devs can ship them to the infrastructure provider with all the dependencies packaged up. “I think any developer who plays with serverless realizes, ‘This is going to be big,’” he told me. “It returns software development to development, and not maintenance and management.”
Stackery is part of a growing serverless computing ecosystem. Although Stackery is platform-agnostic, others target the undisputed leader, AWS Lambda, exclusively. The startup Serverless, for example, offers a framework for building apps on that platform, while IOpipe has a metrics and monitoring service that provides insight into Lambda functions.
Although serverless computing appears tied to the public cloud—with huge potential for lock-in—a number of open source frameworks have already emerged. The most interesting of these is Platform 9’s Fission project, which is built on Kubernetes. Platform 9 has gone a long way toward making Kubernetes deployable by ordinary humans by offering it as a SaaS-managed solution. With Fission on top, I wouldn’t be surprised if Platform 9 gets greater notice as a private cloud player.
I also find it intriguing that, alone among the public cloud providers, IBM has broken out its serverless computing platform as an open source project. Cisco’s Peter Johnson has downloaded and experimented with Apache OpenWhisk and found it impressive.
Keep in mind, though, that these are still the very early days. According to Stackery’s Nate Taggart, developers, with rare exception, are not yet using serverless computing platforms to develop full-blown applications. “Today, serverless solves some specific challenges,” he says. “The glue code, the bits that hold everything together—that’s what we’re seeing serverless used for today.”
“It’s awfully early,” agrees Zorawar Biri Singh, former head of HP’s cloud operation and most recently Cisco’s CTO, who has recently done a deep dive into the emerging serverless market. “But there’s a huge amount of potential. If I fast-forward and look at the world five years from now, applications built on serverless architecture will have massive advantages over the conventional SaaS apps of today—their cost of development and their agility and their ability to drive costs down is going to be super appealing.”
That’s a valuable business perspective, but Johnson really brings the allure for developers to life. “Agile software development is about getting more at-bats,” he says. “It’s about how quickly you can do the cycle, because we know that a lot of our ideas are going to be bad. What we want to do is filter out the good ones from the bad ones more quickly. What serverless is really about is putting together architectures that let us get more at-bats.”