Microsoft’s adoption of serverless computing is a big piece of Azure maturing as a platform. There’s a lot going on here, as architectures and services evolve to take advantage of the unique capabilities of the cloud and we as users and developers migrate away from traditional server architectures.
Mark Russinovich, Microsoft’s CTO of Azure, has a distinct view on the evolution of cloud as a platform. “Infrastructure as a service [IaaS] is table stakes,” he said at an Azure Serverless computing event at Microsoft’s Redmond, Wash., headquarters last week. “Platform as a service [PaaS] is the next step, offering runtimes and developing on them, an API and an endpoint, where you consume services.” That’s where we are today, where we still define the resources we use when we build cloud applications.
Then comes serverless computing. “Serverless is the next generation of computing, the point of maximum value,” Russinovich said.
What he’s talking about is abstracting applications from the underlying servers, where code is event-driven and scales on demand, charged by the operation rather than by the resources used. As he said, “I don’t have to worry about the servers. The platform gives me the resources as I need them.” That’s the real definition of serverless computing: The servers and OS are still there, but as a user and a developer you don’t need to care about them.
Serverless computing is the next phase of virtualization
You can look at serverless as a logical evolution of virtualization. As the public cloud has matured, it has gone from one relatively simple type of virtual machine on one specific type of underlying hardware to specialized servers that support IaaS implementations for all kinds of use cases: at one extreme, high-performance computing servers with massive GPUs for parallel processing and numerical scientific workloads; at the other, arrays of hundreds of tiny servers powering massive web presences.
That same underlying flexibility powers the current generation of PaaS, where applications and code run independently of the underlying hardware while still requiring you to know what the underlying servers can do. To get the most out of PaaS (that is, to get the right fit for your code), you still need to choose servers and storage.
With serverless computing, you can go a step further, concentrating on only the code you’re running, knowing that it’s ephemeral and you’re using it to process and route data from one source to another application. Microsoft’s serverless implementations have an explicit lifespan, so you don’t rely on them being persistent, only on them being there when you need them. If you try to use a specific instance outside that limited life, you get an error message because the application and its hosting container will be gone.
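That explicit-lifespan behavior can be modeled in a few lines of Python. This is a toy illustration of the semantics described above, not how Azure Functions is actually implemented; the class and its lifespan parameter are invented for the sketch:

```python
import time

class EphemeralInstance:
    # Toy model of a serverless instance with an explicit lifespan:
    # it answers calls only within its lifetime, after which the
    # "application and its hosting container" are gone.
    # (Illustrative only; not an Azure Functions API.)
    def __init__(self, lifespan_seconds: float):
        self.expires_at = time.monotonic() + lifespan_seconds

    def invoke(self, payload: str) -> str:
        if time.monotonic() > self.expires_at:
            raise RuntimeError("instance and its container are gone")
        return f"processed {payload}"

instance = EphemeralInstance(lifespan_seconds=0.05)
print(instance.invoke("event-1"))  # succeeds within the lifespan
time.sleep(0.1)
# instance.invoke("event-2") would now raise RuntimeError
```

The point of the sketch is the programming model: you rely on an instance being there when an event arrives, never on it persisting afterward.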
Three serverless computing models
Nir Mashkowski, principal group manager for Azure App Service, noted three usage patterns for Azure’s serverless offerings.
The first, and most common, pattern is what he calls “brownfield” implementations. They are put together by enterprises as part of an overall cloud application strategy, using Azure Functions and Logic Apps as an integration tool, linking old apps and new and on-premises systems and cloud.
The second pattern is greenfield implementations, which are typically the province of startups, using Azure Functions as part of a back-end platform—that is, as switches and routers moving data from one part of an application to another.
The third pattern is for internet of things applications. It is a combination of the two, using Azure Functions to handle signals from devices, triggering actions in response to specific inputs.
For enterprises wanting a quick on-ramp to serverless computing, Azure Functions’ closely related sibling Logic Apps is an intriguing alternative. Drawing on the same low-code foundations as the more business-focused Flow, it gives you a visual designer with support for conditional expressions and loops. (You can even run the designer inside Visual Studio.)
Like Azure Functions, Logic Apps is event-triggered and can be used to coordinate a sequence of Azure functions. Wrapping serverless code in a workflow adds more control, especially if it’s used to apply conditions to a trigger—for example, launching one function if a trigger is at the low end of a range of values, another if it’s at the high end.
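The conditional-trigger pattern described above can be sketched in plain Python. Logic Apps expresses this branching in its visual designer rather than in code, and the handler names and thresholds here are hypothetical:

```python
# A workflow inspects a trigger value and dispatches to one of two
# functions depending on where it falls in the expected range.
# Handler names are illustrative stand-ins, not Logic Apps APIs.

def handle_low(value: float) -> str:
    # Stand-in for the function launched for low readings.
    return f"low-path processed {value}"

def handle_high(value: float) -> str:
    # Stand-in for the function launched for high readings.
    return f"high-path processed {value}"

def route_trigger(value: float, low: float = 10.0, high: float = 90.0) -> str:
    # The workflow's condition: pick a function based on the trigger value.
    if value <= low:
        return handle_low(value)
    if value >= high:
        return handle_high(value)
    return "in-range: no action triggered"

print(route_trigger(5.0))   # takes the low branch
print(route_trigger(95.0))  # takes the high branch
```

Wrapping the functions in a workflow like this keeps the branching logic out of the functions themselves, which stay small and single-purpose.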
In the cloud and on-premises: Portable serverless computing
Russinovich described three organizations working with serverless computing:
- AccuWeather uses it to handle its server logs, replicating them between datacenters and handing them off to analysis tools.
- Similarly, Plexure, a marketing company, uses it to handle feeds from point-of-sale systems, replacing a complex stack of tools with a workflow that drives information from one service to the next.
- At the other end of the scale, the Missing Children Society of Canada used Logic Apps to build a bot that could bring research about missing kids together from various sources, including social media, in a project that took a mere four days to deliver.
One of the more interesting aspects of both Azure Functions and Logic Apps is that they’re not limited to running purely in the cloud. Functions themselves can be developed and tested locally, with full support in Visual Studio, and both Azure Functions and Logic Apps will be supported by on-premises Azure Stack hybrid cloud systems.
Inside Azure’s datacenters, the serverless options are all containerized for rapid deployment. That same model will come to your own servers, with Azure Functions able to run on any host that supports containers.
Currently, Azure Functions is based on the full .Net Framework release, so there’s a minimum requirement of Windows Server Core as a host. But that’s going to change over the next few months with an open source release based on .Net Core and the upcoming .Net Standard 2.0 libraries. With those in hand, you’ll be able to run Azure Functions in containers based on Windows Server Nano, as well as on .Net Core running on Linux. You’ll be able to migrate code from on-premises to hybrid cloud and to the public cloud depending on the workload and on the billing model you choose.
Such a cross-platform serverless solution that runs locally and in the cloud starts looking very interesting, giving you the tools to build and test on-premises, then scale up to running on Azure (or even on Linux servers running on Amazon Web Services).
There’s a lot to be said for portability, and by working with REST and JSON as generic input and output bindings, Microsoft’s containerized serverless implementation appears to avoid the cloud lock-in of its AWS and Google competitors while still giving you direct links to Azure services.
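As a sketch of what generic REST and JSON bindings buy you, the function below depends only on JSON text in and JSON text out, so the same body could be fronted by Azure Functions, a container, or any other HTTP host. The event shape (a `reading` field and a `status` result) is invented for illustration, not an Azure schema:

```python
import json

def handle_event(body: str) -> str:
    # The function sees only JSON in and produces JSON out, so it
    # carries no dependency on any one cloud's trigger model.
    # The "reading"/"status" fields are illustrative only.
    event = json.loads(body)
    reading = event.get("reading", 0)
    result = {"status": "ok" if reading < 100 else "alert",
              "reading": reading}
    return json.dumps(result)

print(handle_event('{"reading": 120}'))
```

Because the binding is just REST and JSON, moving this logic between on-premises containers and the public cloud is a deployment decision rather than a rewrite.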