Serverless in the cloud: AWS vs. Google Cloud vs. Microsoft Azure

With AWS Lambda, Google Cloud Functions, and Microsoft Azure Functions, a little bit of business logic can go a very long way


You don’t need to focus on just Firebase. The more basic Google Cloud Functions is a simpler approach to embedding customized code throughout the Google cloud. At this time, Cloud Functions is largely just an option for writing Node.js code that will run in a pre-configured Node environment. While the rest of the Google Cloud Platform supports a wide variety of languages—from Java and C# to Go, Python, and PHP—Cloud Functions is strictly limited to JavaScript and Node. There have been hints that other language options are coming and I wouldn’t be surprised if they appear soon.
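To give a sense of how little scaffolding there is, here is a minimal sketch of an HTTP-triggered Cloud Function; the function name and the greeting are my own placeholders, and you deploy it with the gcloud command-line tool.

    // index.js -- a minimal HTTP-triggered Google Cloud Function (Node.js)
    // Cloud Functions hands you Express-style request and response objects.
    // The exported name and the message are placeholders for this sketch.
    exports.helloSensor = (req, res) => {
      const name = req.query.name || 'world';
      res.status(200).send('Hello, ' + name + '. No server in sight.');
    };

    // Deployed with something along the lines of:
    //   gcloud functions deploy helloSensor --trigger-http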

Google Cloud Functions does not reach as deeply into the Google Cloud as AWS Lambda reaches into AWS, at least at this point. When I poked around looking at building a function to interact with Google Docs, I found that I would probably have to use the REST API and write the code in something called Apps Script. In other words, the Google Docs world has its own REST API that was serverless long before the buzzword was coined.
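If you're curious what that looks like, an Apps Script sketch is just JavaScript bound to Google's document services; the document ID below is a placeholder, and you'd publish the script as a web app to give it an HTTP endpoint.

    // Apps Script: append a note to a Google Doc whenever the script's URL receives a POST.
    // The document ID is a placeholder; DocumentApp and ContentService are Apps Script's
    // built-in services, not Cloud Functions APIs.
    function doPost(e) {
      var doc = DocumentApp.openById('YOUR_DOC_ID');
      var note = (e && e.postData) ? e.postData.contents : 'no payload';
      doc.getBody().appendParagraph('Logged: ' + note);
      return ContentService.createTextOutput('ok');
    }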

It’s worth noting that Google App Engine keeps going strong. In the beginning, it just offered to spin up Python applications to meet the demand of anyone coming to the website, but has been extended over the years to handle many different language runtimes. Once you bundle your code into an executable, the App Engine handles the process of starting up enough nodes to handle your traffic, scaling up or down as your users send in requests.

There continue to be a few hurdles to keep in mind. As with Cloud Functions, your code must be written in a relatively stateless way, and it must finish each request in a limited amount of time. But App Engine doesn’t toss away all the scaffolding or forget everything between requests. App Engine was a big part of the serverless revolution and it remains the most accessible to those who keep one foot back in the old school method of building their own stack in Python, PHP, Java, C#, or Go.
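A minimal App Engine service is just an ordinary web server that listens on the port the platform assigns; this sketch assumes the Node.js runtime and the Express package, with an app.yaml file alongside it naming the runtime.

    // app.js -- a bare-bones App Engine service (Node.js runtime and Express assumed)
    // App Engine sets the PORT environment variable and handles the scaling for you.
    const express = require('express');
    const app = express();

    app.get('/', (req, res) => {
      res.send('Scaled up and down for you by App Engine.');
    });

    const port = process.env.PORT || 8080;
    app.listen(port, () => console.log('Listening on ' + port));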

Microsoft Azure Functions

Microsoft, of course, is working just as hard as the others to make sure that people can do all of these clever serverless things with the Azure cloud too. The company has created its own basic functions for juggling events—Azure Functions—and built some sophisticated tools that are even more accessible to semi-programmers.
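The basic JavaScript programming model will look familiar: a sketch of an HTTP-triggered Azure Function might run along these lines, with the trigger itself declared in an accompanying function.json file rather than in the code.

    // index.js -- an HTTP-triggered Azure Function in JavaScript.
    // The HTTP trigger and the output binding live in a separate function.json file.
    module.exports = async function (context, req) {
      const name = (req.query && req.query.name) || (req.body && req.body.name) || 'world';
      context.log('Processing a request for ' + name);   // written to the function's log stream
      context.res = {
        status: 200,
        body: 'Hello, ' + name + ', from Azure Functions.'
      };
    };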

Microsoft's biggest advantage may be its collection of Office applications, the former desktop executables that are slowly but surely migrating into the cloud. Indeed, one accounting of cloud revenue put Microsoft ahead of Amazon, in part by lumping some of its Office revenue into the ephemeral rubric of “cloud.”

One of the best examples from the Azure Functions documentation shows how a cloud function can be triggered when someone saves a spreadsheet to OneDrive. Suddenly the little elves in the cloud come alive and do things to the spreadsheet. This is bound to be a godsend to IT shops supporting teams that love their Excel spreadsheets (or other Office docs). They can write Azure Functions to do practically anything. We often think that HTML and the web are the only interface to the cloud, but there's no reason it can't be through document formats like Microsoft Word or Excel.
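I won't reproduce the documentation's example here, but the shape is easy to imagine: the binding that watches OneDrive lives in the function's configuration, and the code just receives the file. Treat the binding and the parameter name in this sketch as stand-ins, not the real Microsoft Graph binding syntax.

    // index.js -- a sketch only. Assume function.json wires a hypothetical OneDrive/Excel
    // binding to the "workbook" parameter whenever someone saves a spreadsheet.
    module.exports = async function (context, workbook) {
      // "workbook" stands in for whatever the real binding delivers (rows, a stream, etc.).
      const rowCount = Array.isArray(workbook) ? workbook.length : 0;
      context.log('Spreadsheet saved; saw ' + rowCount + ' rows.');
      // From here the little elves could validate cells, post totals to a database, and so on.
    };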

Azure’s Logic Apps caught my eye as one of the tools that lets you fill out forms instead of worrying about semantics and syntax. You still need to think like a programmer and make smart decisions about abstractions and data, but you might convince yourself that you’re not writing “code” as much as filling out forms.

[Screenshot: the Microsoft Azure Functions web IDE. Credit: IDG]

Microsoft Azure’s web IDE lets you write your Azure function, run it, and debug it by inserting logging calls.

Like Amazon's Step Functions, Logic Apps is meant to encode “workflows,” a buzzword for something slightly more complex than the average “function” that gets tossed around, thanks to the availability of some state. You can still write logic that links various functions and connectors in a flowchart-like way, but you don't spell it out as much in an official computer language.

The big advantage of Logic Apps is the pre-built “connectors” that drill down into some of the bigger Microsoft and third-party apps out there. You can effectively push or pull data between your Logic Apps and the likes of Salesforce, Twitter, and Office 365. These connections will be very valuable to company IT folks who can now link together outside tools by writing Logic Apps, just as they created shell scripts in the past.
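Logic Apps themselves are assembled in the designer rather than written out, but a common pattern is to start a workflow with an HTTP request trigger and call it from whatever code you already have; the callback URL below is a placeholder for the one Azure generates for you.

    // Kick off a Logic App whose first step is an HTTP request trigger.
    // The URL, including its signature query string, is generated by Azure; this one is fake.
    const https = require('https');

    const payload = JSON.stringify({ account: 'ACME-001', action: 'sync-to-salesforce' });
    const req = https.request(
      'https://prod-00.westus.logic.azure.com/workflows/PLACEHOLDER/triggers/manual/paths/invoke?sig=PLACEHOLDER',
      { method: 'POST', headers: { 'Content-Type': 'application/json' } },
      (res) => console.log('Logic App accepted the run: ' + res.statusCode)
    );
    req.write(payload);
    req.end();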

Another intriguing corner of Azure is Azure Cosmos DB, a database that is both NoSQL and SQL at the same time. Microsoft has duplicated the APIs for Cassandra and MongoDB so you can push information in and out without rewriting your Cassandra or MongoDB code. Or if you want to write SQL, you can do that too. Cosmos DB keeps things straight and builds out indexes for everything to keep it running quickly. This makes it an intriguing central nexus if you've got lots of SQL and NoSQL code that you want to make work together. Or maybe you just want to leave the door open to different approaches in the future.
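The MongoDB compatibility is the easiest part to demonstrate: point the standard Node.js driver at the connection string from the Azure portal and your existing code carries on as before. The connection string here is obviously a placeholder.

    // Reuse ordinary MongoDB driver code against Cosmos DB's MongoDB-compatible endpoint.
    // The connection string is a placeholder for the one shown in the Azure portal.
    const { MongoClient } = require('mongodb');

    async function main() {
      const client = await MongoClient.connect('mongodb://ACCOUNT:KEY@ACCOUNT.documents.azure.com:10255/?ssl=true');
      const orders = client.db('shop').collection('orders');
      await orders.insertOne({ sku: 'widget-7', qty: 3 });
      console.log(await orders.countDocuments());
      await client.close();
    }

    main().catch(console.error);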

Serverless cloud comparison

Which serverless platform is right for you? Writing basic functions is pretty much the same in all three silos, but there are differences. The most obvious may be the available languages, because each vendor plays favorites once it's done supporting Node.js and JavaScript. It's not surprising that you can write C# for Microsoft's Azure, but its support for F# and TypeScript is unique. Amazon embraces Java, C#, and Python. Google is strictly limited to JavaScript for its basic functions for now, although it supports many more languages in App Engine.

The hardest part of comparing the serverless clouds is getting a handle on price and speed because so much more is hidden under the hood. I often felt like a crazy spender when I was spinning up VM instances that were priced in pennies per hour. Now the providers are slicing the salami so thinly that you can get hundreds of thousands of function invocations for less than a dollar. You’ll be tossing around the word “million” like Dr. Evil in the “Austin Powers” movies.

Of course, this apparent cheapness soon bamboozles the rational, budget-conscious part of our brain, just like when we’re on vacation in a strange country with wildly different denominations of currency. Soon you’ll be ordering up another million database calls, just like that time you bought the bar in Cancun a round of drinks because you couldn’t divide fast enough to figure out what it really cost.
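A back-of-the-envelope sketch shows why the numbers feel unreal. The rates below are round assumptions for illustration only, not anyone's current price sheet.

    // Rough monthly cost for a hypothetical function, using assumed list prices:
    // $0.20 per million requests and $0.0000166667 per GB-second of compute.
    const invocationsPerMonth = 3000000;
    const avgSeconds = 0.2;      // average run time per invocation
    const memoryGB = 0.128;      // 128 MB allocated

    const requestCost = (invocationsPerMonth / 1000000) * 0.20;
    const computeCost = invocationsPerMonth * avgSeconds * memoryGB * 0.0000166667;

    console.log('~$' + (requestCost + computeCost).toFixed(2) + ' per month, before any free tier');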

When the cloud was selling you a raw virtual machine, you could guesstimate what you were getting by looking at the amount of RAM and CPU power, but in the world of serverless you've got no real clue what's going on.

It’s worth noting that the serverless model pretty much forces you to stash data in the local cloud database because you’re not really allowed to keep any state with your code. You’ve got to trust these back ends. Your function must run without any local caches or configuration because other versions are always being created and destroyed. So the database glue code fills up your code like those vines in the Upside Down in “Stranger Things.”
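In practice that means any in-memory cache is a gamble and anything you need next time goes straight back to a managed store. A sketch, with the database client stubbed out so it stands in for whichever back end you've picked:

    // Anything stored at module scope may vanish when this instance is recycled,
    // so durable state belongs in a managed store. "db" is a stand-in, stubbed so the sketch runs.
    const db = {
      loadConfig: async () => ({ region: 'us-central1' }),
      saveEvent: async (e) => console.log('persisted', e),
    };

    let warmCache = null;   // survives only as long as this particular instance does

    exports.handler = async (event) => {
      if (!warmCache) {
        warmCache = await db.loadConfig();   // re-fetched on every cold start
      }
      await db.saveEvent(event);             // the durable record lives in the cloud database
      return { cachedConfig: Boolean(warmCache) };
    };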

The only real way to compare costs is to build out your app on all of the platforms, a daunting challenge. It’s possible to move some of the code between the three because they all run Node.js, but even then you’ll encounter differences that you just need to live with. (For instance, you handle HTTP requests directly in Microsoft and Google, but through the API Gateway in AWS.)
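The differences show up right in the handler signatures. Side by side, with the function names and response text as placeholders (each would live in its own project):

    // Google Cloud Functions: Express-style request and response objects.
    exports.hello = (req, res) => res.send('hi from Google');

    // Azure Functions: a context object plus the request; the response is set on the context.
    module.exports = async (context, req) => { context.res = { body: 'hi from Azure' }; };

    // AWS Lambda behind API Gateway: an event describing the HTTP request,
    // and a response object shaped the way API Gateway expects it.
    exports.handler = async (event) => ({ statusCode: 200, body: 'hi from AWS' });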

The good news is that you don't need to be so paranoid. In my experiments, many basic apps use next to no resources, and you can go a long way on the free tiers all three offer to lure in poor developers. The serverless model really is saving us a bundle on overhead. Unless you're the type that ran your servers at close to full load all of the time and got free air conditioning, it's likely you'll end up saving some big money by moving to a serverless approach. You'll be saving so much money that you won't want to argue whether it's $1 per million invocations or $1.50.

There is a deeper problem. If you ever get fed up enough with any of these clouds, you’re pretty much stuck. It’s not like it’s easy to just pull your code off and run it on a commodity server somewhere else, something you might do with a Docker container filled with your own code. If you’re lucky, you can duplicate the same raw architecture and basic JavaScript functions, but after that you’ll be rewriting database glue code all over the place. All three of the companies have their own proprietary data storage layers.

It's also not clear just what happens when things go wrong. Running your own server means that, well, your boss can wring your neck when it doesn't work. It's not so clear who answers for a failure in this space. One page at Google contains this benign warning: “This is a beta release of Google Cloud Functions. This API might be changed in backward-incompatible ways and is not subject to any SLA or deprecation policy.”

Amazon's terms of service have gotten better than they were when the company first entered the space, but they still include warnings worth keeping in mind, like, “We may delete, upon 30 days’ notice to you and without liability of any kind, any of Your Content uploaded to AWS Lambda if it has not been run for more than three (3) months.” Make sure your code runs if you want to keep it around. Warnings like this are certainly fair (I know that my old Lambda functions won't ever be used again), but they show how you're surrendering some control.

Microsoft offers a service level agreement for Azure services that promises financial compensation for downtime via service credits. Will these apply when your functions go down? Perhaps—as long as you don’t wander into some beta area of the service. It’s worth spending a bit of time paying attention to these details if you’re going to be building something more mission critical than a chat room for kids.

In most cases, the real comparison you’ll want to do is between the other features and services of the Amazon, Google, and Microsoft clouds, not the function layer. The ability to trigger Azure Functions with Office files on OneDrive will be a big attraction if you support people who love their Office applications. Google Firebase makes it easy to use functions to provide supporting services like messaging and authentication to web apps. AWS Lambda taps into so many Amazon services, it seems the sky really is the limit.

It is technically possible to mix and match all of these clouds and functions because they all speak the same PUT and GET language of HTTP API calls. There is no reason you couldn’t whip together an app filled with microservices that mix the best of the three clouds. But you’ll end up with greater latency as the packets leave the local clouds and travel the wilderness of the open Internet. And then there will be slight differences in parsing and structure that make it simpler to just sit in one company’s warm embrace.
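Mechanically there's nothing stopping you. A Lambda-style function calling an Azure Function over plain HTTPS is only a few lines; the endpoint below is a placeholder, and the sketch assumes a Node runtime new enough to ship a global fetch. It's the extra hop across the open Internet that costs you.

    // A function in one cloud calling a function in another over plain HTTPS.
    // The endpoint is a placeholder; assumes a Node runtime with a built-in fetch.
    exports.handler = async () => {
      const res = await fetch('https://example-app.azurewebsites.net/api/score', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ item: 'widget-7' })
      });
      return { statusCode: 200, body: await res.text() };   // latency includes the cross-cloud hop
    };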

So it probably makes sense to stay in the safe confines of a single cloud, at least when it comes to apps that are fairly interconnected. Do you really like Google Maps and want to use it for your project? Then you might as well use Google Cloud Functions, even if in your heart of hearts you'd rather use F# with Microsoft's Azure Functions. The same goes for Amazon's voice recognition, Google's image analysis API, or any of the dozens of different services and machine learning APIs. The functions aren't so important—it's what they link together that really matters.
