This ends up being much more work than just writing a bit of code for AWS Lambda, IBM Cloud Functions, or the Microsoft Azure and Google Cloud equivalents. When all of the planets align in that world, the cloud’s data storage and API services offer all of the persistence and analysis you need. You really do just write a few simple functions to encode a bit of business logic and you’re done. The rest of the cloud is there to do the work.
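That “bit of business logic” really can be this small. Here is a minimal sketch of an AWS Lambda-style handler in Python; the order payload and the totaling logic are hypothetical, stand-ins for whatever scrap of logic you would wedge between the cloud’s API gateway and its data store.

```python
import json

def handler(event, context):
    """Minimal Lambda-style handler: decode a request, apply a bit
    of business logic, hand a response back to the platform."""
    # Hypothetical payload: a small order whose total we compute.
    order = json.loads(event.get("body", "{}"))
    total = sum(item["price"] * item["qty"] for item in order.get("items", []))
    return {
        "statusCode": 200,
        "body": json.dumps({"total": round(total, 2)}),
    }
```

Everything else, routing the request, storing the order, scaling the workers, is the platform’s problem, which is exactly the appeal.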
All the persistence, API integrations, and other scaffolding the clouds provide pose a real challenge for the open source projects. All of the projects do a good job of taking your function and getting it running, but they don’t handle everything else, which means you’ll have to duplicate those other features yourself. If you stomp your feet and run away from the major cloud providers, you’ll feel like the kid who runs away from home with a backpack filled with Pop Tarts and discovers that home is more than food. It’s a bed, a washing machine, a television, a bathroom, a dog, etc.
Cloud comforts
At this point, the open source projects are pretty good marketing for the clouds themselves. After unpacking these open source projects, I quickly started bumping into dozens of ways that life is harder outside of the comfortable, locked-in world of the cloud. This may change a bit as the platforms mature and the programmers push the meaning of the word “serverless.”
In the beginning, I think serverless was mainly meant to be used as a kind of shell script for the cloud, a way to add just a bit of logic to smooth the flow of data to and from the big services like IBM Cloudant or Amazon S3. Now people are embracing the idea for its simplicity and then turning around and writing incredibly complex functions inside the serverless framework. Some of the more macho programmers are taking fairly complex transcoding, machine learning, or computational jobs and shoehorning them into serverless calls that just barely finish under the time and memory limits.
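Squeezing a big job under those limits usually means checking the clock as you go. The sketch below shows the pattern on AWS Lambda, whose context object exposes `get_remaining_time_in_millis()`; the `FakeContext` class, the item list, and the `process` step are hypothetical scaffolding so the example runs on its own.

```python
import time

class FakeContext:
    """Stand-in for the Lambda context object; the real one exposes
    get_remaining_time_in_millis() with the same signature."""
    def __init__(self, budget_ms):
        self.deadline = time.monotonic() * 1000 + budget_ms

    def get_remaining_time_in_millis(self):
        return max(0, int(self.deadline - time.monotonic() * 1000))

def process(item):
    # Placeholder for the real transcoding/ML/computation step.
    return item * 2

def handler(event, context, safety_margin_ms=500):
    """Chew through work items until the time budget runs low,
    returning the leftovers so a follow-up call can continue."""
    done, remaining = [], list(event["items"])
    while remaining and context.get_remaining_time_in_millis() > safety_margin_ms:
        done.append(process(remaining.pop(0)))
    return {"done": done, "remaining": remaining}
```

Finishing “just barely under the limits” then becomes a design decision: either the whole job fits in one invocation, or you hand the `remaining` list to the next one.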
The open source toolkits do a good job with this bigger and grander vision of serverless. If you’re going to be doing almost all of the work inside your one function, well, these tools will get you up and running in no time.
But here’s the rub. The reason serverless can be cheaper for many people is because you only pay for the server when you’re running your code. Until you have a large enough workload to keep one server running at full bore, it should be much cheaper to just run your function in a cloud that will help you share the hardware with a bunch of other functions. If your function generates only 10 percent of a full server’s load, then it can be cheaper to pay 10 percent of the inflated price of a server in the fancy, overpriced cloud than 100 percent of the price of a server in your machine room while 90 percent of that server sits idle.
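The arithmetic is simple enough to sketch. The numbers below are illustrative assumptions, not quotes: the per-GB-second rate is roughly what Lambda has published, and the $150/month server is a made-up figure, so plug in your own provider’s price sheet.

```python
def monthly_cost_serverless(invocations, gb_seconds_per_call, price_per_gb_second):
    """Pay-per-use: the bill scales with compute actually consumed."""
    return invocations * gb_seconds_per_call * price_per_gb_second

def monthly_cost_dedicated(server_price_per_month):
    """Your own box costs the same whether it is 10% or 100% busy."""
    return server_price_per_month

# Illustrative, assumed numbers -- check current prices before deciding.
serverless = monthly_cost_serverless(
    invocations=1_000_000,          # calls per month
    gb_seconds_per_call=0.2,        # e.g. 128 MB for about 1.6 seconds
    price_per_gb_second=0.0000166,  # roughly Lambda's published rate
)
dedicated = monthly_cost_dedicated(server_price_per_month=150.0)
```

At that volume the pay-per-use bill comes to a few dollars against a three-figure server, and the gap only closes once your workload keeps the box genuinely busy.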
The economics of this are confusing and involve the cost of electricity, the cost of real estate, and the cost of hardware. If you’re clever, you can save bigly. These open source projects make it possible for you to consider duplicating the serverless magic in your own server farm, but they also mean that you will have to navigate all of the economic and deployment complexity yourself. The good news is that they do a pretty good job of handling the work of turning a function into a running container. If only that were the only task.