Watch out for serverless computing’s blind spot

Automatic provisioning of server resources is a great idea, but it can lead to the end of upfront resource planning—and high resource costs


Serverless computing is an exciting aspect of public cloud computing: You no longer have to provision virtual servers in the cloud; that's done automatically to meet the exact needs of your application.

Although the value of serverless computing is not in dispute, it’s my job to find potential downsides in new technologies so that my clients—and you—can avoid them. In the case of serverless computing, we may find that cloud architecture as a discipline suffers. Here’s why.

When building applications for server-oriented architectures (where the virtual servers need to be provisioned, including storage and compute), you have built-in policies around the use of resources, including the virtual server itself. After all, you have to provision servers before the workloads can access them. That means you're well aware that they're there, that they cost money, and that they're configured for your workloads.

The serverless approach means you get what you need when you need it, which exempts the cloud architect from thinking critically about the resources an application will require. There's no need for server sizing, and budgets become a gray area because resources are, in effect, only a function call away.
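To see why sizing disappears, consider a minimal handler in the style of a serverless platform (the event shape and handler signature here are illustrative, not any one provider's exact API). Nothing in the code declares servers, CPU, memory, or instance counts; the platform allocates all of that per invocation.

```python
# A minimal serverless-style handler. Note the complete absence of any
# server, CPU, memory, or scaling configuration in the code itself.
# (Handler signature and event shape are illustrative only.)

def handler(event, context=None):
    # The platform provisions compute for each call automatically;
    # the developer never sees or sizes the underlying server.
    items = event.get("items", [])
    return {"statusCode": 200, "total": sum(items)}
```

Every invocation silently draws whatever resources the platform assigns, which is exactly where cost visibility goes missing.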

The danger is that cloud architects, along with application designers and developers, are easily removed from the process of advance resource planning. As a result, applications use more resources than they should, leading to much higher costs and poor application design practices.

In other words, you’ve put yourself in a position where you don’t know what’s happening and can’t optimize for the best outcome or calculate what you’re spending. You’ve made yourself blind because the system will take care of it.

How do you get the advantages of serverless computing without falling into this blindness trap? Application designers and cloud architects need to set up best practices and guidelines around the use of serverless cloud resources.
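One concrete form such a guideline can take is a homegrown pre-deployment check. The sketch below is hypothetical (the policy limits and function settings are invented for illustration): it flags any serverless function whose declared memory or timeout exceeds a ceiling the team has agreed on.

```python
# Hypothetical pre-deployment guardrail: flag serverless functions whose
# declared settings exceed team-agreed ceilings. Limits are illustrative.

POLICY = {"max_memory_mb": 512, "max_timeout_s": 30}

def check_function(name, memory_mb, timeout_s, policy=POLICY):
    """Return a list of policy violations for one function (empty if clean)."""
    violations = []
    if memory_mb > policy["max_memory_mb"]:
        violations.append(
            f"{name}: memory {memory_mb} MB exceeds {policy['max_memory_mb']} MB cap")
    if timeout_s > policy["max_timeout_s"]:
        violations.append(
            f"{name}: timeout {timeout_s}s exceeds {policy['max_timeout_s']}s cap")
    return violations

# Example: one compliant function, one that would be flagged.
print(check_function("resize-image", memory_mb=256, timeout_s=10))   # []
print(check_function("batch-report", memory_mb=2048, timeout_s=120))
```

Running a check like this in a build pipeline puts a human decision back in front of every resource request, which is the whole point.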

Unfortunately, there is little in the way of methodology for doing that, and few tools are available right now. But you have to do what you can:

  • The first step is to understand this blindness risk.
  • The next step is to continue to do real resource planning upfront, so serverless computing’s automation won’t have to handle wasteful tasks.
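That upfront planning can start as back-of-the-envelope arithmetic before anything is deployed. In the sketch below, the per-GB-second and per-request rates are placeholders (substitute your provider's current published pricing); the point is that a function's expected monthly bill is a simple calculation over projected traffic.

```python
# Back-of-the-envelope serverless cost estimate. The rates are
# placeholders for illustration; substitute your provider's pricing.

PRICE_PER_GB_SECOND = 0.0000166667  # placeholder compute rate
PRICE_PER_REQUEST = 0.0000002       # placeholder request rate

def monthly_cost(invocations, avg_duration_ms, memory_mb):
    """Estimate a function's monthly compute + request cost in dollars."""
    gb_seconds = invocations * (avg_duration_ms / 1000.0) * (memory_mb / 1024.0)
    return gb_seconds * PRICE_PER_GB_SECOND + invocations * PRICE_PER_REQUEST

# 10 million invocations/month at 200 ms average on 512 MB:
est = monthly_cost(10_000_000, avg_duration_ms=200, memory_mb=512)
print(f"${est:,.2f}")  # → $18.67 at these placeholder rates
```

Even a crude estimate like this makes the budget visible again, and doubling the memory or duration in the model immediately shows what a sloppy design choice will cost.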