Best Practices for Serverless

In order to get the most out of serverless architecture, it is important to select the right serverless platform and adhere to best practices.

James Christopher - Product Marketing Manager
Rachel Kempf - Editor-in-Chief

In the last post in this series, we compared serverless and containers for modern applications, discussing many of the advantages of working with serverless. However, serverless is not without its challenges. In order to get the most out of serverless architecture, it is important to select the right serverless platform and adhere to best practices. This post will discuss some of the key use cases for serverless, best practices for developers, and Azion’s approach to serverless computing using Edge Functions.

Serverless traits and use cases

With a low barrier to entry, automatic scaling, pay-as-you-go pricing, efficient resource use, and managed backend services, serverless opens the door to new business models and opportunities. However, the properties of serverless functions make them particularly well-suited for small, stateless tasks. In addition, the elasticity of serverless functions provides a way to efficiently scale infrequent or irregular demand, and their low barrier to entry and fast deployment make them ideal for businesses with rapidly changing requirements and a need for accelerated development. A whitepaper by the CNCF (Cloud Native Computing Foundation) provides several examples of use cases that fit these conditions, such as:

  • Batch jobs or scheduled tasks
  • REST APIs
  • Business logic
  • Continuous integration pipelines

This is echoed in a 2020 survey by Jeremy Daly, in which developers listed all of the above among their most frequent use cases, along with the items below (a brief sketch of this kind of small, stateless function follows the list):

  • Single-page applications
  • DevOps tasks
  • 3rd party service integration
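
To make the "small, stateless task" idea concrete, here is a minimal sketch of a REST-style handler written in the service-worker style common to many edge runtimes. The event model, route, and response shape are assumptions for illustration, not any specific platform's documented API.

```typescript
// Minimal sketch of a small, stateless REST-style handler.
// Assumes a service-worker-style runtime global; declared here so the sketch is self-contained.
declare function addEventListener(
  type: "fetch",
  listener: (event: {
    request: Request;
    respondWith(response: Promise<Response> | Response): void;
  }) => void,
): void;

addEventListener("fetch", (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request: Request): Promise<Response> {
  const url = new URL(request.url);

  if (request.method === "GET" && url.pathname === "/api/health") {
    // Stateless: everything the handler needs comes from the request itself.
    return new Response(
      JSON.stringify({ status: "ok", at: new Date().toISOString() }),
      { headers: { "Content-Type": "application/json" } },
    );
  }

  return new Response("Not found", { status: 404 });
}
```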

Best practices for developers using serverless

Choose small, ephemeral tasks

One of the most frequently cited best practices is to design functions that fit the serverless model: small, single-purpose, and light on dependencies. Functions with a lot of dependencies take longer to download and instantiate, resulting in higher latency. In addition, keeping tasks small and their execution time short prevents runaway costs as functions scale. As Forrester notes in a 2019 report, “When using FaaS-based infrastructure, developers pay for invocation, duration, and memory. While invocation costs are fixed, duration and memory vary, and as they increase, they tend to dwarf the function invocation cost. By designing small services that focus on one task, developers can keep their costs in line.”
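
As a rough, back-of-the-envelope illustration of why duration and memory tend to dominate, the sketch below models FaaS cost with purely hypothetical prices (not any provider's actual rates):

```typescript
// Back-of-the-envelope FaaS cost model. The prices below are hypothetical,
// illustrative numbers, not any provider's published rates.
const PRICE_PER_MILLION_INVOCATIONS = 0.2; // USD, hypothetical
const PRICE_PER_GB_SECOND = 0.0000167;     // USD, hypothetical

function monthlyCost(invocations: number, avgDurationMs: number, memoryGb: number): number {
  const invocationCost = (invocations / 1_000_000) * PRICE_PER_MILLION_INVOCATIONS;
  const gbSeconds = invocations * (avgDurationMs / 1000) * memoryGb;
  return invocationCost + gbSeconds * PRICE_PER_GB_SECOND;
}

// 10 million invocations: a 100 ms / 128 MB task vs. a 2 s / 1 GB task.
console.log(monthlyCost(10_000_000, 100, 0.125).toFixed(2)); // "4.09"   -- invocation and duration comparable
console.log(monthlyCost(10_000_000, 2000, 1).toFixed(2));    // "336.00" -- duration and memory dwarf invocation cost
```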

Design for auto-scaling

Another way to control cost and maintain performance is to keep auto-scaling in mind at the design stage. Choosing tasks with few interdependencies makes it easier for them to execute in parallel, while an asynchronous design can cost less than a model-view-controller approach. As Forrester notes, “Chaining together functions is far more cost-efficient than implementing controllers as a function waiting for states to change or events to process.” At Azion, we have made it easy to program asynchronous functions through the use of async/await with promises.
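
As a minimal sketch of that asynchronous style (the endpoints below are hypothetical placeholders, not a prescribed Azion API), independent calls can be awaited in parallel with Promise.all rather than a controller waiting on each in turn:

```typescript
// Minimal sketch: run independent lookups in parallel with async/await and promises.
// The URLs are hypothetical placeholders.
async function enrichOrder(orderId: string): Promise<Response> {
  // Both requests start immediately and resolve concurrently.
  const [customerRes, inventoryRes] = await Promise.all([
    fetch(`https://api.example.com/customers?order=${orderId}`),
    fetch(`https://api.example.com/inventory?order=${orderId}`),
  ]);

  const body = {
    customer: await customerRes.json(),
    inventory: await inventoryRes.json(),
  };

  return new Response(JSON.stringify(body), {
    headers: { "Content-Type": "application/json" },
  });
}
```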

Leverage third-party services

Third-party services and serverless are a natural fit, since API calls are how serverless functions communicate with each other. As a result, developers can easily connect functions to third-party services and outsource tasks that aren’t unique to their mission. This reduces custom code, which can help companies get to market faster. Azion’s newly launched Azion Marketplace makes it easy for developers to find and deploy third-party services with just a few clicks.
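
As a hedged illustration (the payment endpoint, payload shape, and API-key handling below are hypothetical, not a specific marketplace integration), delegating a task to a third-party service from a function is typically just an authenticated fetch call:

```typescript
// Hypothetical example of delegating work to a third-party API from a function.
// The endpoint and payload shape are placeholders.
interface ChargeRequest {
  amountCents: number;
  currency: string;
  customerId: string;
}

async function createCharge(charge: ChargeRequest, apiKey: string): Promise<unknown> {
  const response = await fetch("https://api.payments.example.com/v1/charges", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(charge),
  });

  if (!response.ok) {
    throw new Error(`Payment service returned ${response.status}`);
  }
  return response.json();
}
```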

Secure your functions

One of the advantages of serverless is the shared responsibility model for securing applications. With serverless applications, the serverless provider handles backend security, dramatically reducing the attack vectors that need to be secured by the developer. However, for too many developers, this constitutes all or most of their security posture. When creating serverless applications, it's essential that developers cover their side of the model, including protecting user data, authenticating users, and monitoring usage. In a 2020 post on serverless security, Jeremy Daly provides a number of best practices for securing serverless applications (a brief sketch of two of them follows the list), such as:

  • applying least-privilege permissions
  • evaluating and securing third-party packages
  • encrypting sensitive data (and preventing its exposure through logs and alerts)
  • implementing a system for logging and monitoring
  • applying best practices to secure access keys and login credentials
  • guarding against SQL injection, XSS, and DDoS attacks
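
As a minimal, hedged sketch of two of these practices, the snippet below validates untrusted input before it reaches a query and redacts sensitive fields before logging; the patterns are generic, not a prescribed Azion API:

```typescript
// Minimal sketch of two practices from the list above: validating untrusted input
// and redacting sensitive fields before logging. Generic patterns, not a specific API.
const USER_ID_PATTERN = /^[A-Za-z0-9_-]{1,64}$/;

function parseUserId(raw: string | null): string {
  // Reject anything outside a strict allow-list before it reaches a database
  // query or downstream service (helps guard against injection).
  if (raw === null || !USER_ID_PATTERN.test(raw)) {
    throw new Error("Invalid user id");
  }
  return raw;
}

function redactForLogging(payload: Record<string, unknown>): Record<string, unknown> {
  // Never let credentials or tokens leak into logs and alerts.
  const sensitiveKeys = ["password", "apiKey", "authorization", "token"];
  return Object.fromEntries(
    Object.entries(payload).map(([key, value]) =>
      sensitiveKeys.includes(key) ? [key, "[REDACTED]"] : [key, value],
    ),
  );
}
```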

Choosing a serverless platform

Meeting serverless challenges

In spite of the advantages of using serverless, Daly’s 2020 serverless survey also listed key pain points among developers. These pain points are often cited as challenges of serverless development, and include:

  • Debugging
  • Monitoring
  • Performance
  • Vendor lock-in
  • Security

While some of these issues can be addressed through the best practices stated above, mitigating others may require innovation on the part of the serverless provider. As such, knowing providers’ approach to these challenges is key to a successful implementation of serverless functions.

Azion's approach to serverless computing

At Azion, we’ve developed our serverless compute service, Edge Functions, specifically to avoid the pain points of serverless customers. As such, we’d like to share our unique approach to debugging, monitoring, performance, security, and vendor lock-in.

Debugging and monitoring

As noted in The New Stack's Guide to Serverless Technologies, debugging and monitoring are complicated in serverless because much of the infrastructure is abstracted; as a result, “There is no single, standard metric that developers can rely upon to tell them whether their code is working or not.” In addition, serverless systems are distributed and functions are ephemeral, making logging not only harder to implement but also more important. Yet determining when (and how) applications fail is only the tip of the iceberg. Monitoring is not just needed to keep apps up and running; it also provides information about an application's success, such as page views, ad clicks, sales conversions, and other key performance indicators.

Azion’s Edge Functions provides both fine-grained and big picture analytics to assist with debugging and monitoring. Developers can set up logging through the Fetch API and view real-time information about how their apps are performing through Real Time Metrics.

Performance

One key concern with serverless is a performance issue known as a cold start: a lag of about half a second that occurs when starting up a function that hasn’t been called in a while. Cold starts occur when serverless providers run functions in containers; to conserve resources, the provider spins down containers after a period of inactivity (usually around 30 minutes) and spins them back up when the function is started again. Because cold starts lead to unpredictable performance and high latency, performance is often listed as a drawback of serverless, and many lists of serverless best practices suggest avoiding serverless for services that require low latency and reliable performance.

However, cold starts are only an issue for serverless compute services like AWS Lambda that run functions in containers. Because Azion does not use containers, Edge Functions can run in any location with zero cold starts. Rather than spinning up a container and a Node.js instance after a function has been idle, Azion only needs to load the function from disk to memory, resulting in a solution that is up to 10x less expensive and up to 100x faster, ideal for reliable, low-latency next-generation applications.

Security

Although containers can lead to unreliable performance, some serverless providers argue they are necessary to isolate functions and keep them secure. Azion balances security with performance by using V8 Isolates in a multi-tenant environment to create a sandbox that keeps each function isolated and secure.

Vendor lock-in

In his 2020 book What Is Serverless?, Mike Amundsen points out that cloud providers' differing approaches to these tasks are one of the key contributors to vendor lock-in among serverless customers. He writes, “Vendor lock-in happens when competitors solve shared problems in unique ways….When each of the roles provided by serverless platforms is unbundled and offered as interoperable standalone elements, software architects and developers will experience an increase in mobility for their solutions.”

The need for interoperability in serverless platforms is one of the key reasons that Azion has adopted open standards in its Edge Computing Platform. In addition, our Edge Computing products are available separately so that customers can choose only the products they need.

Conclusion

Successfully implementing serverless functions requires both an understanding of development best practices and the right platform and tools for the job. Because serverless is a relatively new deployment and architectural model, many serverless platforms are still maturing. As a result, it's important to choose a platform that not only meets your business's current needs but also provides the portability and interoperability to connect to and integrate with other services in the future. Built on open standards and designed to address key pain points of serverless customers, Edge Functions is an excellent solution for developers looking to leverage the power of serverless computing.
